Data Engineer

at Capgemini

Agency job
via Nu-Pie
Remote, Bengaluru (Bangalore)
4 - 8 yrs
₹4L - ₹16L / yr (ESOP available)
Full time
Skills
Big Data
Hadoop
Data engineering
Google Cloud Platform (GCP)
Data Warehouse (DWH)
ETL
Systems Development Life Cycle (SDLC)
Java
Scala
Python
SQL
Scripting
Teradata
HiveQL
Pig
Spark
Apache Kafka
Windows Azure
Job Description
Job Title: Data Engineer
Tech Job Family: DACI
• Bachelor's Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
• 2 years of experience in Data, BI or Platform Engineering, Data Warehousing/ETL, or Software Engineering
• 1 year of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
Preferred Qualifications:
• Master's Degree in Computer Science, CIS, or related field
• 2 years of IT experience developing and implementing business systems within an organization
• 4 years of experience working with defect or incident tracking software
• 4 years of experience with technical documentation in a software development environment
• 2 years of experience working with an IT Infrastructure Library (ITIL) framework
• 2 years of experience leading teams, with or without direct reports
• Experience with application and integration middleware
• Experience with database technologies
Data Engineering
• 2 years of experience in Hadoop or any Cloud Big Data components (specific to the Data Engineering role)
• Expertise in Java/Scala/Python, SQL, Scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka or equivalent Cloud Big Data components (specific to the Data Engineering role)
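
To make the Spark Streaming and Kafka expertise above concrete, here is a brief illustrative sketch (ours, not part of the original posting) of a PySpark Structured Streaming job that consumes a Kafka topic and emits windowed event counts. The broker address and topic name are placeholder assumptions, and the job assumes the spark-sql-kafka connector package is available on the cluster.

# Minimal PySpark Structured Streaming sketch: consume a Kafka topic and
# print running per-minute event counts. Broker and topic are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "clickstream")                    # placeholder topic
    .load()
)

# Kafka exposes key/value as binary; cast the value to a string and count
# events per 1-minute window based on the Kafka record timestamp.
counts = (
    events.select(col("value").cast("string").alias("payload"), col("timestamp"))
    .groupBy(window(col("timestamp"), "1 minute"))
    .count()
)

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
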
BI Engineering
• Expertise in MicroStrategy/Power BI/SQL, Scripting, Teradata or equivalent RDBMS, Hadoop (OLAP on Hadoop), Dashboard development, Mobile development (specific to the BI Engineering role)
Platform Engineering
• 2 years of experience in Hadoop, NoSQL, RDBMS or any Cloud Big Data components, Teradata, MicroStrategy (specific to the Platform Engineering role)
• Expertise in Python, SQL, Scripting, Teradata, Hadoop utilities like Sqoop, Hive, Pig, MapReduce, Spark, Ambari, Ranger, Kafka or equivalent Cloud Big Data components (specific to the Platform Engineering role)
Lowe’s is an equal opportunity employer and administers all personnel practices without regard to race, color, religion, sex, age, national origin, disability, sexual orientation, gender identity or expression, marital status, veteran status, genetics or any other category protected under applicable law.

About Capgemini

With more than 180,000 people in over 40 countries, Capgemini is one of the world's foremost providers of consulting, technology and outsourcing services.
Founded 1967  •  Services  •  100-1000 employees  •  Profitable

Similar jobs

Data Analyst

at GroupM

Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
Google Analytics
Adobe Analytics
Pivot table
SQL
Python
Big Data
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 6 yrs
₹12L - ₹16L / yr

Description

Overview

Data & Technology - This function is an Analytics, Technology, and consulting group supporting the buying & campaign delivery teams. We combine Adtech and Martech platform strategy with data science & data engineering expertise, helping our clients make advertising work better for people. We are currently looking for an Analytics Manager to join the GroupM Services team.

This role is a fantastic opportunity for personal and professional growth and to contribute to a high-performance team, focused on continuous learning, rigorous best practice and achieving high levels of customer service. The role requires a top-class candidate with excellent numeracy and proven analytics problem-solving skills to join our high energy, entrepreneurial team.

 

Reporting of the role

This role reports to the Analytics Director.

3 best things about the job:

  1. Be a member of a high performing team focused on technology, data, partners and platforms, a key strategic growth area for GroupM and WPP.
  2. Work in an environment that promotes freedom, flexibility, empowerment, and diverse working styles to solve real business problems.
  3. The opportunity to learn & collaborate with a wide range of stakeholders across all GroupM agencies & business units.

Measures of success –

In three months:

  • Gain an in-depth understanding of the media landscape, be trained on the various media buying platforms and, specifically, the data & analytics databases and tools, and understand how the GMS business operates
  • Lead and roll out various analytics and attribution frameworks and best practices for campaign measurement
  • Develop proficiency in clean room analytics such as ADH, InfoSum, LiveRamp, etc.
  • Develop relationships and earn trust with your own team

 

In six months:

  • Work with the campaign delivery teams to deliver high-value, in-depth analytics and attribution, including client site analytics and channel analytics, automated where possible. Part of this will be to ensure that, prior to the campaign, all tracking and assets are in place as required by the briefing, and then to monitor throughout the campaign that data is being collected.
  • Help develop standard and, where possible, automated advanced clean room analytics solutions that can be scaled across all agencies.
  • Perform active stakeholder management to continue to evolve these analytics solutions as per the priority requirements. 

In twelve months:            

  • Work with the APAC GMS teams to ensure the local and regional data analytics solutions are aligned and local needs are strongly represented at the regional / global level
  • Develop proficiency in measurement frameworks for a post-cookie era, leading experiments for measuring campaign delivery, brand health and marketing effectiveness / ROI.
  • Be an expert in data and lead bespoke insight analytics work as the demand and function continues to grow – i.e. answering complex business problems posed by our clients, providing thought leadership in defining measurement strategies, etc.

 

Responsibilities of the role:

  • Provide digital campaign analytics – including campaign delivery, measurement, and attribution
  • Client site analytics – e.g., Google Analytics, Adobe Analytics
  • Client channel analytics – e.g., social listening, ecommerce – shopalyst, pre-post purchase analytics, pricing benchmarks
  • Create omni(digital)-channel measurement strategies for performance reporting
  • Deploy data-driven attribution models to support campaign optimisation
  • Develop and roll out frameworks around various attribution models
  • Create a leading analytics solution suite leveraging media / neutral data clean rooms
  • Foster a community of data analytics practitioners for knowledge sharing and growing expertise

What you will need:

  • Minimum 4-5 years' experience working within an analytical role
  • Prior experience within a digital media role is highly desirable, particularly search, social and programmatic
  • A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
  • Proficiency in Excel (including but not limited to VLOOKUPs, arrays, pivot tables, conditional and nested formulas, VBA/macros)
  • Experience with SQL/BigQuery/the GMP tech stack/clean rooms such as ADH
  • Hands-on experience with BI/visual analytics tools like Power BI or Tableau
  • Knowledge of or hands-on experience with analytics platforms like Google Analytics, Data Studio, Adobe Analytics, and MMPs such as Firebase, AppsFlyer, Kochava, etc.
  • Evidence of technical comfort and a good understanding of internet functionality is desirable
  • Analytical pedigree - evidence of having approached problems from a mathematical perspective and working through to a solution in a logical way
  • Proactive and results-oriented
  • A positive, can-do attitude with a thirst to continually learn new things
  • An ability to work independently and collaboratively with a wide range of teams
  • Excellent communication skills, both written and oral
  • An interest in media, advertising and marketing

 

More about GroupM

GroupM - GroupM leads and shapes media markets by delivering performance enhancing media products and services, powered by data and technology. Our global network agencies and businesses enable our people to work collaboratively across borders with the best in class, providing them the opportunity to accelerate their progress and development. We are not limited by teams or geographies; our scale and diverse range of clients lets us be more adventurous with our business and talent. We give our talent the space, support and tools to innovate and grow.

Discover more about GroupM at www.groupm.com
Follow @GroupMAPAC on Twitter
Follow GroupM on LinkedIn - https://www.linkedin.com/company/groupm


2020 brought opportunities for brands to innovate, and as a result we saw an evolving media stack. The growth of digital is set to soar high because of changing consumer habits. With approximately 500 million smartphone users, low-priced data plans, 45 to 50 million e-commerce shoppers, approximately 60 OTT offerings and a young population, India is a mobile-first internet market. It is also one of the top 10 ad spend markets in the world and is set to climb the ranks. Global big tech corporations have made considerable investments in top e-commerce/retail ventures and Indian start-ups, blurring the lines between social media, e-commerce and mobile payments, resulting in disruption on an unimaginable scale.

At GroupM India, there’s never a dull moment between juggling client requests, managing vendor partners and having fun with your team. We believe in tackling challenges head-on and getting things done.

GroupM is an equal opportunity employer. We view everyone as an individual and we understand that inclusion is more than just diversity – it’s about belonging. We celebrate the fact that everyone is unique and that’s what makes us so good at what we do. We pride ourselves on being a company that embraces difference and truly represents the global clients we work with.

 
Job posted by
Surabhi Deo

Data Engineer

at Fintech Company

Agency job
via Jobdost
Python
SQL
Data Warehouse (DWH)
Hadoop
Amazon Web Services (AWS)
DevOps
Git
Selenium
Informatica
ETL
Big Data
Postman
Bengaluru (Bangalore)
2 - 4 yrs
₹7L - ₹12L / yr

Purpose of Job:

Responsible for drawing insights from many sources of data to answer important business questions and help the organization make better use of data in their daily activities.


Job Responsibilities:

We are looking for a smart and experienced Data Engineer 1 who can work with a senior manager to:
• Build DevOps solutions and CI/CD pipelines for code deployment
• Build unit test cases for APIs and code in Python
• Manage AWS resources including EC2, RDS, CloudWatch, Amazon Aurora, etc.
• Build and deliver high-quality data architecture and pipelines to support business and reporting needs
• Deliver on data architecture projects and implementation of next-generation BI solutions
• Interface with other teams to extract, transform, and load data from a wide variety of data sources
Qualifications:
Education: MS/MTech/BTech graduates or equivalent with a focus on data science and quantitative fields (CS, Eng, Math, Eco)
Work Experience: Proven 1+ years of experience in data mining (SQL, ETL, data warehouse, etc.) and using SQL databases

 

Skills
Technical Skills
• Proficient in Python and SQL; familiarity with statistics or analytical techniques
• Data warehousing experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
• Working knowledge of tools and utilities: AWS, DevOps with Git, Selenium, Postman, Airflow, PySpark (see the Airflow sketch below)
Soft Skills
• Deep curiosity and humility
• Excellent storyteller and communicator
• Design thinking
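
As a rough illustration of the Airflow and PySpark tooling named under Technical Skills (this sketch is ours, not taken from the posting), a simple daily DAG might chain a Python extract step to a Spark batch job. The DAG id, schedule and spark-submit path are placeholder assumptions.

# Minimal Airflow DAG sketch: a daily pipeline that runs a Python extract step
# and then a PySpark batch job. Names, schedule and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Placeholder extract step; a real task would pull from an API or database."""
    print("extracting orders for", context["ds"])


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = BashOperator(
        task_id="transform_with_spark",
        bash_command="spark-submit /opt/jobs/transform_orders.py {{ ds }}",  # placeholder path
    )
    extract >> transform  # run the Spark step only after extraction succeeds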

Job posted by
Sathish Kumar

Data Engineer

at Global data analytics and data engineering firm that partner

Agency job
via Startup Login
Big Data
SQL
Amazon Web Services (AWS)
Python
Hadoop
ETL
Agile/Scrum
Bengaluru (Bangalore)
2 - 8 yrs
₹10L - ₹20L / yr
The responsibilities are detailed as below:

• Experience in understanding and translating data, analytic requirements and functional needs into technical requirements while working with global customers
• Build and maintain data pipelines to support large-scale data management in alignment with data strategy and data processing standards
• Experience in database programming using multiple flavors of SQL
• Deploy scalable data pipelines for analytical needs
• Experience in the Big Data ecosystem - on-prem (Hortonworks/MapR) or cloud (Dataproc/EMR/HDInsight)
• Worked on query languages/tools such as Hadoop, Pig, SQL, Hive, Sqoop and Spark SQL
• Experience in an orchestration tool such as Airflow/Oozie for scheduling pipelines
• Exposure to the latest cloud ETL tools such as Glue/ADF/Dataflow
• Understand and execute in-memory distributed computing frameworks like Spark (and/or Databricks) and their parameter tuning, writing optimized queries in Spark (see the Spark tuning sketch below)
• Hands-on experience in using Spark Streaming, Kafka and HBase
• Experience working in an Agile/Scrum development process
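
The in-memory computing and Spark query optimization items above can be sketched as follows (an illustrative example, not a requirement from the posting; paths, table and column names are placeholders):

# Brief PySpark sketch of common tuning steps: fewer shuffle partitions for a
# modest job and a broadcast join for a small dimension table. All names are
# placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast, col

spark = SparkSession.builder.appName("spark-tuning-sketch").getOrCreate()

# The default of 200 shuffle partitions is often too many for modest data volumes.
spark.conf.set("spark.sql.shuffle.partitions", "64")

orders = spark.read.parquet("s3://example-bucket/orders/")          # placeholder path
countries = spark.read.parquet("s3://example-bucket/dim_country/")  # placeholder path

# Broadcasting the small dimension table avoids a full shuffle join.
enriched = orders.join(broadcast(countries), on="country_code", how="left")

# Filter before aggregating so Spark can prune rows early.
daily_revenue = (
    enriched.filter(col("status") == "COMPLETED")
    .groupBy("order_date", "country_name")
    .sum("amount")
)
daily_revenue.explain()  # inspect the physical plan to confirm the broadcast join
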
Job posted by
Sanjay Kiran

Artificial Intelligence (AI) Researchers and Developers

at Meslova Systems Pvt Ltd

Founded 2017  •  Products & Services  •  20-100 employees  •  Bootstrapped
Artificial Intelligence (AI)
Machine Learning (ML)
Python
Agile/Scrum
Hyderabad, Bengaluru (Bangalore), Delhi
2 - 5 yrs
₹3L - ₹8L / yr
Job Description

Artificial Intelligence (AI) Researchers and Developers

Successful candidates will be part of highly productive teams working on implementing core AI algorithms, cryptography libraries, AI-enabled products and intelligent 3D interfaces. Candidates will work on cutting-edge products and technologies in highly challenging domains and will need to have the highest level of commitment and interest to learn new technologies and domain-specific subject matter very quickly. Successful completion of projects will require travel and working in remote locations with customers for extended periods.

Education Qualification: Bachelor's, Master's or PhD degree in Computer Science, Mathematics, Electronics, or Information Systems from a reputed university, and/or equivalent knowledge and skills.

Location: Hyderabad, Bengaluru, Delhi, Client Location (as needed)

Skillset and Expertise
• Strong software development experience using Python
• Strong background in mathematical, numerical and scientific computing using Python
• Knowledge in Artificial Intelligence/Machine Learning
• Experience working with the SCRUM software development methodology
• Strong experience with implementing web services, web clients and the JSON protocol is required
• Experience with Python metaprogramming
• Strong analytical and problem-solving skills
• Design, develop and debug enterprise-grade software products and systems
• Software systems testing methodology, including writing and execution of test plans, debugging, and testing scripts and tools
• Excellent written and verbal communication skills; proficiency in English; verbal communication in Hindi and other local Indian languages
• Ability to effectively communicate product design, functionality and status to management, customers and other stakeholders
• Highest level of integrity and work ethic

Frameworks
1. Scikit-learn
2. TensorFlow
3. Keras
4. OpenCV
5. Django
6. CUDA
7. Apache Kafka
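
As a small, hedged example of the kind of baseline model work these frameworks enable (the dataset and model choice are illustrative assumptions, not part of the posting), a scikit-learn classifier can be trained and evaluated in a few lines:

# Train and evaluate a simple classifier on a bundled scikit-learn dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))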

Mathematics
1. Advanced Calculus
2. Numerical Analysis
3. Complex Function Theory
4. Probability

Concepts (One or more of the below)
1. OpenGL based 3D programming
2. Cryptography
3. Artificial Intelligence (AI) Algorithms: a) Statistical modelling, b) DNN, c) RNN, d) LSTM, e) GAN, f) CN
Job posted by
Sri Hari Nandan

Data Analyst

at Extramarks Education India Pvt Ltd

Founded 2007  •  Product  •  1000-5000 employees  •  Profitable
Tableau
PowerBI
Data Analytics
SQL
Python
Noida, Delhi, Gurugram, Ghaziabad, Faridabad
3 - 5 yrs
₹8L - ₹10L / yr

Required Experience

· 3+ years of relevant technical experience in a data analyst role
· Intermediate to expert skills with SQL and basic statistics
· Experience in advanced SQL
· Python programming is an added advantage
· Strong problem-solving and structuring skills
· Automation in connecting various data sources and representing them through dashboards
· Excellent with numbers; able to communicate data points through various reports/templates
· Ability to communicate effectively within and outside the Data Analytics team
· Proactively takes up work responsibilities and ad-hoc requests as and when needed
· Ability and desire to take ownership of and initiative for analysis, from requirements clarification to deliverable
· Strong technical communication skills, both written and verbal
· Ability to understand and articulate the "big picture" and simplify complex ideas
· Ability to identify and learn applicable new techniques independently as needed
· Must have worked with various databases (relational and non-relational) and ETL processes
· Must have experience handling large volumes of data and adhering to optimization and performance standards
· Should have the ability to analyse and provide relationship views of the data from different angles
· Must have excellent communication skills (written and oral)
· Knowledge of Data Science is an added advantage

Required Skills

MySQL, Advanced Excel, Tableau, Reporting and dashboards, MS Office, VBA, Analytical skills

Preferred Experience

· Strong understanding of relational databases (MySQL, etc.)
· Prior experience working remotely full-time
· Prior experience working with advanced SQL
· Experience with one or more BI tools, such as Superset, Tableau, etc.
· High level of logical and mathematical ability in problem solving

Job posted by
Prachi Sharma

Tableau Engineer

at Aideo Technologies

Founded 2009  •  Product  •  100-500 employees  •  Bootstrapped
Tableau
Natural Language Processing (NLP)
Computer Vision
Python
RESTful APIs
Microservices
Flask
SQL
Mumbai, Navi Mumbai
3 - 8 yrs
₹4L - ₹22L / yr

We are establishing infrastructure for internal and external reporting using Tableau and are looking for someone with experience building visualizations and dashboards in Tableau and using Tableau Server to deliver them to internal and external users. 

 

Required Experience 

  • Implementation of interactive visualizations using Tableau Desktop
  • Integration with Tableau Server and support of production dashboards and embedded reports with it
  • Writing and optimization of SQL queries
  • Proficient in Python, including the use of the pandas and NumPy libraries to perform data exploration and analysis (see the pandas sketch after this list)
  • 3 years of experience working as a Software Engineer / Senior Software Engineer
  • Bachelor's in Engineering – can be Electronics and Communication, Computer, or IT
  • Well versed with basic data structures, algorithms and system design
  • Should be capable of working well in a team and should possess very good communication skills
  • Self-motivated, fun to work with and organized
  • Productive and efficient working remotely
  • Test-driven mindset with a knack for finding issues and problems at earlier stages of development
  • Interest in learning and picking up a wide range of cutting-edge technologies
  • Should be curious and interested in learning Data Science related concepts and domain knowledge
  • Work alongside other engineers on the team to elevate technology and consistently apply best practices
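
The pandas/NumPy data exploration referenced in the list above might look like the following sketch (the file name and column names are placeholder assumptions, not from the posting):

# Minimal pandas/NumPy exploration sketch: load a CSV, inspect it, and compute
# a simple aggregate. File and column names are placeholders.
import numpy as np
import pandas as pd

df = pd.read_csv("sales.csv", parse_dates=["order_date"])

print(df.shape)         # rows x columns
print(df.dtypes)        # column types
print(df.isna().sum())  # missing values per column

# Revenue per region, with a log-transformed column for skewed distributions.
summary = (
    df.assign(log_revenue=np.log1p(df["revenue"]))
    .groupby("region")[["revenue", "log_revenue"]]
    .agg(["mean", "median", "count"])
)
print(summary)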

 

Highly Desirable 

  • Data Analytics
  • Experience in AWS cloud or any cloud technologies
  • Experience in Big Data technologies and streaming (e.g. PySpark, Kafka) is a big plus
  • Shell scripting
  • Preferred tech stack – Python, REST API, microservices, Flask/FastAPI, pandas, NumPy, Linux, shell scripting, Airflow, PySpark
  • Strong backend experience, having worked with microservices and REST APIs (Flask, FastAPI) and relational and non-relational databases
Job posted by
Akshata Alekar

ETL Developer

at Product based Company

Agency job
via Crewmates
ETL
Big Data
Coimbatore
4 - 15 yrs
₹5L - ₹25L / yr
Hi Professionals,
We are looking for an ETL Developer for a reputed client in Coimbatore (permanent role).
Work Location: Coimbatore
Experience: 4+ years
Skills:
  • Talend, or strong experience in any of the ETL tools (Informatica/DataStage/Talend)
  • DB preference: Teradata/Oracle/SQL Server
  • Supporting tools: JIRA/SVN
Notice Period: Immediate to 30 days
Job posted by
Gowtham V

Data Scientist

at Yottaasys AI LLC

Founded 2018  •  Product  •  0-20 employees  •  Raised funding
Data Science
Deep Learning
R Programming
Python
Machine Learning (ML)
Video compression
Data Analytics
Bengaluru (Bangalore), Singapore
2 - 5 yrs
₹9L - ₹20L / yr
We are a US-headquartered product company looking to hire a few passionate Deep Learning and Computer Vision team players with 2-5 years of experience! If you are any of these:
1. Expert in deep learning and machine learning techniques,
2. Extremely good at image/video processing,
3. Have a good understanding of linear algebra, optimization techniques, statistics and pattern recognition,
then you are the right fit for this position.
Job posted by
Dinesh Krishnan

Sr. Data Scientist

at a client company in Computer Software (EC1)

Agency job
via Multi Recruit
Data Science
Data scientist
Data Analytics
Machine Learning (ML)
Python
PowerBI
SQL
Bengaluru (Bangalore)
5 - 7 yrs
₹14.5L - ₹16.5L / yr
  • Actively engage with internal business teams to understand their challenges and deliver robust, data-driven solutions.
  • Work alongside global counterparts to solve data-intensive problems using standard analytical frameworks and tools.
  • Be encouraged and expected to innovate and be creative in your data analysis, problem-solving, and presentation of solutions.
  • Network and collaborate with a broad range of internal business units to define and deliver joint solutions.
  • Work alongside customers to leverage cutting-edge technology (machine learning, streaming analytics, and ‘real’ big data) to creatively solve problems and disrupt existing business models.

In this role, we are looking for:

  • A problem-solving mindset with the ability to understand business challenges and how to apply your analytics expertise to solve them.
  • The unique person who can present complex mathematical solutions in a simple manner that most will understand, including customers.
  • An individual excited by innovation and new technology and eager to find ways to employ these innovations in practice.
  • A team mentality, empowered by the ability to work with a diverse set of individuals.

Basic Qualifications

  • A Bachelor’s degree in Data Science, Math, Statistics, Computer Science or related field with an emphasis on analytics.
  • 5+ Years professional experience in a data scientist/analyst role or similar.
  • Proficiency in your statistics/analytics/visualization tool of choice, but preferably in the Microsoft Azure Suite, including Azure ML Studio and PowerBI as well as R, Python, SQL.

Preferred Qualifications

  • Excellent communication, organizational transformation, and leadership skills
  • Demonstrated excellence in Data Science, Business Analytics and Engineering
Job posted by
Manjunath Multirecruit

Data Scientist

at Innoplexus

Founded 2010  •  Product  •  100-500 employees  •  Profitable
Data Science
Python
Machine Learning (ML)
Natural Language Processing (NLP)
Text mining
Pune
1 - 6 yrs
₹6L - ₹20L / yr
Innoplexus offers Data as a Service and Continuous Analytics as a Service products, leveraging Artificial Intelligence and advanced analytics to help reduce time to market significantly. Our products leverage proprietary algorithms and patent-pending technologies to help global Life Sciences & Financial Services organizations with access to relevant data, real-time intelligence & intuitive insights across the life cycle of their products. We automate the collection, curation, aggregation, analysis & visualization of billions of data points from thousands of data sources, using domain-specific language processing, ontologies, computer vision, machine learning, network analysis and more.

Location: Pune

Required qualification: MS in Computer Science, Statistics, Applied Maths or related domain.

Key Responsibilities:
  • Use machine learning & deep learning techniques to create new, scalable solutions for business problems.
  • Develop NLP & computer vision-based tools and technologies for acquiring, parsing, interpreting and visualizing unstructured data.
  • Analyze and extract relevant information from large amounts of data to help in automating solutions and optimizing key processes.
  • Help the team in building a large-scale continual/online learning system.
  • Help the team to build an experimentation-to-production pipeline.
  • Stay current with the latest research and technology and communicate your knowledge throughout the enterprise.
  • Come up with patentable ideas that provide us a competitive advantage.

Required Experience:
  • Strong track record of AI/ML publications in renowned scientific journals or conferences.
  • Experience in any of the following: Computer Vision, Image Processing, Speech Recognition, Natural Language Understanding, Machine Learning, Deep Learning, HCI, Text Mining, Computational Genomics, Bioinformatics, or other Machine Intelligence/Artificial Intelligence related areas.
  • Programming experience in one or more of the following: C, C++, Python.
Job posted by
Amar Navgire