Job Title: Data Engineer
Tech Job Family: DACI
• Bachelor's Degree in Engineering, Computer Science, CIS, or related field (or equivalent work experience in a related field)
• 2 years of experience in Data, BI or Platform Engineering, Data Warehousing/ETL, or Software Engineering
• 1 year of experience working on project(s) involving the implementation of solutions applying development life cycles (SDLC)
• Master's Degree in Computer Science, CIS, or related field
• 2 years of IT experience developing and implementing business systems within an organization
• 4 years of experience working with defect or incident tracking software
• 4 years of experience with technical documentation in a software development environment
• 2 years of experience working with an IT Infrastructure Library (ITIL) framework
• 2 years of experience leading teams, with or without direct reports
• Experience with application and integration middleware
• Experience with database technologies
• 2 years of experience in Hadoop or any cloud Big Data components (specific to the Data Engineering role)
• Expertise in Java/Scala/Python, SQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka, or equivalent cloud Big Data components (specific to the Data Engineering role)
• Expertise in MicroStrategy/Power BI/SQL, scripting, Teradata or equivalent RDBMS, Hadoop (OLAP on Hadoop), dashboard development, mobile development (specific to the BI Engineering role)
• 2 years of experience in Hadoop, NoSQL, RDBMS or any cloud Big Data components, Teradata, MicroStrategy (specific to the Platform Engineering role)
• Expertise in Python, SQL, scripting, Teradata, and Hadoop utilities like Sqoop, Hive, Pig, MapReduce, Spark, Ambari, Ranger, Kafka, or equivalent cloud Big Data components (specific to the Platform Engineering role)
Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religion, sex, age, national origin, disability, sexual orientation, gender identity or expression, marital status, veteran status, genetics or any other category protected under applicable law.
Data & Technology - This function is an analytics, technology, and consulting group supporting the buying & campaign delivery teams. We combine AdTech and MarTech platform strategy with data science & data engineering expertise, helping our clients make advertising work better for people.
This role is a fantastic opportunity for personal and professional growth and to contribute to a high-performance team, focused on continuous learning, rigorous best practice and achieving high levels of customer service. The role requires a top-class candidate with excellent numeracy and proven analytics problem-solving skills to join our high energy, entrepreneurial team.
Reporting of the role
This role reports to the Analytics Director.
3 best things about the job:
- Be a member of a high performing team focused on technology, data, partners and platforms, a key strategic growth area for GroupM and WPP.
- Work in an environment that promotes freedom, flexibility, empowerment, and diverse working styles to solve real business problems.
- The opportunity to learn & collaborate with a wide range of stakeholders across all GroupM agencies & business units.
Measures of success –
In three months:
- Gain an in-depth understanding of the media landscape; be trained on the various media buying platforms and, specifically, the data & analytics databases and tools; and understand how the GMS business operates
- Lead and roll out various analytics and attribution frameworks and best practices for campaign measurement
- Develop proficiency in clean room analytics tools such as ADH, InfoSum, LiveRamp, etc.
- Develop relationships and earn trust with your own team
In six months:
- Work with the campaign delivery teams to deliver high-value, in-depth analytics and attribution, including client site analytics and channel analytics, automated where possible. Part of this will be to ensure that, prior to the campaign, all tracking and assets are in place as required by the brief, then to monitor throughout the campaign that data is being collected.
- Help develop standard and, where possible, automated advanced clean room analytics solutions that can be scaled across all agencies.
- Perform active stakeholder management to continue to evolve these analytics solutions as per the priority requirements.
In twelve months:
- Work with the APAC GMS teams to ensure the local and regional data analytics solutions are aligned and local needs are strongly represented at the regional / global level
- Develop proficiency in measurement frameworks for a post-cookie era, leading experiments for measuring campaign delivery, brand health, and marketing effectiveness / ROI.
- Be an expert in data and lead bespoke insight analytics work as the demand and function continue to grow – e.g. answering complex business problems posed by our clients, providing thought leadership in defining measurement strategies, etc.
Responsibilities of the role:
- Provide digital campaign analytics – including campaign delivery, measurement, and attribution
- Client site analytics – e.g., Google Analytics, Adobe Analytics
- Client channel analytics – e.g., social listening, e-commerce (Shopalyst), pre-post purchase analytics, pricing benchmarks
- Create omni(digital)-channel measurement strategies for performance reporting
- Deploy data-driven attribution models to support campaign optimisation
- Develop and roll out frameworks around various attribution models
- Create a leading analytics solution suite leveraging media / neutral data clean rooms
- Foster a community of data analytics practitioners for knowledge sharing and growing expertise
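To make the attribution responsibilities above concrete, here is a minimal sketch (not a GroupM tool; the channel names and conversion paths are hypothetical) comparing last-touch credit with a simple linear, even-split attribution model:

```python
from collections import defaultdict

def linear_attribution(paths):
    """Split each conversion's credit equally across every channel
    in its path - a simple alternative to last-touch."""
    credit = defaultdict(float)
    for path in paths:
        share = 1.0 / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

def last_touch_attribution(paths):
    """Assign full credit for each conversion to the final touchpoint."""
    credit = defaultdict(float)
    for path in paths:
        credit[path[-1]] += 1.0
    return dict(credit)

# Hypothetical conversion paths: ordered channel touchpoints per converter.
paths = [
    ["search", "social", "display"],
    ["social", "display"],
    ["display"],
]
print(linear_attribution(paths))
print(last_touch_attribution(paths))
```

The two models disagree sharply on the same data, which is exactly why a framework for choosing and rolling out attribution models matters.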
What you will need:
- Minimum 4–5 years' experience working in an analytical role
- Prior experience within a digital media role is highly desirable, particularly search, social and programmatic
- A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
- Proficiency in Excel (including but not limited to VLOOKUPs, arrays, pivot tables, conditional and nested formulas, VBA/macros)
- Experience with SQL / BigQuery / the GMP tech stack / clean rooms such as ADH
- Hands-on experience with BI/visual analytics tools like Power BI or Tableau
- Knowledge of or hands-on experience with analytics platforms like Google Analytics, Data Studio, Adobe Analytics, and MMPs such as Firebase, AppsFlyer, Kochava, etc.
- Evidence of technical comfort and good understanding of internet functionality desirable
- Analytical pedigree - evidence of having approached problems from a mathematical perspective and working through to a solution in a logical way
- Proactive and results-oriented
- A positive, can-do attitude with a thirst to continually learn new things
- An ability to work independently and collaboratively with a wide range of teams
- Excellent communication skills, both written and oral
- An interest in media, advertising and marketing
More about GroupM
GroupM - GroupM leads and shapes media markets by delivering performance enhancing media products and services, powered by data and technology. Our global network agencies and businesses enable our people to work collaboratively across borders with the best in class, providing them the opportunity to accelerate their progress and development. We are not limited by teams or geographies; our scale and diverse range of clients lets us be more adventurous with our business and talent. We give our talent the space, support and tools to innovate and grow.
Discover more about GroupM at www.groupm.com
Follow @GroupMAPAC on Twitter
Follow GroupM on LinkedIn - https://www.linkedin.com/company/groupm
2020 brought opportunities for brands to innovate, and as a result we saw an evolving media stack. The growth of digital is set to soar because of changing consumer habits. With approximately 500 million smartphone users, low-priced data plans, 45 to 50 million e-commerce shoppers, approximately 60 OTT offerings and a young population, India is a mobile-first internet market. It is also one of the top 10 ad spend markets in the world and is set to climb the ranks. Global big tech corporations have made considerable investments in top e-commerce/retail ventures and Indian start-ups, blurring the lines between social media, e-commerce and mobile payments, resulting in disruption on an unimaginable scale.
At GroupM India, there’s never a dull moment between juggling client requests, managing vendor partners and having fun with your team. We believe in tackling challenges head-on and getting things done.
GroupM is an equal opportunity employer. We view everyone as an individual and we understand that inclusion is more than just diversity – it’s about belonging. We celebrate the fact that everyone is unique and that’s what makes us so good at what we do. We pride ourselves on being a company that embraces difference and truly represents the global clients we work with.
Purpose of Job:
Responsible for drawing insights from many sources of data to answer important business questions and help the organization make better use of data in their daily activities.
We are looking for a smart and experienced Data Engineer 1 who can work with a senior
⮚ Build DevOps solutions and CI/CD pipelines for code deployment
⮚ Build unit test cases for APIs and code in Python
⮚ Manage AWS resources including EC2, RDS, CloudWatch, Amazon Aurora, etc.
⮚ Build and deliver high-quality data architecture and pipelines to support business and reporting needs
⮚ Deliver on data architecture projects and implementation of next-generation BI
⮚ Interface with other teams to extract, transform, and load data from a wide variety of data sources
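As an illustration of the "unit test cases for APIs and code in Python" bullet, here is a minimal sketch using pytest-style test functions; the record schema and the transform itself are hypothetical, for illustration only:

```python
def transform_record(raw):
    """Normalize one raw event record before loading: trim the id,
    coerce the amount to float, drop unknown keys.
    (The schema here is hypothetical.)"""
    return {
        "user_id": str(raw["user_id"]).strip(),
        "amount": float(raw["amount"]),
    }

# pytest-style unit tests: plain functions containing assertions.
def test_trims_and_coerces():
    out = transform_record({"user_id": " 42 ", "amount": "9.50", "junk": 1})
    assert out == {"user_id": "42", "amount": 9.5}

def test_missing_field_raises():
    try:
        transform_record({"amount": "1.0"})
    except KeyError:
        pass  # expected: user_id is required
    else:
        raise AssertionError("expected KeyError for missing user_id")

test_trims_and_coerces()
test_missing_field_raises()
```

In practice a test runner such as pytest would discover and run the `test_*` functions; they are called directly here only so the sketch is self-contained.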
Education: MS/MTech/BTech graduates or equivalent with a focus on data science and quantitative fields (CS, Engineering, Math, Economics)
Work Experience: Proven 1+ years of experience in data mining (SQL, ETL, data warehousing, etc.) and using SQL databases
⮚ Proficient in Python and SQL; familiarity with statistics or analytical techniques
⮚ Data warehousing experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
⮚ Working knowledge of tools and utilities – AWS, DevOps with Git, Selenium, Postman, Airflow, PySpark
⮚ Deep curiosity and humility
⮚ Excellent storyteller and communicator
⮚ Design thinking
• Experience in understanding and translating data, analytic requirements, and functional needs into technical requirements while working with global customers
• Build and maintain data pipelines to support large-scale data management in alignment with data strategy and data processing standards
• Experience in database programming using multiple flavors of SQL
• Deploy scalable data pipelines for analytical needs
• Experience in the Big Data ecosystem – on-prem (Hortonworks/MapR) or cloud
• Worked on query languages/tools such as Hadoop, Pig, SQL, Hive, Sqoop, and SparkSQL
• Experience in an orchestration tool such as Airflow/Oozie for scheduling pipelines
• Exposure to the latest cloud ETL tools such as Glue/ADF/Dataflow
• Understand and execute in-memory distributed computing frameworks like Spark (and/or Databricks), including parameter tuning and writing optimized queries in Spark
• Hands-on experience using Spark Streaming, Kafka, and HBase
• Experience working in an Agile/Scrum development process
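The database-programming and optimized-query bullets above can be illustrated with a small, self-contained example. SQLite stands in here for whichever SQL flavor the pipeline actually targets, and the table and columns are hypothetical:

```python
import sqlite3

# In-memory SQLite database standing in for any SQL engine;
# the sales table and its rows are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 10.0), ("north", 30.0), ("south", 5.0)],
)

# A windowed aggregation of the kind analytical pipelines produce:
# each row's share of its region's total, in a single pass.
rows = conn.execute(
    """
    SELECT region, amount,
           amount / SUM(amount) OVER (PARTITION BY region) AS share
    FROM sales
    ORDER BY region, amount
    """
).fetchall()
print(rows)
```

The same window-function pattern carries over to Hive, SparkSQL, and Teradata, which is what makes "multiple flavors of SQL" a portable skill.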
Artificial Intelligence (AI) Researchers and Developers
The successful candidate will be part of highly productive teams working on implementing core AI algorithms, cryptography libraries, AI-enabled products, and intelligent 3D interfaces. Candidates will work on cutting-edge products and technologies in highly challenging domains and will need the highest level of commitment and interest in learning new technologies and domain-specific subject matter very quickly. Successful completion of projects will require travel and working in remote locations with customers for extended periods.
Education Qualification: Bachelor's, Master's, or PhD degree in Computer Science, Mathematics, Electronics, or Information Systems from a reputed university, and/or equivalent knowledge and skills
Location : Hyderabad, Bengaluru, Delhi, Client Location (as needed)
Skillset and Expertise
• Strong software development experience using Python
• Strong background in mathematical, numerical and scientific computing using Python.
• Knowledge in Artificial Intelligence/Machine learning
• Experience working with SCRUM software development methodology
• Strong experience implementing web services, web clients, and JSON-based protocols is required
• Experience with Python metaprogramming
• Strong analytical and problem-solving skills
• Design, develop and debug enterprise grade software products and systems
• Software systems testing methodology, including writing and execution of test plans, debugging, and testing scripts and tools
• Excellent written and verbal communication skills; proficiency in English. Verbal communication in Hindi and other local languages
• Ability to effectively communicate product design, functionality and status to management, customers and other stakeholders
• Highest level of integrity and work ethic
7. Apache Kafka
1. Advanced Calculus
2. Numerical Analysis
3. Complex Function Theory
Concepts (one or more of the below):
1. OpenGL-based 3D programming
3. Artificial Intelligence (AI) algorithms: a) statistical modelling, b) DNN, c) RNN, d) LSTM, e) GAN, f) CNN
· 3+ years of relevant technical experience in a data analyst role
· Intermediate/expert skills with SQL and basic statistics
· Experience in advanced SQL
· Python programming – an added advantage
· Strong problem-solving and structuring skills
· Automation in connecting various data sources and representing the data through various dashboards
· Excellent with numbers; able to communicate data points through various reports/templates
· Ability to communicate effectively internally and outside Data Analytics team
· Proactively take up work responsibilities and take on ad hoc requests as and when needed
· Ability and desire to take ownership of and initiative for analysis; from requirements clarification to deliverable
· Strong technical communication skills; both written and verbal
· Ability to understand and articulate the "big picture" and simplify complex ideas
· Ability to identify and learn applicable new techniques independently as needed
· Must have worked with various Databases (Relational and Non-Relational) and ETL processes
· Must have experience in handling large volumes of data and adhere to optimization and performance standards
· Should have the ability to analyse and provide relationship views of the data from different angles
· Must have excellent Communication skills (written and oral).
· Knowing Data Science is an added advantage
MySQL, Advanced Excel, Tableau, reporting and dashboards, MS Office, VBA, analytical skills
· Strong understanding of relational databases such as MySQL
· Prior experience working remotely full-time
· Prior experience working in advanced SQL
· Experience with one or more BI tools, such as Superset, Tableau etc.
· High level of logical and mathematical ability in Problem Solving
We are establishing infrastructure for internal and external reporting using Tableau and are looking for someone with experience building visualizations and dashboards in Tableau and using Tableau Server to deliver them to internal and external users.
- Implementation of interactive visualizations using Tableau Desktop
- Integration with Tableau Server and support of production dashboards and embedded reports
- Writing and optimization of SQL queries
- Proficient in Python including the use of Pandas and numpy libraries to perform data exploration and analysis
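A minimal sketch of the kind of pandas-based exploration mentioned above, assuming the pandas library is available; the campaign-delivery data are made up for illustration:

```python
import pandas as pd

# Hypothetical campaign-delivery data for exploration.
df = pd.DataFrame({
    "channel": ["search", "search", "social"],
    "clicks": [120, 80, 50],
    "conversions": [12, 4, 5],
})

# A typical exploration step: aggregate per channel,
# then derive a conversion rate from the aggregates.
summary = (
    df.groupby("channel", as_index=False)[["clicks", "conversions"]]
    .sum()
)
summary["cvr"] = summary["conversions"] / summary["clicks"]
print(summary)
```

The resulting frame is exactly the shape that feeds a Tableau extract or SQL table, which is why pandas pairs naturally with the dashboarding work described here.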
- 3 years of experience working as a Software Engineer / Senior Software Engineer
- Bachelor's in Engineering – can be Electronics and Communication, Computer, or IT
- Well versed in basic data structures, algorithms, and system design
- Should be capable of working well in a team – and should possess very good communication skills
- Self-motivated and fun to work with and organized
- Productive and efficient working remotely
- Test driven mindset with a knack for finding issues and problems at earlier stages of development
- Interest in learning and picking up a wide range of cutting edge technologies
- Should be curious and interested in learning some Data science related concepts and domain knowledge
- Work alongside other engineers on the team to elevate technology and consistently apply best practices
- Data Analytics
- Experience in AWS cloud or any cloud technologies
- Experience in Big Data and streaming technologies like PySpark and Kafka is a big plus
- Shell scripting
- Preferred tech stack – Python, Rest API, Microservices, Flask/Fast API, pandas, numpy, linux, shell scripting, Airflow, pyspark
- Strong backend experience – has worked with microservices and REST APIs (Flask, FastAPI) and with relational and non-relational databases
We are looking for an ETL Developer for a reputed client @ Coimbatore (permanent role)
Work Location : Coimbatore
Experience : 4+ Years
- Talend, or strong experience in any of the ETL tools (Informatica/DataStage/Talend)
- DB preference: Teradata/Oracle/SQL Server
- Supporting tools: JIRA/SVN
1. Expert in deep learning and machine learning techniques
2. Extremely good at image/video processing
3. Good understanding of linear algebra, optimization techniques, statistics, and pattern recognition
Then you are the right fit for this position.
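As a toy instance of the optimization and linear-algebra background asked for above, here is a least-squares fit by gradient descent in NumPy; the data are synthetic and the setup is illustrative only:

```python
import numpy as np

# Synthetic, noise-free regression problem: recover true_w from (X, y).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w

# Gradient descent on mean squared error.
w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of MSE w.r.t. w
    w -= lr * grad

print(w)  # should be close to [2.0, -1.0]
```

The same loop, with the gradient swapped for a backpropagated one, is the core of the deep-learning training the role describes.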
- Actively engage with internal business teams to understand their challenges and deliver robust, data-driven solutions.
- Work alongside global counterparts to solve data-intensive problems using standard analytical frameworks and tools.
- Be encouraged and expected to innovate and be creative in your data analysis, problem-solving, and presentation of solutions.
- Network and collaborate with a broad range of internal business units to define and deliver joint solutions.
- Work alongside customers to leverage cutting-edge technology (machine learning, streaming analytics, and ‘real’ big data) to creatively solve problems and disrupt existing business models.
In this role, we are looking for:
- A problem-solving mindset with the ability to understand business challenges and how to apply your analytics expertise to solve them.
- The unique person who can present complex mathematical solutions in a simple manner that most will understand, including customers.
- An individual excited by innovation and new technology, and eager to find ways to employ these innovations in practice.
- A team mentality, empowered by the ability to work with a diverse set of individuals.
- A Bachelor’s degree in Data Science, Math, Statistics, Computer Science or related field with an emphasis on analytics.
- 5+ Years professional experience in a data scientist/analyst role or similar.
- Proficiency in your statistics/analytics/visualization tool of choice, preferably the Microsoft Azure suite, including Azure ML Studio and Power BI, as well as R, Python, and SQL.
- Excellent communication, organizational transformation, and leadership skills
- Demonstrated excellence in Data Science, Business Analytics and Engineering