
Senior Executive - Analytics (Marketing Mix Model)

Agency job
via Merito
2 - 6 yrs
₹16L - ₹18L / yr
Bengaluru (Bangalore), Gurugram
Skills
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
Spotfire
Marketing Mix
Attribution Models
Python
R

Senior Executive - Analytics


Overview of the job:


Our client is the world’s largest media investment company and a part of WPP. It is a global digital transformation agency with 1,200 employees across 21 nations, whose team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce and traditional channels.


We are currently looking for a Senior Executive – Analytics to join us. In this role, you will have the opportunity to build, and be part of, the largest performance marketing setup in APAC. We are committed to fostering a culture of diversity and inclusion; our people are our strength, so we respect and nurture their individual talent and potential.


Reporting of the role - This role reports to the Director - Analytics.


3 best things about the job:


1. Being responsible for data & analytics projects and developing data strategies by diving into the data, extrapolating insights and providing guidance to clients


2. Building and being part of a dynamic team


3. Being part of a global organisation with rapid growth opportunities


Responsibilities of the role:


• Build Marketing-Mix and Multi-Touch Attribution models using a range of tools, both free and paid (a simplified sketch follows this list).

• Work hands-on with large data sets to produce structured data sets for analysis.

• Design and build visualizations, dashboards and reports for both internal and external clients using Tableau, Power BI, Datorama or R Shiny/Python.
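
As an illustration of the kind of modelling involved, here is a minimal marketing-mix sketch in Python. It assumes a hypothetical weekly dataset (`mmm_weekly.csv`) with per-channel spend columns and a `sales` column; the column names, the adstock rate and the use of statsmodels OLS are illustrative assumptions, not a prescribed toolchain.

```python
# Minimal marketing-mix modelling sketch (illustrative only).
# Assumes a weekly CSV with columns: sales, tv_spend, search_spend, social_spend.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def adstock(spend, rate=0.5):
    """Apply simple geometric adstock (carry-over) to a spend series."""
    out = np.zeros(len(spend))
    for t, x in enumerate(spend):
        out[t] = x + (rate * out[t - 1] if t > 0 else 0.0)
    return out

df = pd.read_csv("mmm_weekly.csv")  # hypothetical input file
channels = ["tv_spend", "search_spend", "social_spend"]

# Adstock + log transform to capture carry-over and diminishing returns.
X = pd.DataFrame({c: np.log1p(adstock(df[c].values)) for c in channels})
X = sm.add_constant(X)
y = df["sales"]

model = sm.OLS(y, X).fit()
print(model.summary())  # coefficients approximate per-channel contributions
```

Real marketing-mix work layers in seasonality, pricing, saturation curves and holdout validation; the sketch only shows the general shape of the workflow.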


What you will need:


• Degree in Mathematics, Statistics, Economics, Engineering, Data Science, Computer Science or another quantitative field.

• 2-3 years' experience in Marketing/Data Analytics or a related field, with hands-on experience in building Marketing-Mix and Attribution models.

• Proficiency in one or more coding languages – preferred languages: Python, R.

• Proficiency in one or more visualization tools – Tableau, Datorama, Power BI.

• Proficiency in SQL.

• Proficiency with one or more statistical tools is a plus – for example SPSS, SAS, MATLAB, Mathcad.

• Working experience with big data technologies (Hive/Hadoop) is a plus.


Similar jobs

Publicis Sapient
10 recruiters
Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more


Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala or Python; experience in data ingestion, integration and wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms.


Role & Responsibilities:

Your role is focused on the design, development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation

Experience Guidelines:

Mandatory Experience and Competencies:

1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to at least one cloud platform and its related data services (AWS / Azure / GCP).

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required to build end-to-end data pipelines (a minimal pipeline sketch follows this list).

4. Strong experience in at least one of the programming languages Java, Scala or Python; Java preferred.

5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed in, and with working knowledge of, data-platform-related services on at least one cloud platform, including IAM and data security.
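
For illustration, here is a minimal batch-ingestion step of the kind such pipelines are built from, sketched in PySpark (one of the listed skills). The source path, the schema-free JSON input and the Parquet output location are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark batch-ingestion sketch (illustrative; paths are placeholders).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Ingest raw JSON events, clean them, and aggregate daily revenue per country.
raw = spark.read.json("s3a://example-bucket/raw/orders/")  # hypothetical source
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)
daily = clean.groupBy("order_date", "country").agg(F.sum("amount").alias("revenue"))

# Write partitioned Parquet for downstream analytics / Hive tables.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_revenue/"
)
spark.stop()
```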


Preferred Experience and Knowledge (Good to Have):

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures.

4. Performance tuning and optimization of data pipelines.

5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality.

6. Cloud data specialty and other related Big Data technology certifications.


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Bengaluru (Bangalore)
5 - 9 yrs
₹10L - ₹18L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+10 more

Requirements

Experience

  • 5+ years of professional experience in implementing MLOps frameworks to scale up ML in production.
  • Hands-on experience with Kubernetes, Kubeflow, MLflow, SageMaker and other ML model experiment-management tools, covering training, inference and evaluation (a minimal tracking sketch follows this list).
  • Experience in ML model serving (TorchServe, TensorFlow Serving, NVIDIA Triton Inference Server, etc.).
  • Proficiency with ML model training frameworks (PyTorch, PyTorch Lightning, TensorFlow, etc.).
  • Experience with GPU computing for data and model-training parallelism.
  • Solid software engineering skills in developing systems for production.
  • Strong expertise in Python.
  • Experience building end-to-end data systems as an ML Engineer, Platform Engineer or equivalent.
  • Experience working with cloud data processing technologies (S3, ECR, Lambda, AWS, Spark, Dask, Elasticsearch, Presto, SQL, etc.).
  • Geospatial / remote sensing experience is a plus.
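
As a rough illustration of the experiment-management side of this role, here is a minimal MLflow tracking sketch. The experiment name, parameters and model are hypothetical; it only shows the log-params/metrics/model pattern, not this team's actual setup.

```python
# Minimal MLflow experiment-tracking sketch (illustrative only).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name
with mlflow.start_run():
    params = {"n_estimators": 200, "learning_rate": 0.05}
    model = GradientBoostingClassifier(**params).fit(X_train, y_train)

    mlflow.log_params(params)  # record hyperparameters
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # version the trained model artifact
```
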
Matellio India Private Limited
Posted by Harshit Sharma
Remote only
3 - 6 yrs
₹3L - ₹15L / yr
Python
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
+2 more
This role is primarily responsible for building AI/ML models and cognitive applications.

Principal Accountabilities:

1. Good communication skills and the ability to convert business requirements into functional requirements.

2. Develop data-driven insights and machine learning models to identify and extract facts from sales, supply chain and operational data.

3. Sound knowledge of and experience in statistical and data mining techniques: regression, random forests, boosting trees, time series forecasting, etc. (a small illustrative sketch follows this list).

4. Experience with SOTA deep learning techniques to solve NLP problems.

5. End-to-end data collection, model development and testing, and integration into production environments.

6. Build and prototype analysis pipelines iteratively to provide insights at scale.

7. Experience in querying different data sources.

8. Partner with developers and business teams on business-oriented decisions.

9. We are looking for someone who dares to move forward even when the path is not clear and who is creative in overcoming challenges in the data.
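
To make the statistical-modelling expectation concrete, below is a minimal scikit-learn sketch that fits a random forest to lagged features of a demand series. The synthetic data, lag choice and model settings are illustrative assumptions only.

```python
# Minimal random-forest forecasting sketch on a synthetic demand series (illustrative).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
n = 200
demand = 100 + 10 * np.sin(np.arange(n) * 2 * np.pi / 12) + rng.normal(0, 3, n)
df = pd.DataFrame({"demand": demand})

# Use the previous three periods as features to predict the next value.
for lag in (1, 2, 3):
    df[f"lag_{lag}"] = df["demand"].shift(lag)
df = df.dropna()

X, y = df[["lag_1", "lag_2", "lag_3"]], df["demand"]
split = int(len(df) * 0.8)  # simple time-ordered train/test split
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X.iloc[:split], y.iloc[:split])

preds = model.predict(X.iloc[split:])
print("MAE:", mean_absolute_error(y.iloc[split:], preds))
```
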
Archwell
Agency job
via AVI Consulting LLP by Sravanthi Puppala
Mysore
2 - 8 yrs
₹1L - ₹15L / yr
Snowflake
Python
SQL
Amazon Web Services (AWS)
Windows Azure
+6 more

Title:  Data Engineer – Snowflake

 

Location: Mysore (Hybrid model)

Experience: 2-8 yrs

Type: Full Time

Walk-in date: 25th Jan 2023 @Mysore 

 

Job Role: We are looking for an experienced Snowflake developer to join our team as a Data Engineer and work as part of a team to help design and develop data-driven solutions that deliver insights to the business. The ideal candidate is a data pipeline builder and data wrangler who enjoys building data-driven systems from the ground up to drive analytical solutions. You will be responsible for building and optimizing our data pipelines and for building automated processes for production jobs, and you will support our software developers, database architects, data analysts and data scientists on data initiatives.

 

Key Roles & Responsibilities:

  • Use advanced Snowflake, Python and SQL to extract data from source systems for ingestion into a data pipeline (a minimal sketch follows this list).
  • Design, develop and deploy scalable and efficient data pipelines.
  • Analyze and assemble large, complex datasets that meet functional and non-functional business requirements.
  • Identify, design and implement internal process improvements – for example, automating manual processes, optimizing data delivery, and re-designing data platform infrastructure for greater scalability.
  • Build the infrastructure required for optimal extraction, loading and transformation (ELT) of data from various data sources, using AWS and Snowflake and leveraging Python or SQL.
  • Monitor cloud-based systems and components for availability, performance, reliability, security and efficiency.
  • Create and configure appropriate cloud resources to meet the needs of end users.
  • As needed, document topology, processes and solution architecture.
  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies.
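
As a rough sketch of the extract-and-load step described above, here is a minimal example using the Snowflake Python connector. The account, credentials, stage and table names are placeholders, and staging a local CSV via PUT/COPY INTO is just one of several possible loading patterns.

```python
# Minimal Snowflake load sketch using the Python connector (all identifiers are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_USER",
    password="***",                # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)

try:
    cur = conn.cursor()
    # Stage a local extract and bulk-load it into a raw table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS ORDERS_RAW "
        "(ORDER_ID STRING, AMOUNT NUMBER, ORDER_TS TIMESTAMP)"
    )
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS_RAW OVERWRITE = TRUE")
    cur.execute("COPY INTO ORDERS_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
finally:
    conn.close()
```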

 

Qualifications & Experience Requirements:

  • Bachelor's degree in computer science, computer engineering or a related field.
  • 2-8 years of experience working with Snowflake.
  • 2+ years of experience with AWS services.
  • Candidate should be able to write stored procedures and functions in Snowflake.
  • At least 2 years' experience as a Snowflake developer.
  • Strong SQL knowledge.
  • Data ingestion into Snowflake using Snowflake procedures.
  • ETL experience is a must (any tool).
  • Candidate should be familiar with Snowflake architecture.
  • Experience working on a migration project.
  • Data warehousing concepts (optional).
  • Experience with cloud data storage and compute components, including Lambda functions, EC2 instances and containers.
  • Experience with data pipeline and workflow management tools such as Airflow.
  • Experience cleaning, testing and evaluating data quality from a wide variety of ingestible data sources.
  • Experience working with Linux and UNIX environments.
  • Experience profiling data, with and without data definition documentation.
  • Familiarity with Git.
  • Familiarity with issue-tracking systems like JIRA or Trello.
  • Experience working in an agile environment.

Desired Skills:

  • Experience in Snowflake. Must be willing to be Snowflake certified in the first 3 months of employment.
  • Experience with a stream-processing system: Snowpipe
  • Working knowledge of AWS or Azure
  • Experience in migrating from on-prem to cloud systems

MSMEx
6 recruiters
Posted by Sujata Ranjan
Remote, Mumbai, Pune
4 - 6 yrs
₹5L - ₹12L / yr
Data Analytics
Data Analysis
Data Analyst
SQL
Python
+4 more

We are looking for a Data Analyst who will oversee organisational data analytics. This will require you to design and help implement the data analytics platform that keeps the organisation running. The team will be the go-to for all data needs for the app, so we are looking for a self-starter who is hands-on and yet able to abstract problems and anticipate data requirements.
This person should be a very strong technical data analyst who can design and implement data systems on their own. They also need to be proficient in business reporting and should have a keen interest in providing the data needed by the business.

 

Tool familiarity: SQL, Python, Mixpanel, Metabase, Google Analytics, CleverTap, App Analytics

Responsibilities

  • Own processes and frameworks for metrics, analytics, experimentation and user insights, and lead the data analytics team
  • Align metrics across teams to make them actionable and to promote accountability
  • Build data-based frameworks for assessing and strengthening product-market fit
  • Identify viable growth strategies through data and experimentation
  • Run experiments for product optimisation and for understanding user behaviour
  • Take a structured approach to deriving user insights and answering questions using data
  • Work closely with the technical and business teams to get all of this implemented

Skills

  • 4 to 6 years in a relevant data analytics role at a product-oriented company
  • Highly organised, technically sound and good at communication
  • Ability to handle and build for cross-functional data requirements and interactions with teams
  • Great with Python and SQL
  • Can build and mentor a team
  • Knowledge of key business metrics such as cohorts, engagement cohorts, LTV, ROAS and ROE

 

Eligibility

B.Tech or M.Tech in Computer Science/Engineering from a Tier 1 or Tier 2 college

 

Good knowledge of data analytics and data visualization tools. A formal certification would be an added advantage.

We are more interested in what you CAN DO than your location, education, or experience levels.

 

Send us your code samples / GitHub profile / published articles if applicable.

Graphene Services Pte Ltd
Posted by Swetha Seshadri
Remote, Bengaluru (Bangalore)
3 - 7 yrs
Best in industry
PyTorch
Deep Learning
Natural Language Processing (NLP)
Python
Machine Learning (ML)
+8 more
ML Engineer
WE ARE GRAPHENE

Graphene is an award-winning AI company, developing customized insights and data solutions for corporate clients. With a focus on healthcare, consumer goods and financial services, our proprietary AI platform is disrupting market research with an approach that allows us to get into the mind of customers to a degree unprecedented in traditional market research.

Graphene was founded by corporate leaders from Microsoft and P&G and works closely with the Singapore Government & universities in creating cutting edge technology. We are gaining traction with many Fortune 500 companies globally.

Graphene has a 6-year track record of delivering financially sustainable growth and is one of the few start-ups which are self-funded, yet profitable and debt free.

We already have a strong bench of leaders in place. Now we are looking to groom more talent for our expansion into the US. Join us and take your growth, and ours, to the next level!

 

WHAT WILL THE ENGINEER-ML DO?

 

  • Primary Purpose: As part of a highly productive and creative AI (NLP) analytics team, optimize algorithms/models for performance and scalability, engineer & implement machine learning algorithms into services and pipelines to be consumed at web-scale
  • Daily Grind: Interface with data scientists, project managers, and the engineering team to achieve sprint goals on the product roadmap, and ensure healthy models, endpoints and CI/CD.
  • Career Progression: Senior ML Engineer, ML Architect

 

YOU CAN EXPECT TO

  • Work in a product-development team capable of independently authoring software products.
  • Guide junior programmers, set up the architecture, and follow modular development approaches.
  • Design and develop code that is well documented.
  • Optimize the application for maximum speed and scalability.
  • Adhere to information security and DevOps best practices.
  • Research and develop new approaches to problems.
  • Design and implement schemas and databases for the AI application.
  • Cross-pollinate with other teams.

 

HARD AND SOFT SKILLS

Must Have

  • Problem-solving abilities
  • Extremely strong programming background – data structures and algorithms
  • Advanced machine learning: TensorFlow, Keras
  • Python, spaCy, NLTK, Word2Vec, graph databases, knowledge graphs, BERT (and derived models), hyperparameter tuning (a small NLP sketch follows this list)
  • Experience with OOP and design patterns
  • Exposure to RDBMS/NoSQL
  • Test-driven development methodology
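
For a flavour of the NLP tooling listed above, here is a minimal spaCy sketch that extracts named entities and noun chunks from a snippet of text. The model name `en_core_web_sm` and the sample sentence are illustrative; this is not a description of Graphene's actual pipeline.

```python
# Minimal spaCy NLP sketch (illustrative; requires `python -m spacy download en_core_web_sm`).
import spacy

nlp = spacy.load("en_core_web_sm")  # small English pipeline, assumed installed
text = "Graphene works with Fortune 500 clients in Singapore on healthcare analytics."

doc = nlp(text)

# Named entities with their labels (ORG, GPE, etc.).
for ent in doc.ents:
    print(ent.text, ent.label_)

# Noun chunks are often a useful starting point for knowledge-graph nodes.
print([chunk.text for chunk in doc.noun_chunks])
```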

 

Good to Have

  • Working in cloud-native environments (preferably Azure)
  • Microservices
  • Enterprise Design Patterns
  • Microservices Architecture
  • Distributed Systems

Taliun
3 recruiters
Posted by Pankaj G
Pune
6 - 11 yrs
₹3L - ₹18L / yr
Data Analytics
Business Intelligence (BI)
Microsoft Business Intelligence (MSBI)
Google Analytics
MongoDB
+1 more
Job Description:

  • 5 to 11 years of experience.
  • Experienced in any of the BI tools, such as Power BI or QlikView.
  • Writing SQL queries/functions/procedures on big data.
  • Apt in databases – NoSQL (MongoDB, Cassandra) or relational (MySQL, PostgreSQL, SQL Server).
  • Able to analyze, clean, massage, cleanse and organize raw (big) data.
  • Manage security for data on the AWS or Azure cloud platform.
  • Create, validate and maintain optimal data pipelines; assemble large, complex data sets.
  • Help in structuring data for upstream/downstream processing.

US Healthcare
Agency job
via turtlebowl by swati m
Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹11L / yr
skill iconData Analytics
Relational Database (RDBMS)
Dashboard Manager
Reporting
Trend analysis
+1 more
About - 

 
Relevant years of experience - Minimum 4-8 years of experience in data analysis, data reporting, and identifying and analyzing patterns/trends.
 
Knowledge Skill Sets -
 
• Experience with Tableau dashboards

• Careful and attentive to detail

• Willing and eager to call out mistakes

• Beginner to intermediate knowledge of relational databases, reporting, and business intelligence

• Professional communicator

• Inquisitive/curious – readily asks questions about anything that doesn’t make sense or feel right

• Good interpersonal skills with a proven ability to communicate effectively (both written and verbal)

• Well-developed MS Excel skills

• Displays awareness of the need for confidentiality in sensitive matters

• An eye for detail
 
Role Description - 
 
• Execute tasks assigned by the reporting manager and/or Bedford SPOC

• Identify, analyze, and interpret trends or patterns

• Audit and report discrepancies/inconsistencies in Tableau reports/dashboards

• Publish weekly/monthly reports in a pre-defined format and frequency to the reporting manager
 
Job Purpose - 
• Prepare reports using Tableau for delivery to clients

• Adjust parameters and prepare custom reports using previously built dashboards

• Print reports to PDF and deliver to folders on a predetermined schedule

• Become familiar with Tableau – our clients, created workbooks, parameters, filters, and databases

• QA existing dashboards and look for inconsistencies in naming, filters, charts, tables, etc.
 
Note - Looking for an immediate joiner with a notice period of 30 days.
Timing - US shift (6 p.m. - 3.30 a.m.)
Benefits - Transport facility + Night shift Allowance.
Location - Domlur.
Working - 5 days.
 
Reach out to me ASAP with an updated resume if you are exploring this opportunity.
--
Thanks and Regards,
M.Swati 
Associate Consultant
 
#intelligenthiring
India | Singapore 
www.turtlebowl.com

Pluto Seven Business Solutions Pvt Ltd
Posted by Sindhu Narayan
Bengaluru (Bangalore)
3 - 9 yrs
₹6L - ₹18L / yr
MySQL
Python
Big Data
Google Cloud Storage
API
+3 more
Data Engineer: Pluto7 is a services and solutions company focused on building ML, AI and analytics solutions to accelerate business transformation. We are a Premier Google Cloud Partner, serving the retail, manufacturing, healthcare and hi-tech industries. We’re seeking passionate people to work with us to change the way data is captured, accessed and processed, to enable data-driven, insightful decisions.

Must-have skills:

  • Hands-on experience with database systems (structured and unstructured).
  • Programming in Python, R, SAS.
  • Overall knowledge of and exposure to architecting solutions on cloud platforms like GCP, AWS and Microsoft Azure.
  • Develop and maintain scalable data pipelines, with a focus on writing clean, fault-tolerant code.
  • Hands-on experience in data model design and in developing BigQuery/SQL (any variant) stored procedures (a minimal BigQuery sketch follows this description).
  • Optimize data structures for efficient querying of those systems.
  • Collaborate with internal and external data sources to ensure integrations are accurate, scalable and maintainable.
  • Collaborate with business intelligence/analytics teams on data mart optimizations, query tuning and database design.
  • Execute proofs of concept to assess strategic opportunities and future data extraction and integration capabilities.
  • At least 2 years of experience building applications, solutions and products based on analytics.
  • Data extraction, data cleansing and transformation.
  • Strong knowledge of REST APIs, HTTP servers and MVC architecture.
  • Knowledge of continuous integration / continuous deployment.

Preferred but not required:

  • Machine learning and deep learning experience.
  • Certification on any cloud platform.
  • Experience with data migration from on-prem to cloud environments.
  • Exceptional analytical, quantitative, problem-solving and critical-thinking skills.
  • Excellent verbal and written communication skills.

Work location: Bangalore
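
Purely as an illustration of the GCP/BigQuery side of such pipelines, here is a minimal sketch using the google-cloud-bigquery Python client. The project, dataset and table names are placeholders, and credentials are assumed to come from the environment.

```python
# Minimal BigQuery query sketch (illustrative; project/dataset/table names are placeholders).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes ADC credentials are configured

# Aggregate daily revenue from a hypothetical raw orders table.
sql = """
    SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
    FROM `example-project.raw.orders`
    GROUP BY order_date
    ORDER BY order_date
"""
for row in client.query(sql).result():  # .result() waits for the job and returns rows
    print(row.order_date, row.revenue)
```
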
INSTAFUND INTERNET PRIVATE LIMITED
Posted by Pruthiraj Rath
Chennai
1 - 3 yrs
₹3L - ₹6L / yr
React.js
JavaScript
Python
LAMP Stack
MongoDB
+2 more
At Daddyswallet, we’re using today’s technology to bring significant disruptive innovation to the financial industry. We focus on improving the lives of consumers by delivering simple, honest and transparent financial products. We are looking for a full-stack developer with skills mainly in React Native, React.js, Python and Node.js.