Senior Product Analyst
Pampers Start Up Team
India / Remote Working
Team Description
Our internal team focuses on app development, with data a growing area within the structure. We have a clear vision and strategy spanning App Development, Data, Testing, Solutions and Operations. The data team sits across the UK and India, whilst other teams are based in Dubai, Lebanon, Karachi and various cities in India.
Role Description
In this role you will use a range of tools and technologies, working primarily on data design, data governance, reporting and analytics for the Pampers App.
This is a unique opportunity for an ambitious candidate to join a growing business where they will get exposure to a diverse set of assignments, can contribute fully to the growth of the business and where there are no limits to career progression and reward.
Responsibilities
● Be the Data Steward and drive governance, maintaining a full understanding of all the data that flows through the app to all systems
● Work with the campaign team to apply data fixes when issues arise with campaigns
● Investigate and troubleshoot issues with the product and campaigns, giving clear root cause analysis (RCA) and impact analysis
● Document data, create data dictionaries and be the “go to” person for understanding what data flows where
● Build dashboards and reports using Amplitude and Power BI, and present them to key stakeholders
● Carry out ad hoc data investigations into issues with the app, querying data in BigQuery/SQL/CosmosDB, and present findings back
● Translate analytics into a clear PowerPoint deck with actionable insights
● Write up clear documentation on processes
● Innovate with new processes or ways of providing analytics and reporting
● Help the data lead to find new ways of adding value
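The ad hoc investigations listed above usually begin with an event-level SQL query. A minimal sketch follows (hypothetical table and column names; BigQuery uses very similar SQL, but SQLite stands in here so the snippet is self-contained):

```python
import sqlite3

# Hypothetical app-event table; real table/column names in BigQuery will differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE app_events (
        user_id    TEXT,
        event_name TEXT,
        event_date TEXT
    )
""")
conn.executemany(
    "INSERT INTO app_events VALUES (?, ?, ?)",
    [
        ("u1", "screen_view", "2023-04-01"),
        ("u1", "cta_click",   "2023-04-01"),
        ("u2", "screen_view", "2023-04-01"),
        ("u2", "screen_view", "2023-04-02"),
    ],
)

# Count distinct users per event per day -- a typical first step when
# checking whether a campaign or app release broke event tracking.
rows = conn.execute("""
    SELECT event_date, event_name, COUNT(DISTINCT user_id) AS users
    FROM app_events
    GROUP BY event_date, event_name
    ORDER BY event_date, event_name
""").fetchall()
for row in rows:
    print(row)
```

A sudden drop in distinct users for one event on one day is often the first visible symptom of a tracking or campaign issue.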
Requirements
● Bachelor’s degree and a minimum of 4 years’ experience in an analytical role, preferably in product analytics with consumer app data
● Strong SQL Server and Power BI skills required
● Experience with most or all of these tools: SQL Server, Python, Power BI, BigQuery
● Understanding of mobile app data (events, CTAs, screen views, etc.)
● Knowledge of data architecture and ETL
● Experience in analyzing customer behavior and providing insightful recommendations
● Self-starter, with a keen interest in technology and highly motivated towards success
● Must be proactive and prepared to speak up in meetings
● Must show initiative and desire to learn business subjects
● Able to work independently and provide updates to management
● Strong analytical and problem-solving capabilities with meticulous attention to detail
● Excellent problem-solving skills; proven teamwork and communication skills
● Experience working in a fast-paced, “start-up like” environment
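Mobile app data of the kind required above typically arrives as per-user event records. A minimal sketch of the first counts an analyst runs over such data (field names are illustrative, not a real SDK schema):

```python
from collections import Counter

# Hypothetical event payloads as emitted by a mobile analytics SDK
# (field names are illustrative, not a real schema).
events = [
    {"user": "u1", "type": "screen_view", "screen": "home"},
    {"user": "u1", "type": "cta_click",   "cta": "signup"},
    {"user": "u2", "type": "screen_view", "screen": "home"},
    {"user": "u2", "type": "screen_view", "screen": "rewards"},
]

# Basic shape questions an analyst asks of app data:
# how many of each event type, and which screens are viewed most?
type_counts = Counter(e["type"] for e in events)
screen_counts = Counter(e["screen"] for e in events if e["type"] == "screen_view")

print(type_counts)    # screen_view: 3, cta_click: 1
print(screen_counts)  # home: 2, rewards: 1
```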
Desirable
- Knowledge of mobile analytical tools (Segment, Amplitude, Adjust, Braze and Google Analytics)
- Knowledge of loyalty data
Similar jobs
ADF Developer with top conglomerates for Kochi location - Air India
Conducting F2F interviews on 22nd April 2023.
Experience - 2-12 years.
Location - Kochi only (work from the office only)
Notice period - 1 month only.
If you are interested, please share the following information at your earliest convenience:
Technical Knowledge (Must Have)
- Strong experience in SQL / HiveQL / AWS Athena
- Strong expertise in the development of data pipelines (SnapLogic is preferred).
- Design, Development, Deployment and administration of data processing applications.
- Good exposure to AWS and Azure cloud computing environments.
- Knowledge of Big Data, AWS cloud architecture, best practices, security, governance, metadata management, data quality, etc.
- Data extraction from various firm sources (RDBMS, unstructured data sources) and loading to the data lake following best practices.
- Knowledge of Python
- Good knowledge of NoSQL technologies (Neo4j/MongoDB)
- Experience/knowledge in SnapLogic (ETL Technologies)
- Working knowledge on Unix (AIX, Linux), shell scripting
- Experience/knowledge in data modeling and database development
- Experience/knowledge creation of reports and dashboards in Tableau/ PowerBI
- Cloud: GCP
- Must have: BigQuery, Python, Vertex AI
- Nice to have services: Dataplex
- Experience level: 5-10 years
- Preferred industry (nice to have): Manufacturing - B2B sales
Key Responsibilities:
- Understand Business requirements in BI context and design data models to transform raw data into meaningful insights
- Create dashboards and interactive visual reports using Power BI
- Analyze data and present data through reports that aid decision-making
- Design, develop, test, and deploy Power BI scripts and perform detailed analytics
- Communicate with business units and the leadership team, aiming at better visualization and transparency of data analytics, driving insightful business strategies and improved business performance
- Partner with other Technology teams to create Data models, Charts & Reports in a scalable and controlled environment
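Transforming raw data into the grain a dashboard visual consumes, as described above, can be illustrated in miniature (column names are made up; in practice this happens in Power BI's data model or in upstream SQL):

```python
from collections import defaultdict

# Illustrative raw fact rows (region/product/amount are made-up names).
sales = [
    {"region": "North", "product": "A", "amount": 120.0},
    {"region": "North", "product": "B", "amount":  80.0},
    {"region": "South", "product": "A", "amount": 200.0},
]

# Aggregate to the grain a dashboard visual would consume:
# total sales per region.
totals = defaultdict(float)
for row in sales:
    totals[row["region"]] += row["amount"]

summary = dict(totals)
print(summary)  # {'North': 200.0, 'South': 200.0}
```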
Knowledge/Experience:
- A bachelor’s degree in data science, data analytics, computer science, mathematics, statistics, econometrics, financial engineering, computer or electrical engineering, or other related quantitative fields
- 4-5 years of experience in Data Preparation & Analysis and Business Intelligence using data visualization tools (e.g. Power BI), and sound knowledge of JavaScript and SQL
- An understanding of Machine Learning and Artificial Intelligence (AI)
- Experience with analysis in FMCG or Electronic sector is preferred
- Strong knowledge of Microsoft Excel & PowerPoint
Skills/Qualifications:
- Strong analytical skills and ability to work in a fast-paced environment
- Business acumen to understand complex business problems and translate them into analysis that leads to actionable business insights
- Strong communication skills (oral and written) to explain complex data and analytical problems
- Willingness to learn and to explore new ideas, with independent thinking and attention to details
- Strong work ethic, a can-do attitude and excellent interpersonal skills
- Handling the survey scripting process using survey software platforms such as Toluna, QuestionPro and Decipher.
- Mining large & complex data sets using SQL, Hadoop, NoSQL or Spark.
- Delivering complex consumer data analysis using software such as R, Python and Excel.
- Working on basic statistical analysis such as t-tests and correlation.
- Performing more complex data analysis processes through machine learning techniques such as:
- Classification
- Regression
- Clustering
- Text Analysis
- Neural Networks
- Creating interactive dashboards using software like Tableau or any other tool you are able to use.
- Working on Statistical and mathematical modelling, application of ML and AI algorithms
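The basic statistical analysis mentioned above (t-test and correlation) can be sketched from first principles (illustrative samples; in practice scipy or R would be used):

```python
import math

# Illustrative samples (made-up numbers).
group_a = [2.1, 2.5, 2.3, 2.7]
group_b = [3.0, 3.4, 3.1, 3.5]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # Sample variance (divide by n - 1).
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def t_statistic(a, b):
    # Two-sample t statistic with pooled variance (equal sample sizes):
    # t = (mean_a - mean_b) / sqrt(s_p^2 * (2 / n))
    n = len(a)
    sp2 = (variance(a) + variance(b)) / 2
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (2 / n))

def pearson(xs, ys):
    # Pearson correlation coefficient.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(t_statistic(group_a, group_b), 2))  # -4.84: the groups differ
print(round(pearson(group_a, group_b), 2))      # 0.98: strong positive correlation
```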
What you need to have:
- Bachelor's or Master's degree in a highly quantitative field (CS, machine learning, mathematics, statistics, economics) or equivalent experience.
- An opportunity for someone who is eager to prove his or her data analytics skills with one of the biggest FMCG market players.
Job Description
The applicant must have a minimum of 5 years of hands-on IT experience, working on a full software lifecycle in Agile mode.
Good to have experience in data modeling and/or systems architecture.
Responsibilities will include technical analysis, design, development and enhancements.
You will participate in all/most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing and maintaining data processing using Python, DB2, Greenplum, Autosys and other technologies
Skills/Expertise Required:
Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).
Good experience in writing stored procedures, integration of database processing, tuning and optimizing database queries.
Strong knowledge of table partitions, high-performance loading and data processing.
Good to have hands-on experience working with Perl or Python.
Hands-on development using the Spark / KDB / Greenplum platform will be a strong plus.
Designing, developing, maintaining and supporting Data Extract, Transform and Load (ETL) software using Informatica, Shell Scripts, DB2 UDB and Autosys.
Coming up with system architecture/re-design proposals for greater efficiency and ease of maintenance and developing software to turn proposals into implementations.
Need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills
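The extract-transform-load flow described above can be sketched in miniature (Informatica, Autosys and DB2 specifics omitted; SQLite and an in-memory CSV stand in, and all names are illustrative):

```python
import csv
import io
import sqlite3

# Extract: a raw feed, here faked as an in-memory CSV (illustrative data).
raw_feed = io.StringIO("trade_id,amount\n1,100.50\n2,-20.00\n3,75.25\n")

# Transform: parse, type-cast, and filter out negative amounts.
rows = [
    (int(r["trade_id"]), float(r["amount"]))
    for r in csv.DictReader(raw_feed)
    if float(r["amount"]) >= 0
]

# Load: bulk-insert into a target table (SQLite stands in for DB2/Greenplum).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)", rows)

# Reconciliation check: row count and control total after the load.
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM trades").fetchone()
print(total)
```

In production the same three steps run under a scheduler such as Autosys, with the reconciliation totals compared against the source system.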
Datametica is looking for talented SQL engineers who will receive training and the opportunity to work on Cloud and Big Data analytics.
Mandatory Skills:
- Strong in SQL development
- Hands-on at least one scripting language - preferably shell scripting
- Development experience in Data warehouse projects
Opportunities:
- Selected candidates will be provided training on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
- Participate in planning, implementation of solutions, and transformation programs from legacy system to a cloud-based system
- Work with the team on Analysis, High level and low-level design for solutions using ETL or ELT based solutions and DB services in RDS
- Work closely with the architect and engineers to design systems that effectively reflect business needs, security requirements, and service level requirements
- Own deliverables related to design and implementation
- Own Sprint tasks and drive the team towards the goal while understanding the change and release process defined by the organization.
- Excellent communication skills, particularly in presenting complex findings to audiences at various levels of the organization
- Ability to integrate research and best practices into problem avoidance and continuous improvement
- Must be able to perform as an effective member in a team-oriented environment, maintain a positive attitude, and achieve desired results while working with minimal supervision
Basic Qualifications:
- Minimum of 5 years of technical work experience in the implementation of complex, large-scale, enterprise-wide projects, including analysis, design, core development and delivery
- Minimum of 3 years of experience with expertise in Informatica ETL, Informatica PowerCenter, and Informatica Data Quality
- Experience with Informatica MDM tool is good to have
- Should be able to understand the scope of the work and ask for clarifications
- Should have advanced SQL skills, including complex PL/SQL coding skills
- Knowledge of Agile is a plus
- Well-versed with SOAP, web services, and REST APIs
- Hands-on development using Java would be a plus