DATA ANALYST
About:
We allow customers to "buy now and pay later" for goods and services purchased through online and offline portals. It's a rapidly growing organization opening up new payment avenues for online and offline customers.
Role:
- Define and continuously refine the analytics roadmap.
- Build, deploy, and maintain the data infrastructure that supports all analysis, including the data warehouse and various data marts.
- Build, deploy, and maintain the predictive models and scoring infrastructure that power critical decision-management systems.
- Devise ways to gather more alternative data and build increasingly enhanced predictive models.
- Partner with business teams to systematically design experiments that continuously improve customer acquisition, minimize churn, reduce delinquency, and improve profitability.
- Provide data insights to all business teams through automated queries, MIS, etc.
Requirements:
- 4+ years of deep, hands-on analytics experience in a management consulting, start-up, financial services, or fintech company.
- Strong knowledge of SQL and Python.
- Deep knowledge of problem-solving approaches using analytical frameworks.
- Deep knowledge of frameworks for data management, deployment, and monitoring of performance metrics.
- Hands-on exposure to delivering improvements through test-and-learn methodologies.
- Excellent communication and interpersonal skills, with the ability to be pleasantly persistent.
Location: Mumbai
Company Description
UpSolve is a Gen AI and Vision AI startup that helps businesses solve their problems by building custom solutions that drive strategic business decisions. Whether your business is facing time constraints or a lack of resources, UpSolve can help. We build enterprise-grade AI solutions with a focus on increasing ROI.
Role Description
This is a full-time hybrid role for a Business Analyst located in Mumbai.
Please note: This is an onsite role and good communication skills are expected (oral + written)
Responsibilities
1. Understand existing system integrations for the client.
2. Map and identify gaps in existing systems.
3. Ideate, advise on, and implement AI solutions to optimize business processes.
4. Collaborate with multiple teams and stakeholders.
Qualifications
- MBA with focus on Business Analytics or Bachelor's degree in Computer Science or IT
- Minimum 4 Years of Experience
- Strong written, verbal and collaboration skills
- Immediate Joiner (Less than 5 days)
Work Location: Mumbai, Work from Office
● Knowledge of Excel, SQL, and writing code in Python.
● Experience with reporting and business intelligence tools such as Tableau and Metabase.
● Exposure to distributed analytics processing technologies (e.g. Hive, Spark) is desirable.
● Experience with Clevertap, Mixpanel, Amplitude, etc.
● Excellent communication skills.
● Background in market research and project management.
● Attention to detail.
● Problem-solving aptitude.
Your Day-to-Day
- Derive insights and drive major strategic projects to improve business metrics; take responsibility for cost efficiency and revenue management across the country
- Perform market research and post-mortem analyses of competitor expansion and market-penetration patterns
- Provide in-depth business analysis and data insights to internal stakeholders to help improve the business; derive and launch projects to close gaps between targeted and projected business metrics
- Optimize Carsome's C2B and B2C customer acquisition and dealer retention funnels; work closely with Marketing and Tech teams to create, produce, and implement creative digital marketing campaigns and drive CRM initiatives and strategies
- Analyse revenue flows and process large datasets to gather process insights and propose process improvements for Carsome across South-East Asia
- Lead commercial projects and process mapping, from conceptualization to completion, to build or re-engineer business models, tools, and processes
- Experience with analyses and insights on unit economics, COGS, and P&L is preferred but not mandatory
- Use business intelligence and data science tools (SQL, Tableau, or Python) to answer business questions
- Coordinate with the HQ Data Insights Team and manage internal stakeholders across departments to ensure smooth delivery of strategic projects
- Work across departments/functions (BI, DE, tech, pricing, finance, operations, marketing, CS, CX) on high-impact projects and support business expansion initiatives
Your Know-How
- At least a Bachelor's Degree in Accounting/Finance/Business or the equivalent.
- 3-5 years of experience in strategy / consulting / analytical / project management roles; experience in e-commerce, start-ups, or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO), or entrepreneurial experience, preferred; at least 2 years of experience leading a team
- Top-notch academics from a Tier 1 college (IIM / IIT/ NIT)
- Must have SQL/PostgreSQL/Tableau Experience.
- Excellent Market Research, reporting and analytical skills, including carrying out weekly and monthly reporting
- Experience working with a Data/Business Intelligence team
- Analytical mindset with ability to present data in a structured and informative way
- Enjoy a fast-paced environment and can align business objectives with product priorities
- Good to have: financial modelling, developing financial forecasts, and developing a financial/strategic plan or framework
About the company: Our client is an agency of the world's largest media investment company, which is part of WPP. It is a global digital transformation agency with 1,200 employees across 21 nations. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce, and traditional channels.
Job Location: Gurgaon/Bangalore
Responsibilities of the role:
Manage extraction of data sets from multiple marketing/database platforms and perform hygiene and quality-control steps, either via Datorama or in partnership with the Neo Technology team. Data sources will include web analytics tools, media analytics, customer databases, social listening tools, search tools, syndicated data, research and survey tools, etc.
Implement and manage data system architecture
Audit and manage data taxonomy/classifications from multiple systems and partners
Manage the extraction, loading, and transformation of multiple data sets
Cleanse all data and metrics; perform override updates where necessary
Execute all business rules according to requirements
Identify and implement opportunities for efficiency throughout the process
Manage and execute a thorough QA process and ensure quality and accuracy of all data and reporting deliverables
Manipulate and analyze "big" data sets synthesized from a variety of sources, including media platforms and marketing automation tools
Generate and manage all data visualizations and ensure data is presented accurately and is visually pleasing
Assist the analytics team in running insights reports as needed
Help maintain a performance platform and provide insights and ongoing recommendations
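The extract, cleanse, and QA cycle described in the responsibilities above can be sketched in plain Python. This is a minimal illustration only: the column names ("campaign", "impressions", "clicks") and the business rules are hypothetical, not taken from Datorama or any of the listed platforms.

```python
# Minimal sketch of an extract -> cleanse -> QA cycle.
# Column names and business rules are hypothetical placeholders.

def cleanse(rows):
    """Normalise raw metric rows: strip whitespace, coerce numerics,
    and drop rows that fail basic hygiene checks."""
    clean = []
    for row in rows:
        name = str(row.get("campaign", "")).strip()
        try:
            impressions = int(row["impressions"])
            clicks = int(row["clicks"])
        except (KeyError, ValueError, TypeError):
            continue  # fails hygiene: missing or non-numeric metric
        if not name or clicks > impressions:  # rule: CTR cannot exceed 100%
            continue
        clean.append({"campaign": name, "impressions": impressions, "clicks": clicks})
    return clean

def qa_report(raw, clean):
    """Simple QA step: reconcile raw vs. cleansed row counts."""
    return {"raw": len(raw), "clean": len(clean), "dropped": len(raw) - len(clean)}

raw = [
    {"campaign": " summer_sale ", "impressions": "1000", "clicks": "40"},
    {"campaign": "", "impressions": "500", "clicks": "10"},        # dropped: no name
    {"campaign": "promo", "impressions": "100", "clicks": "200"},  # dropped: clicks > impressions
    {"campaign": "brand", "impressions": "n/a", "clicks": "5"},    # dropped: non-numeric
]
clean = cleanse(raw)
report = qa_report(raw, clean)
```

In a real pipeline the raw rows would come from the platform APIs and the QA report would feed the reconciliation checks mentioned above; the shape of the loop, however, is the same.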
What you will need:
3+ years’ experience in an analytics position working with large amounts of data
Hands-on experience working with data visualization tools such as Datorama, Tableau, or PowerBI
Additional desirable skills include tag management experience, application coding experience, and a statistics background
Digital media experience background preferred, including knowledge of Doubleclick and web analytics tools
A 2-hour morning overlap with NY or Chicago (EST or CST time zones)
Excellent communication skills
Regards
Team Merito
The Merck Data Engineering Team is responsible for designing, developing, testing, and supporting automated end-to-end data pipelines and applications on Merck’s data management and global analytics platform (Palantir Foundry, Hadoop, AWS and other components).
The Foundry platform comprises multiple different technology stacks, hosted on Amazon Web Services (AWS) infrastructure or on premises in Merck's own data centers. Developing pipelines and applications on Foundry requires:
• Proficiency in SQL / Java / Python (Python required; all 3 not necessary)
• Proficiency in PySpark for distributed computation
• Familiarity with Postgres and ElasticSearch
• Familiarity with HTML, CSS, and JavaScript and basic design/visual competency
• Familiarity with common databases and connectivity (e.g. JDBC, MySQL, Microsoft SQL Server); not all types required
This position will be project based and may work across multiple smaller projects or a single large project utilizing an agile project methodology.
Roles & Responsibilities:
• Develop data pipelines by ingesting various data sources – structured and unstructured – into Palantir Foundry
• Participate in the end-to-end project lifecycle, from requirements analysis to go-live and operations of an application
• Act as a business analyst for developing requirements for Foundry pipelines
• Review code developed by other data engineers and check against platform-specific standards, cross-cutting concerns, coding and configuration standards and functional specification of the pipeline
• Document technical work in a professional and transparent way. Create high quality technical documentation
• Work out the best possible balance between technical feasibility and business requirements (the latter can be quite strict)
• Deploy applications on Foundry platform infrastructure with clearly defined checks
• Implementation of changes and bug fixes via Merck's change management framework and according to system engineering practices (additional training will be provided)
• DevOps project setup following Agile principles (e.g. Scrum)
• Besides working on projects, act as third-level support for critical applications; analyze and resolve complex incidents/problems; debug problems across the full Foundry stack and code based on Python, PySpark, and Java
• Work closely with business users, data scientists/analysts to design physical data models
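Palantir Foundry's pipeline APIs are proprietary, so as a hedged illustration of the ingestion pattern named above (normalising a structured and an unstructured source into one schema), here is a plain-Python sketch. The field names and sources are invented for the example, not Merck's or Foundry's.

```python
# Sketch: ingest a structured (CSV) and a semi-structured (JSON-lines) source
# into one common record schema. All field names are illustrative.
import csv
import io
import json

def ingest_csv(text):
    """Structured source: headed CSV columns mapped onto the common schema."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"source": "csv", "id": row["sample_id"], "value": float(row["measurement"])}

def ingest_jsonl(text):
    """Unstructured/semi-structured source: JSON lines with a looser shape,
    defaulting missing readings to 0.0."""
    for line in text.splitlines():
        if not line.strip():
            continue
        doc = json.loads(line)
        yield {"source": "jsonl", "id": str(doc.get("id")), "value": float(doc.get("reading", 0.0))}

csv_text = "sample_id,measurement\nA1,3.5\nA2,4.0\n"
jsonl_text = '{"id": 7, "reading": 1.25}\n{"id": 8}\n'

# Union of both sources under one schema, as a Foundry pipeline stage might produce.
records = list(ingest_csv(csv_text)) + list(ingest_jsonl(jsonl_text))
```

On Foundry the same normalisation would typically be expressed as a PySpark transform over datasets rather than generators over strings, but the schema-unification step is the core of the job either way.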
- Designing and coding the data warehousing system to desired company specifications
- Conducting preliminary testing of the warehousing environment before data is extracted
- Extracting company data and transferring it into the new warehousing environment
- Testing the new storage system once all the data has been transferred
- Troubleshooting any issues that may arise
- Providing maintenance support
- Consulting with data management teams to get a big-picture idea of the company’s data storage needs
- Presenting the company with warehousing options based on their storage needs
- Experience of 1-3 years in Informatica Power Center
- Excellent knowledge of Oracle Database and PL/SQL, including stored procedures, functions, user-defined functions, table partitioning, indexes, views, etc.
- Knowledge of SQL Server database
- Hands-on experience in Informatica PowerCenter and database performance tuning and optimization, including complex query-optimization techniques
- Understanding of ETL control frameworks
- Experience in UNIX shell/Perl Scripting
- Good communication skills, including the ability to write clearly
- Able to function effectively as a member of a team
- Proactive with respect to personal and technical development
- Excellent working knowledge of data warehousing / data migration using an ETL tool.
- Strong Data Integration, PostgreSQL/Oracle Database skills, Shell Scripting, Python programming, and development know-how.
- Hands-on experience in working with and generating XML documents.
- Good analytical and business process understanding capability.
- Familiar with data models, source-target data mapping, and transactional and master data concepts.
- Well-versed in high-level/detailed design and performance tuning of ETL jobs.
- Very good communication skills, interpersonal skills, stakeholder management skills, self-motivated, quick learner, team player.
- Exposure to After Sales Business Domain is highly preferred.
- Experience using HP ALM, Jira for ticketing.
- Experience with release management
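The XML-generation requirement in the list above can be sketched with Python's standard library. The element names (Orders/Order/Amount) are a made-up schema for illustration, not one from the role.

```python
# Sketch: generating an XML document from source records with the stdlib.
# Element and attribute names are hypothetical, not a real target schema.
import xml.etree.ElementTree as ET

def build_order_xml(orders):
    """Serialise a list of order dicts into a simple XML document."""
    root = ET.Element("Orders")
    for order in orders:
        node = ET.SubElement(root, "Order", attrib={"id": str(order["id"])})
        ET.SubElement(node, "Amount").text = f'{order["amount"]:.2f}'
    return ET.tostring(root, encoding="unicode")

xml_doc = build_order_xml([{"id": 1, "amount": 99.5}, {"id": 2, "amount": 10}])
```

An ETL job would typically build such documents record-by-record from a source query and validate them against the target XSD before handoff; `encoding="unicode"` here returns a plain string rather than bytes.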