Data Warehouse (DWH) Jobs in Mumbai


Apply to 9+ Data Warehouse (DWH) Jobs in Mumbai on CutShort.io. Explore the latest Data Warehouse (DWH) Job opportunities across top companies like Google, Amazon & Adobe.

Smartavya Analytica

Agency job
via Pluginlive by Joslyn Gomes
Mumbai
10 - 15 yrs
₹15L - ₹25L / yr
Data Warehousing
Data Warehouse (DWH)
ETL
Data Visualization
Big Data
+7 more

Experience: 12-15 years, with at least 7 years in Big Data, Cloud, and Analytics.

Key Responsibilities: 

  • Technical Project Management:
    o Lead the end-to-end technical delivery of multiple projects in Big Data, Cloud, and Analytics, and lead teams in technical solutioning, design, and development.
    o Develop detailed project plans, timelines, and budgets, ensuring alignment with client expectations and business goals.
    o Monitor project progress, manage risks, and implement corrective actions as needed to ensure timely, quality delivery.
  • Client Engagement and Stakeholder Management:
    o Build and maintain strong client relationships, acting as the primary point of contact for project delivery.
    o Understand client requirements, anticipate challenges, and provide proactive solutions.
    o Coordinate with internal and external stakeholders to ensure seamless project execution.
    o Communicate project status, risks, and issues to senior management and stakeholders clearly and promptly.
  • Team Leadership:
    o Lead and mentor a team of data engineers, analysts, and project managers.
    o Ensure effective resource allocation and utilization across projects.
    o Foster a culture of collaboration, continuous improvement, and innovation within the team.
  • Technical and Delivery Excellence:
    o Leverage data management expertise to guide technical conversations effectively; identify the technical support the team needs and resolve blockers, either through your own expertise or by networking with internal and external stakeholders.
    o Implement best practices in project management, delivery, and quality assurance.
    o Drive continuous improvement initiatives to enhance delivery efficiency and client satisfaction.
    o Stay updated on the latest trends and advancements in Big Data, Cloud, and Analytics technologies.

Requirements:

  • Experience in IT delivery management, particularly in Big Data, Cloud, and Analytics.
  • Strong knowledge of project management methodologies and tools (e.g., Agile, Scrum, PMP).
  • Excellent leadership, communication, and stakeholder management skills.
  • Proven ability to manage large, complex projects with multiple stakeholders.
  • Strong critical thinking skills and the ability to make decisions under pressure.

Preferred Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Relevant certifications in Big Data, Cloud platforms (e.g., GCP, Azure, AWS, Snowflake, Databricks), Project Management, or similar areas are preferred.
Fatakpay

2 recruiters
Posted by Ajit Kumar
Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
SQL
Python
Problem solving
Data Warehouse (DWH)
Excel VBA

1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.


2. Ability to search, extract, transform and load data from various databases, cleanse and refine data until it is fit-for-purpose


3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met


4. Prioritization of issues to meet deadlines while ensuring high-quality delivery


5. Ability to pull data and to perform ad hoc reporting and analysis as needed


6. Ability to adapt quickly to new and changing technical environments as well as strong analytical and problem-solving abilities


7. Strong interpersonal and presentation skills


SKILLS:


1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel


2. Experience working with senior decision-makers


3. Strong advanced SQL/MySQL and Python skills with the ability to fetch data from the Data Warehouse as per the stakeholder's requirement


4. Good knowledge of and experience in Excel VBA and advanced Excel


5. Good experience building analytical Tableau dashboards as per the stakeholders' reporting requirements


6. Strong communication/interpersonal skills
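As a rough illustration of the SQL-from-Python skill this role asks for, here is a minimal sketch of a parameterized ad hoc query. Everything here is hypothetical: the table, columns, and figures are invented, and SQLite stands in for the actual data warehouse.

```python
import sqlite3

# Stand-in warehouse: an in-memory SQLite DB with an invented "loans" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (city TEXT, amount REAL)")
conn.executemany("INSERT INTO loans VALUES (?, ?)",
                 [("Mumbai", 50000.0), ("Mumbai", 30000.0), ("Pune", 20000.0)])

def total_disbursed(conn, city):
    """Parameterized aggregate query, as an analyst might run ad hoc."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM loans WHERE city = ?", (city,)
    ).fetchone()
    return row[0]

print(total_disbursed(conn, "Mumbai"))  # 80000.0
```

Against a real warehouse only the connection object would change; the parameterized-query pattern (placeholders instead of string formatting) carries over directly.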


PERSONA:


1. Experience working on ad hoc requirements


2. Ability to adapt to shifting priorities


3. Experience in the fintech or e-commerce industry is preferred


4. Engineering background with 2+ years of experience as a Business Analyst for finance processes

Personal Care Product Manufacturing

Agency job
via Qrata by Rayal Rajan
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
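The pipeline responsibilities above can be sketched, in miniature, as a plain-Python extract/transform/load flow. This is only an illustration: the production stack described in the listing would more likely be Spark or Kafka, and every field and function name here is invented.

```python
from datetime import date

def extract():
    # Stand-in for reading from an upstream source (API, queue, files).
    return [
        {"sku": "A1", "qty": "3", "day": "2024-01-05"},
        {"sku": "a1", "qty": None, "day": "2024-01-05"},  # dirty row
        {"sku": "B2", "qty": "2", "day": "2024-01-06"},
    ]

def transform(rows):
    # Cleanse: drop rows missing qty, normalize SKU case, parse types.
    out = []
    for r in rows:
        if r["qty"] is None:
            continue
        out.append({"sku": r["sku"].upper(),
                    "qty": int(r["qty"]),
                    "day": date.fromisoformat(r["day"])})
    return out

def load(rows, warehouse):
    # Idempotent upsert keyed on (sku, day), a common warehouse pattern:
    # re-running the pipeline overwrites rather than duplicates.
    for r in rows:
        warehouse[(r["sku"], r["day"])] = r["qty"]

warehouse = {}
load(transform(extract()), warehouse)
print(len(warehouse))  # 2
```

The same three-stage shape scales up: swap the dict for warehouse tables and the list for a distributed dataset, and the idempotent-load discipline is what keeps retries safe.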


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Exponentia.ai

1 product
1 recruiter
Posted by Vipul Tiwari
Mumbai
7 - 10 yrs
₹13L - ₹19L / yr
Project Management
IT project management
Software project management
Business Intelligence (BI)
Data Warehouse (DWH)
+8 more

Role: Project Manager

Experience: 8-10 Years

Location: Mumbai


Company Profile:



Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are rapidly expanding across machine learning, Data Engineering and Analytics functions. Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.

One of the top partners of Databricks, Azure, Cloudera (a leading analytics player) and Qlik (a leader in BI technologies), Exponentia.ai has recently been awarded the ‘Innovation Partner Award’ by Qlik and the "Excellence in Business Process Automation Award" (IMEA) by Automation Anywhere.

Get to know more about us at http://www.exponentia.ai and https://in.linkedin.com/company/exponentiaai 


Role Overview:


·        The Project Manager will be responsible for the successful delivery of a range of projects in Business Intelligence, Data Warehousing, and Analytics/AI-ML.

·        The Project Manager is expected to manage the project and lead teams of BI engineers, data engineers, data scientists, and application developers.


Job Responsibilities:


·        Effort estimation, creating a project plan, planning milestones and activities, and tracking progress.

·        Identify risks and issues. Come up with a mitigation plan.

·        Status reporting to both internal and external stakeholders.

·        Communicate with all stakeholders.

·        Manage end-to-end project lifecycle - requirements gathering, design, development, testing and go-live.

·        Manage end-to-end BI or data warehouse projects.

·        Must have experience in running Agile-based project development.


Technical skills


·        Experience in Business Intelligence, Data Warehousing, or Analytics projects.

·        Understanding of data lake and data warehouse solutions, including ETL pipelines.

·        Good to have: knowledge of Azure Blob Storage, Azure Data Factory, and Synapse Analytics.

·        Good to have: knowledge of Qlik Sense or Power BI.

·        Good to have: certification in PMP, PRINCE2, or Agile project management.

·        Excellent written and verbal communication skills.


 Education:

MBA, B.E., B.Tech., or MCA degree

Synergetic IT Services India Pvt Ltd

Agency job
via Milestone Hr Consultancy by Haina khan
Mumbai
2 - 5 yrs
₹2L - ₹8L / yr
Data Warehouse (DWH)
Informatica
ETL
Microsoft Windows Azure
Big Data
+1 more
1. Responsible for the evaluation of cloud strategy and program architecture
2. Responsible for gathering system requirements, working together with application architects and owners
3. Responsible for generating scripts and templates required for the automatic provisioning of resources
4. Discover standard cloud service offerings; install and execute processes and standards for optimal use of cloud service provider offerings
5. Incident management on IaaS, PaaS, and SaaS
6. Responsible for debugging technical issues inside a complex stack involving virtualization, containers, microservices, etc.
7. Collaborate with the engineering teams to enable their applications to run on cloud infrastructure
8. Experience with OpenStack, Linux, Amazon Web Services, Microsoft Azure, DevOps, NoSQL, etc. will be a plus
9. Design, implement, configure, and maintain various Azure IaaS, PaaS, and SaaS services
10. Deploy and maintain Azure IaaS Virtual Machines and Azure Application and Networking Services
11. Optimize Azure billing for cost/performance (VM optimization, reserved instances, etc.)
12. Implement and fully document IT projects
13. Identify improvements to IT documentation, network architecture, processes/procedures, and tickets
14. Research products and new technologies to increase the efficiency of business and operations
15. Keep all tickets and projects updated and track time in a detailed format
16. Should be able to multi-task and work across a range of projects and issues with various timelines and priorities
Technical:
• Minimum 1 year of experience with Azure; knowledge of Office 365 services preferred
• Formal education in IT preferred
• Experience with the Managed Service business model a major plus
• Bachelor’s degree preferred
Accion Labs

14 recruiters
Posted by Anjali Mohandas
Remote, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai
4 - 8 yrs
₹15L - ₹28L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+5 more

4-6 years of total experience in data warehousing and business intelligence

3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)

2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)

Strong experience building visually appealing UI/UX in Power BI

Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)

Experience building Power BI using large data in direct query mode

Expert SQL background (query building, stored procedure, optimizing performance)

A global business process management company

Agency job
via Jobdost by Saida Jabbar
Gurugram, Pune, Mumbai, Bengaluru (Bangalore), Chennai, Nashik
4 - 12 yrs
₹12L - ₹15L / yr
Data engineering
Data modeling
data pipeline
Data integration
Data Warehouse (DWH)
+12 more

 

 

Designation – Deputy Manager - TS


Job Description

  1. Total of 8-9 years of development experience in Data Engineering (B1/BII role)
  2. Minimum of 4-5 years in AWS data integrations, with very good data modelling skills
  3. Should be very proficient in end-to-end AWS data solution design, including strong data ingestion and integration skills (both data at rest and data in motion) as well as complete DevOps knowledge
  4. Should have experience delivering at least 4 Data Warehouse or Data Lake solutions on AWS
  5. Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc.
  6. Strong Python skills
  7. Should be an expert in cloud design principles, performance tuning, and cost modelling; AWS certifications will be an added advantage
  8. Should be a team player with excellent communication, able to manage work independently with minimal or no supervision
  9. Life Science & Healthcare domain background will be a plus
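To make the Lambda/Glue pipeline skills above concrete, here is a hedged sketch of a minimal AWS Lambda handler for an S3-triggered ingestion step. The event shape mirrors the documented S3 event notification structure, but the bucket, key, and processing logic are illustrative only; real code would start a Glue job or write to a staging table where the comment indicates.

```python
import json

def handler(event, context):
    """Collect the S3 objects referenced in an event and acknowledge them."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Real code would trigger a Glue job run or load a staging table here.
        processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}

# Hypothetical S3 put-event payload for a raw-zone sales file.
event = {"Records": [{"s3": {"bucket": {"name": "raw-zone"},
                             "object": {"key": "sales/2024/01.csv"}}}]}
print(handler(event, None)["statusCode"])  # 200
```

Because a handler is just a function taking an event dict, it can be unit-tested locally like this before any AWS deployment.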

Qualifications

BE/B.Tech/ME/M.Tech

 

Blenheim Chalcot IT Services India Pvt Ltd

Agency job
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, from text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration to ensure the solution meets customer requirements. You will work closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:
• Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics)
• Demonstrated expertise in data modelling and data warehouse methodologies and best practices
• Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools
• Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP/etc.)
• Azure DevOps knowledge, essential for CI/CD of data ingestion pipelines and integrations
• Experience with object-oriented/object-function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required
• Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must
• Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage
• Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus
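For the structured-versus-flat feed integration skill above, a minimal sketch of normalizing a JSON feed and a CSV feed into one common row shape. The field names are invented; in practice a step like this might live in an ADF custom activity or a Databricks notebook rather than a standalone script.

```python
import csv
import io
import json

def from_json(payload):
    # Structured feed: parse JSON and coerce values to the target types.
    return [{"id": r["id"], "value": float(r["value"])}
            for r in json.loads(payload)]

def from_csv(payload):
    # Flat feed: same target shape, read via DictReader over the header row.
    reader = csv.DictReader(io.StringIO(payload))
    return [{"id": row["id"], "value": float(row["value"])} for row in reader]

# Hypothetical feeds delivered over, say, API (JSON) and SFTP (CSV).
json_feed = '[{"id": "a", "value": "1.5"}]'
csv_feed = "id,value\nb,2.5\n"
rows = from_json(json_feed) + from_csv(csv_feed)
print(len(rows))  # 2
```

Converging every source onto one canonical row shape early is what lets the downstream pipeline stay format-agnostic.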
Essential Experience:
• 5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack
• Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential
• Microsoft Azure and Data certifications, at least at fundamentals level, are a must
• Experience using agile development methodologies, version control systems and repositories is a must
• A good, applied understanding of the end-to-end data process development life cycle
• A good working knowledge of data warehouse methodology using Azure SQL
• A good working knowledge of the Azure platform and its components, and the ability to leverage its resources to implement solutions, is a must
• Experience working in the public sector, or in an organisation servicing the public sector, is a must
• Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an environment undergoing a programme of transformational change
• The ability to contribute and adhere to standards, with excellent attention to detail and a strong drive for quality
Desirables:
• Experience with AWS or Google Cloud platforms will be an added advantage
• Experience with Azure ML services will be an added advantage

Personal Attributes:
• Articulate and clear in communications to mixed audiences, in writing, through presentations and one-to-one
• Ability to present highly technical concepts and ideas in business-friendly language
• Ability to effectively prioritise and execute tasks in a high-pressure environment
• Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
• Extensive experience working in a team-oriented, collaborative environment as well as working independently
• Comfortable with the multi-project, multi-tasking consulting Data Architect lifestyle
• Excellent interpersonal skills with teams and in building trust with clients
• Ability to support and work with cross-functional teams in a dynamic environment
• A passion for achieving business transformation; the ability to energise and excite those you work with
• Initiative; the ability to work flexibly in a team, working comfortably without direct supervision
Paysense Services India Pvt Ltd
Posted by Pragya Singh
NCR (Delhi | Gurgaon | Noida), Mumbai
2 - 7 yrs
₹10L - ₹30L / yr
Python
Data Analytics
Hadoop
Data Warehouse (DWH)
Machine Learning (ML)
About the job:
- You will work with data scientists to architect, code and deploy ML models
- You will solve problems of storing and analyzing large-scale data in milliseconds; architect and develop data processing and warehouse systems
- You will code, drink, breathe and live Python, sklearn and pandas. It’s good to have experience in these but not a necessity - as long as you’re super comfortable in a language of your choice
- You will develop tools and products that provide analysts ready access to the data

About you:
- Strong CS fundamentals
- You have strong experience working with production environments
- You write code that is clean, readable and tested
- Instead of doing it a second time, you automate it
- You have worked with some of the commonly used databases and computing frameworks (Psql, S3, Hadoop, Hive, Presto, Spark, etc.)
- It will be great if you have one of the following to share: a Kaggle or a GitHub profile
- You are an expert in one or more programming languages (Python preferred). It is also good to have experience with Python-based application development and data science libraries
- Ideally, you have 2+ years of experience in tech and/or data
- Degree in CS/Maths from Tier-1 institutes