9+ Data Warehouse (DWH) Jobs in Mumbai
Experience: 12-15 years, with at least 7 years in Big Data, Cloud, and Analytics.
Key Responsibilities:
- Technical Project Management:
  - Lead the end-to-end technical delivery of multiple projects in Big Data, Cloud, and Analytics. Lead teams in technical solutioning, design, and development.
  - Develop detailed project plans, timelines, and budgets, ensuring alignment with client expectations and business goals.
  - Monitor project progress, manage risks, and implement corrective actions as needed to ensure timely and quality delivery.
- Client Engagement and Stakeholder Management:
  - Build and maintain strong client relationships, acting as the primary point of contact for project delivery.
  - Understand client requirements, anticipate challenges, and provide proactive solutions.
  - Coordinate with internal and external stakeholders to ensure seamless project execution.
  - Communicate project status, risks, and issues to senior management and stakeholders in a clear and timely manner.
- Team Leadership:
  - Lead and mentor a team of data engineers, analysts, and project managers.
  - Ensure effective resource allocation and utilization across projects.
  - Foster a culture of collaboration, continuous improvement, and innovation within the team.
- Technical and Delivery Excellence:
  - Leverage data management expertise and experience to lead technical conversations effectively. Identify areas where the team needs technical support and work to resolve them, either through your own expertise or by engaging internal and external stakeholders to unblock the team.
  - Implement best practices in project management, delivery, and quality assurance.
  - Drive continuous improvement initiatives to enhance delivery efficiency and client satisfaction.
  - Stay updated on the latest trends and advancements in Big Data, Cloud, and Analytics technologies.
Requirements:
- Experience in IT delivery management, particularly in Big Data, Cloud, and Analytics.
- Strong knowledge of project management methodologies and tools (e.g., Agile, Scrum, PMP).
- Excellent leadership, communication, and stakeholder management skills.
- Proven ability to manage large, complex projects with multiple stakeholders.
- Strong critical thinking skills and the ability to make decisions under pressure.
Preferred Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Relevant certifications in Big Data, cloud platforms (e.g., GCP, Azure, AWS, Snowflake, Databricks), project management, or similar areas are preferred.
1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements, and deliver data-driven recommendations and reports to executives and stakeholders.
2. Ability to search, extract, transform, and load data from various databases, and to cleanse and refine data until it is fit for purpose.
3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met.
4. Prioritization of issues to meet deadlines while ensuring high-quality delivery.
5. Ability to pull data and perform ad hoc reporting and analysis as needed.
6. Ability to adapt quickly to new and changing technical environments, along with strong analytical and problem-solving abilities.
7. Strong interpersonal and presentation skills.
SKILLS:
1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel.
2. Experience working with senior decision-makers.
3. Strong advanced SQL/MySQL and Python skills, with the ability to fetch data from the Data Warehouse per stakeholders' requirements.
4. Good knowledge of and experience in Excel VBA and advanced Excel.
5. Good experience building analytical Tableau dashboards per stakeholders' reporting requirements.
6. Strong communication/interpersonal skills.
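The SQL-plus-Python skill in point 3 above is, in essence, running a parameterized query against the warehouse from Python code. A minimal sketch, using an in-memory SQLite database as a stand-in for the warehouse (the table and column names are illustrative, not from any posting):

```python
import sqlite3

# Stand-in for a warehouse connection; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "West", 120.0), (2, "East", 80.0), (3, "West", 50.0)])

def fetch_sales_by_region(conn, region):
    """Parameterized query: placeholders avoid SQL injection and keep the
    statement reusable across different stakeholder requests."""
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM orders WHERE region = ? GROUP BY region",
        (region,),
    )
    return cur.fetchall()

print(fetch_sales_by_region(conn, "West"))  # [('West', 170.0)]
```

In practice only the connection object changes: a real warehouse driver (e.g., for Snowflake or BigQuery) exposes the same DB-API pattern of `execute` with bound parameters.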
PERSONA:
1. Experience working on ad hoc requirements.
2. Ability to adapt to shifting priorities.
3. Experience in the fintech or e-commerce industry is preferable.
4. Engineering background; 2+ years of experience as a Business Analyst for finance processes.
DATA ENGINEER
Overview
They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.
Job Description:
We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.
Responsibilities:
Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.
Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
Optimize and tune the performance of data systems to ensure efficient data processing and analysis.
Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.
Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.
Implement and maintain data governance and security measures to protect sensitive data.
Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.
Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
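The pipeline responsibilities above (extract, transform with data-quality checks, load) can be sketched as three small stages. This is a minimal illustration, not any company's actual pipeline; all names and the quality rules are assumptions:

```python
from datetime import date

def extract():
    # In practice this stage would read from an API, file drop, or source DB.
    return [
        {"user_id": 1, "signup": "2024-01-05", "country": "in"},
        {"user_id": 2, "signup": "2024-01-06", "country": None},  # missing value
        {"user_id": 1, "signup": "2024-01-05", "country": "in"},  # duplicate
    ]

def transform(rows):
    """Cleanse: drop duplicates and rows failing quality checks, normalise types."""
    seen, clean = set(), []
    for r in rows:
        key = (r["user_id"], r["signup"])
        if r["country"] is None or key in seen:
            continue  # quality rules: country is required; keep first occurrence only
        seen.add(key)
        clean.append({**r,
                      "signup": date.fromisoformat(r["signup"]),
                      "country": r["country"].upper()})
    return clean

def load(rows, target):
    target.extend(rows)  # stand-in for an INSERT into the warehouse

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # one clean, typed row survives for user_id 1
```

Real pipelines swap each stage for a connector (Kafka consumer, S3 reader, JDBC writer), but the shape, and the placement of quality checks inside `transform`, stays the same.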
Qualifications:
Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.
Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.
Strong programming skills in languages such as Python, Java, or Scala.
Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).
Solid understanding of data modeling, data warehousing, and ETL principles.
Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).
Strong problem-solving and analytical skills, with the ability to handle complex data challenges.
Excellent communication and collaboration skills to work effectively in a team environment.
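The data modeling principle mentioned in the qualifications usually means dimensional (star-schema) design: narrow fact tables holding measures and foreign keys, joined to descriptive dimension tables. A toy sketch with plain Python structures (all table contents are invented for illustration):

```python
# Dimension table: one row per product, holding descriptive attributes.
dim_product = {101: {"name": "Serum", "category": "Skincare"},
               102: {"name": "Cleanser", "category": "Skincare"}}

# Fact table: one row per sale event, holding foreign keys and measures.
fact_sales = [{"product_id": 101, "qty": 2, "amount": 40.0},
              {"product_id": 102, "qty": 1, "amount": 15.0},
              {"product_id": 101, "qty": 1, "amount": 20.0}]

def revenue_by_category(facts, dim):
    """Join facts to a dimension and aggregate a measure, the core
    query pattern a star schema is designed to make cheap."""
    totals = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals

print(revenue_by_category(fact_sales, dim_product))  # {'Skincare': 75.0}
```

In a warehouse the same join-and-aggregate is a `GROUP BY` over fact and dimension tables; the sketch only shows why the two table roles are kept separate.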
Preferred Qualifications:
Advanced knowledge of distributed computing and parallel processing.
Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).
Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).
Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
Experience with data visualization and reporting tools (e.g., Tableau, Power BI).
Certification in relevant technologies or data engineering disciplines.
Role: Project Manager
Experience: 8-10 Years
Location: Mumbai
Company Profile:
Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are rapidly expanding across machine learning, Data Engineering and Analytics functions. Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.
One of the top partners of Databricks, Azure, Cloudera (a leading analytics player), and Qlik (a leader in BI technologies), Exponentia.ai has recently been awarded the ‘Innovation Partner Award’ by Qlik and the "Excellence in Business Process Automation Award" (IMEA) by Automation Anywhere.
Get to know more about us at http://www.exponentia.ai and https://in.linkedin.com/company/exponentiaai
Role Overview:
· The Project Manager is responsible for the successful delivery of a range of projects in Business Intelligence, Data Warehousing, and Analytics/AI-ML.
· The Project Manager is expected to manage projects and lead teams of BI engineers, data engineers, data scientists, and application developers.
Job Responsibilities:
· Effort estimation, creating the project plan, planning milestones and activities, and tracking progress.
· Identify risks and issues and prepare a mitigation plan.
· Status reporting to both internal and external stakeholders.
· Communicate with all stakeholders.
· Manage end-to-end project lifecycle - requirements gathering, design, development, testing and go-live.
· Manage end-to-end BI or data warehouse projects.
· Must have experience in running Agile-based project development.
Technical skills
· Experience in Business Intelligence, Data Warehousing, or Analytics projects.
· Understanding of data lake and data warehouse solutions, including ETL pipelines.
· Good to have: knowledge of Azure Blob Storage, Azure Data Factory, and Synapse Analytics.
· Good to have: knowledge of Qlik Sense or Power BI.
· Good to have: certification in PMP, PRINCE2, or Agile project management.
· Excellent written and verbal communication skills.
Education:
MBA, B.E., B.Tech., or MCA degree
Synergetic IT Services India Pvt Ltd
2. Responsible for gathering system requirements, working together with application architects and owners.
3. Responsible for generating scripts and templates required for the automatic provisioning of resources.
4. Discover standard cloud service offerings; install and execute processes and standards for optimal use of cloud service provider offerings.
5. Incident management on IaaS, PaaS, and SaaS.
6. Responsible for debugging technical issues inside a complex stack involving virtualization, containers, microservices, etc.
7. Collaborate with the engineering teams to enable their applications to run on cloud infrastructure.
8. Experience with OpenStack, Linux, Amazon Web Services, Microsoft Azure, DevOps, NoSQL, etc. will be a plus.
9. Design, implement, configure, and maintain various Azure IaaS, PaaS, and SaaS services.
10. Deploy and maintain Azure IaaS Virtual Machines and Azure Application and Networking Services.
11. Optimize Azure billing for cost/performance (VM optimization, reserved instances, etc.).
12. Implement and fully document IT projects.
13. Identify improvements to IT documentation, network architecture, processes/procedures, and tickets.
14. Research products and new technologies to increase the efficiency of business and operations.
15. Keep all tickets and projects updated and track time in a detailed format.
16. Should be able to multi-task and work across a range of projects and issues with various timelines and priorities.
Technical:
• Minimum 1 year of experience with Azure; knowledge of Office 365 services preferred.
• Formal education in IT preferred.
• Experience with the Managed Services business model is a major plus.
• Bachelor’s degree preferred.
4-6 years of total experience in data warehousing and business intelligence
3+ years of solid Power BI experience (Power Query, M-Query, DAX, Aggregates)
2 years’ experience building Power BI using cloud data (Snowflake, Azure Synapse, SQL DB, data lake)
Strong experience building visually appealing UI/UX in Power BI
Understand how to design Power BI solutions for performance (composite models, incremental refresh, analysis services)
Experience building Power BI using large data in direct query mode
Expert SQL background (query building, stored procedures, performance optimization)
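Incremental refresh, one of the Power BI performance techniques listed above, boils down to a watermark pattern: each run loads only rows newer than the last successfully loaded timestamp. A minimal sketch of that pattern, with invented data and names:

```python
from datetime import datetime

# Illustrative source rows with a last-updated timestamp per row.
source = [
    {"id": 1, "updated": datetime(2024, 1, 1)},
    {"id": 2, "updated": datetime(2024, 1, 3)},
    {"id": 3, "updated": datetime(2024, 1, 5)},
]

def incremental_load(source_rows, target, watermark):
    """Append only rows newer than the watermark; return the advanced watermark."""
    fresh = [r for r in source_rows if r["updated"] > watermark]
    target.extend(fresh)
    return max((r["updated"] for r in fresh), default=watermark)

target = []
wm = incremental_load(source, target, datetime(2024, 1, 2))
# Only ids 2 and 3 are loaded; the watermark advances to 2024-01-05.
```

Power BI implements the same idea declaratively (a refresh window over a date column) rather than in code, but the watermark logic is what makes refreshes of large direct-query or composite models affordable.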
A global business process management company
Designation – Deputy Manager - TS
Job Description
- Total of 8-9 years of development experience in Data Engineering. B1/BII role.
- Minimum of 4-5 years in AWS data integrations, with very good data modelling skills.
- Should be very proficient in end-to-end AWS data solution design, covering not only strong data ingestion and integration skills (both data at rest and data in motion) but also complete DevOps knowledge.
- Should have experience delivering at least 4 Data Warehouse or Data Lake solutions on AWS.
- Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc.
- Strong Python skills.
- Should be an expert in cloud design principles, performance tuning, and cost modelling. AWS certifications will be an added advantage.
- Should be a team player with excellent communication skills, able to manage work independently with minimal or no supervision.
- A Life Science & Healthcare domain background will be a plus.
Qualifications
B.E./B.Tech./M.E./M.Tech.
Blenheim Chalcot IT Services India Pvt Ltd
You will build Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video, and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), and Azure Data Catalog and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation, and documentation of the technical aspects, including integration, to ensure the solution meets customer requirements. You will work closely with fellow architects, engineers, analysts, team leads, and project managers to plan, build, and roll out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics).
Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g., XML/JSON) and flat schemas (e.g., CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API, SFTP, etc.).
Azure DevOps knowledge is essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/object-function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g., HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus.
Essential Experience:
5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data certifications, at least at the fundamentals level, are a must.
Experience using agile development methodologies, version control systems, and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
Experience working in the public sector, or in an organisation servicing the public sector, is a must.
Ability to work to demanding deadlines, keep momentum, and deal with conflicting priorities in an environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail, and be strongly driven by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences, in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment.
Extensive experience working in a team-oriented, collaborative environment as well as working independently.
Comfortable with the multi-project, multi-tasking consulting Data Architect lifestyle.
Excellent interpersonal skills for working with teams and building trust with clients.
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with.
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.