About The Role
- As a Data Platform Lead, you will utilize your strong technical background and hands-on development skills to design, develop, and maintain data platforms.
- Leading a team of skilled data engineers, you will create scalable and robust data solutions that enhance business intelligence and decision-making.
- You will ensure the reliability, efficiency, and scalability of data systems while mentoring your team to achieve excellence.
- Collaborating closely with our client’s CXO-level stakeholders, you will oversee pre-sales activities, solution architecture, and project execution.
- Your ability to stay ahead of industry trends and integrate the latest technologies will be crucial in maintaining our competitive edge.
Key Responsibilities
- Client-Centric Approach: Understand client requirements deeply and translate them into robust technical specifications, ensuring solutions meet their business needs.
- Architect for Success: Design scalable, reliable, and high-performance systems that exceed client expectations and drive business success.
- Lead with Innovation: Provide technical guidance, support, and mentorship to the development team, driving the adoption of cutting-edge technologies and best practices.
- Champion Best Practices: Ensure excellence in software development and IT service delivery, constantly assessing and evaluating new technologies, tools, and platforms for project suitability.
- Be the Go-To Expert: Serve as the primary point of contact for clients throughout the project lifecycle, ensuring clear communication and high levels of satisfaction.
- Build Strong Relationships: Cultivate and manage relationships with CxO/VP level stakeholders, positioning yourself as a trusted advisor.
- Deliver Excellence: Manage end-to-end delivery of multiple projects, ensuring timely and high-quality outcomes that align with business goals.
- Report with Clarity: Prepare and present regular project status reports to stakeholders, ensuring transparency and alignment.
- Collaborate Seamlessly: Coordinate with cross-functional teams to ensure smooth and efficient project execution, breaking down silos and fostering collaboration.
- Grow the Team: Provide timely and constructive feedback to support the professional growth of team members, creating a high-performance culture.
Qualifications
- Master’s degree (M.Tech., M.S.) in Computer Science or an equivalent discipline; graduates of reputed institutes such as IIT or NIT are preferred
- 6–8 years of overall experience, with a minimum of 2 years of relevant experience and a strong technical background
- Experience working in a mid-size IT services company is preferred
Preferred Certification
- AWS Certified Data Analytics – Specialty
- AWS Certified Solutions Architect – Professional
- Azure Data Engineer Associate + Azure Solutions Architect Expert
- Databricks Certified Data Engineer / ML Professional
Technical Expertise
- Advanced knowledge of distributed architectures and data modeling practices.
- Extensive experience with Data Lakehouse systems like Databricks and data warehousing solutions such as Redshift and Snowflake.
- Hands-on experience with data technologies such as Apache Spark, SQL, Airflow, Kafka, Jenkins, Hadoop, Flink, Hive, Pig, HBase, Presto, and Cassandra.
- Knowledge of BI tools such as Power BI, Tableau, and QuickSight, as well as open-source equivalents like Superset and Metabase, is good to have.
- Strong knowledge of open table formats including Apache Iceberg, Apache Hudi, and Delta Lake (see the Delta Lake sketch after this list).
- Proficient programming skills in Python, Scala, Go, or Java.
- Ability to architect end-to-end solutions from data ingestion to insights, including designing data integrations using ETL and other data integration patterns.
- Experience working with multi-cloud environments, particularly AWS and Azure.
- Excellent teamwork and communication skills, with the ability to thrive in a fast-paced, agile environment.
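
To make the table-format expectation above concrete, the following is a minimal PySpark sketch of writing and reading a Delta Lake table. It is an illustration only: the paths, columns, and package coordinates are assumptions, and the Spark session must be started with the Delta Lake package and its session extensions configured.

```python
from pyspark.sql import SparkSession

# Assumes the Delta Lake package is on the classpath, e.g. launched with
#   pyspark --packages io.delta:delta-spark_2.12:<version> \
#     --conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
#     --conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog
# (exact coordinates depend on the Spark and Delta versions in use)
spark = SparkSession.builder.appName("delta_table_demo").getOrCreate()

# Illustrative orders dataset
orders = spark.createDataFrame(
    [(1, "2024-01-01", 120.0), (2, "2024-01-02", 75.5)],
    ["order_id", "order_date", "amount"],
)

# Write as a Delta table partitioned by date (path is hypothetical)
orders.write.format("delta").mode("overwrite").partitionBy("order_date").save(
    "/tmp/lakehouse/orders"
)

# Read it back; Delta adds ACID transactions and time travel on top of Parquet files
spark.read.format("delta").load("/tmp/lakehouse/orders").show()
```

The same write pattern applies to Iceberg or Hudi with their respective catalogs and packages; only the format name and session configuration change.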
About Oneture Technologies
Oneture Technologies is a cloud-first digital engineering company helping enterprises and high-growth startups build modern, scalable, and data-driven solutions. Our teams work on cutting-edge big data, cloud, analytics, and platform engineering engagements where ownership, innovation, and continuous learning are core values.
Role Overview
We are looking for an experienced Data Engineer with 2-4 years of hands-on experience in building scalable data pipelines and processing large datasets. The ideal candidate must have strong expertise in PySpark and exposure to real-time or streaming frameworks such as Apache Flink. You will work closely with architects, data scientists, and product teams to design and deliver robust, high-performance data solutions.
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT data pipelines using PySpark (see the batch ETL sketch after this list)
- Implement real-time or near real-time data processing using Apache Flink
- Optimize data workflows for performance, scalability, and reliability
- Work with large-scale data platforms and distributed environments
- Collaborate with cross-functional teams to integrate data solutions into products and analytics platforms
- Ensure data quality, integrity, and governance across pipelines
- Conduct performance tuning, debugging, and root-cause analysis of data processes
- Write clean, modular, and well-documented code following best engineering practices
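
As a rough illustration of the batch pipelines described in the responsibilities above, here is a minimal PySpark ETL sketch, not a prescribed implementation; the bucket paths, column names, and aggregation are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_etl").getOrCreate()

# Extract: read raw JSON events (path is illustrative)
raw = spark.read.json("s3://raw-bucket/events/")

# Transform: basic cleanup with the DataFrame API
cleaned = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Aggregate via Spark SQL
cleaned.createOrReplaceTempView("events")
daily_counts = spark.sql(
    """
    SELECT event_date, event_type, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, event_type
    """
)

# Load: write partitioned Parquet to the curated zone
(daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://curated-bucket/daily_event_counts/"))

spark.stop()
```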
Primary Skills
- Strong hands-on experience in PySpark (RDD, DataFrame API, Spark SQL)
- Experience with Apache Flink, Spark or Kafka (streaming or batch)
- Solid understanding of distributed computing concepts
- Proficiency in Python for data engineering workflows
- Strong SQL skills for data manipulation and transformation
- Experience with data pipeline orchestration tools such as Airflow or Step Functions (see the Airflow sketch after this list)
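
For the orchestration item above, a minimal Airflow DAG sketch is shown below, assuming Airflow 2.x; the DAG id, owner, schedule, and spark-submit command are illustrative assumptions, not a prescribed setup.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",  # hypothetical owner
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_events_etl",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # use schedule_interval on Airflow < 2.4
    catchup=False,
    default_args=default_args,
) as dag:
    # Submit the PySpark batch job sketched earlier; the script path is illustrative
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit --master yarn etl/events_etl.py --run-date {{ ds }}",
    )
```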
Secondary Skills
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of data lakes, lakehouse architectures, and modern data stack tools
- Familiarity with Delta Lake, Iceberg, or Hudi
- Experience with CI/CD pipelines for data workflows
- Understanding of messaging and streaming systems such as Kafka and Kinesis (see the streaming sketch after this list)
- Knowledge of DevOps and containerization tools (Docker)
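
To illustrate the streaming side mentioned above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic; the broker address, topic name, and checkpoint path are hypothetical, and the spark-sql-kafka connector package must be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes the Kafka connector is available, e.g. launched with
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark-version> ...
spark = SparkSession.builder.appName("clickstream_streaming").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (broker and topic are illustrative)
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka values arrive as bytes; cast to string and count events per one-minute window
counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
    .withWatermark("timestamp", "5 minutes")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

# Write the running counts to the console; a real pipeline would target a sink
# such as a Delta table or another Kafka topic
query = (
    counts.writeStream.outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/clickstream")
    .start()
)
query.awaitTermination()
```

A Flink job would follow the same shape (source, windowed aggregation, sink); the Spark variant is shown here only because it keeps all sketches in one framework.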
Soft Skills
- Strong analytical and problem-solving capabilities
- Ability to work independently and as part of a collaborative team
- Good communication and documentation skills
- Ownership mindset with a willingness to learn and adapt
Education
- Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field
Why Join Oneture Technologies?
- Opportunity to work on high-impact, cloud-native data engineering projects
- Collaborative team environment with a strong learning culture
- Exposure to modern data platforms, scalable architectures, and real-time data systems
- Growth-oriented role with hands-on ownership across end-to-end data engineering initiatives

