ETL management jobs

7+ ETL Management Jobs in India

Apply to 7+ ETL management jobs on CutShort.io. Find your next job, effortlessly. Browse ETL management jobs and apply today!

MyOperator - VoiceTree Technologies

Posted by Vijay Muthu
Remote only
3.5 - 5 yrs
₹14L - ₹20L / yr
Python
Django
MySQL
PostgreSQL
FastAPI
+22 more

About Us:

MyOperator is a Business AI Operator and a category leader that unifies WhatsApp, Calls, and AI-powered chat & voice bots into one intelligent business communication platform.

Unlike fragmented communication tools, MyOperator combines automation, intelligence, and workflow integration to help businesses run WhatsApp campaigns, manage calls, deploy AI chatbots, and track performance — all from a single, no-code platform. Trusted by 12,000+ brands including Amazon, Domino’s, Apollo, and Razorpay, MyOperator enables faster responses, higher resolution rates, and scalable customer engagement — without fragmented tools or increased headcount.


Role Overview:

We’re seeking a passionate Python Developer with strong experience in backend development and cloud infrastructure. This role involves building scalable microservices, integrating AI tools like LangChain/LLMs, and optimizing backend performance for high-growth B2B products.


Key Responsibilities:

  • Develop robust backend services using Python, Django, and FastAPI
  • Design and maintain a scalable microservices architecture
  • Integrate LangChain/LLMs into AI-powered features
  • Write clean, tested, and maintainable code with pytest (see the sketch after this list)
  • Manage and optimize databases (MySQL/Postgres)
  • Deploy and monitor services on AWS
  • Collaborate across teams to define APIs, data flows, and system architecture
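
For illustration, a minimal, hedged sketch of a FastAPI service with a pytest test, combining two of the tools named above; the endpoint and names are hypothetical, not MyOperator's actual code:

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/health")
def health() -> dict:
    # Simple liveness endpoint, as a microservice would typically expose.
    return {"status": "ok"}

client = TestClient(app)

def test_health():
    # pytest collects test_* functions; run with `pytest this_file.py`.
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```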

Must-Have Skills:

  • Python and Django
  • MySQL or Postgres
  • Microservices architecture
  • AWS (EC2, RDS, Lambda, etc.)
  • Unit testing using pytest
  • LangChain or Large Language Models (LLM)
  • Strong grasp of Data Structures & Algorithms
  • AI coding assistant tools (e.g., ChatGPT and Gemini)

Good to Have:

  • MongoDB or Elasticsearch
  • Go or PHP
  • FastAPI
  • React, Bootstrap (basic frontend support)
  • ETL pipelines, Jenkins, Terraform

Why Join Us?

  • 100% Remote role with a collaborative team
  • Work on AI-first, high-scale SaaS products
  • Drive real impact in a fast-growing tech company
  • Ownership and growth from day one


Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Kochi (Cochin), Trivandrum
4 - 6 yrs
₹11L - ₹17L / yr
Amazon Web Services (AWS)
Python
Data engineering
SQL
ETL
+22 more

JOB DETAILS:

* Job Title: Associate III - Data Engineering

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 4-6 years

* Location: Trivandrum, Kochi

Job Description

Job Title: Data Services Engineer – AWS & Snowflake

 

Job Summary:

As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.

You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.

Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.

 

Responsibilities:

• Design and implement scalable and secure data pipelines on AWS and Snowflake (star/snowflake schema).

• Optimize query performance using clustering keys, materialized views, and caching.

• Develop and maintain Snowflake data warehouses and data marts.

• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks); see the sketch after this list.

• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica).

• Utilize Snowpark and Python/Java for complex transformations.

• Implement RBAC, data masking, and row-level security.

• Optimize data storage and retrieval for performance and cost-efficiency.

• Collaborate with stakeholders to gather data requirements and deliver solutions.

• Ensure data quality, governance, and compliance with industry standards.

• Monitor, troubleshoot, and resolve data pipeline and performance issues.

• Document data architecture, processes, and best practices.

• Support data migration and integration from various sources.
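
As referenced above, a hedged sketch of the Streams/Tasks pattern for incremental ELT, issued through the snowflake-connector-python client; all table, task, and connection names are placeholders, not values from this posting:

```python
import snowflake.connector

# Placeholder connection parameters, not real credentials.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

with conn.cursor() as cur:
    # A stream captures row-level changes on the landing table.
    cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")
    # A task drains the stream every 5 minutes, but only when it has data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS merge_orders
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS
          INSERT INTO curated_orders (order_id, amount, order_date)
          SELECT order_id, amount, order_date
          FROM orders_stream
          WHERE METADATA$ACTION = 'INSERT'
    """)
    # Tasks are created suspended; resume to start the schedule.
    cur.execute("ALTER TASK merge_orders RESUME")
conn.close()
```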

 

Qualifications:

• Bachelor’s degree in Computer Science, Information Technology, or a related field.

• 3 to 4 years of hands-on experience in data engineering or data services.

• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).

• Strong expertise in Snowflake architecture, development, and optimization.

• Proficiency in SQL and Python for data manipulation and scripting.

• Solid understanding of ETL/ELT processes and data modeling.

• Experience with data integration tools and orchestration frameworks.

• Excellent analytical, problem-solving, and communication skills.

 

Preferred Skills:

• AWS Glue, AWS Lambda, Amazon Redshift

• Snowflake Data Warehouse

• SQL & Python

 

Skills: AWS Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse

 

Must-Haves

AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)

Skills: AWS, AWS Lambda, Snowflake, Data engineering, Snowpipe, Data integration tools, orchestration frameworks

Relevant experience: 4–6 years

Python is mandatory.

 


Notice period - 0 to 15 days only (Feb joiners’ profiles only)

Location: Kochi

F2F interview on 7th Feb

 

 

MindCrew Technologies
Agency job
Pune
8 - 12 yrs
₹10L - ₹15L / yr
Data engineering
Data modeling
Snowflake schema
ETL
ETL architecture
+3 more

Job Title: Lead Data Engineer

📍 Location: Pune

🧾 Experience: 10+ Years

💰 Budget: Up to 1.7 LPM


Responsibilities

  • Collaborate with Data & ETL teams to review, optimize, and scale data architectures within Snowflake.
  • Design, develop, and maintain efficient ETL/ELT pipelines and robust data models.
  • Optimize SQL queries for performance and cost efficiency (see the sketch after this list).
  • Ensure data quality, reliability, and security across pipelines and datasets.
  • Implement Snowflake best practices for performance, scaling, and governance.
  • Participate in code reviews, knowledge sharing, and mentoring within the data engineering team.
  • Support BI and analytics initiatives by enabling high-quality, well-modeled datasets.
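
As referenced in the list, a hedged sketch of two common Snowflake optimizations, adding a clustering key and materializing an expensive aggregate, issued through the Python connector; the table, view, and connection names are hypothetical:

```python
import snowflake.connector

# Placeholder connection parameters.
conn = snowflake.connector.connect(
    account="my_account",
    user="de_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="DWH",
    schema="CORE",
)

with conn.cursor() as cur:
    # Cluster the fact table on the columns most queries filter by,
    # so Snowflake can prune micro-partitions.
    cur.execute("ALTER TABLE fact_sales CLUSTER BY (sale_date, region)")
    # Precompute an expensive aggregate for dashboard queries.
    cur.execute("""
        CREATE MATERIALIZED VIEW IF NOT EXISTS mv_daily_sales AS
        SELECT sale_date, region, SUM(amount) AS total_amount
        FROM fact_sales
        GROUP BY sale_date, region
    """)
conn.close()
```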


Enqubes

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Remote only
7 - 10 yrs
₹10L - ₹15L / yr
SAP BODS
SAP HANA
HANA
ETL management
ETL
+3 more

Job Title: SAP BODS Developer

  • Experience: 7–10 Years
  • Location: Remote (India-based candidates only)
  • Employment Type: Permanent (Full-Time)
  • Salary Range: ₹20 – ₹25 LPA (Fixed CTC)


Required Skills & Experience:

- 7–10 years of hands-on experience as a SAP BODS Developer.

- Strong experience in S/4HANA implementation or upgrade projects with large-scale data migration.

- Proficient in ETL development, job optimization, and performance tuning using SAP BODS.

- Solid understanding of SAP data structures (FI, MM, SD, etc.) from a technical perspective.

- Skilled in SQL scripting, error resolution, and job monitoring.

- Comfortable working independently in a remote, spec-driven development environment.


Metric Vibes

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Noida
4 - 8 yrs
₹10L - ₹15L / yr
Power BI
JavaScript
RESTful APIs
Embedded software
SQL
+9 more

Job Title: Tableau BI Developer

Years of Experience: 4–8 years

$12 per hour FTE engagement

8 working hours per day


Required Skills & Experience:

✅ 4–8 years of experience in BI development and data engineering

✅ Expertise in BigQuery and/or Snowflake for large-scale data processing (see the sketch after this list)

✅ Strong SQL skills with experience writing complex analytical queries

✅ Experience in creating dashboards in tools like Power BI, Looker, or similar

✅ Hands-on experience with ETL/ELT tools and data pipeline orchestration

✅ Familiarity with cloud platforms (GCP, AWS, or Azure)

✅ Strong understanding of data modeling, data warehousing, and analytics best practices

✅ Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
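
As referenced in the list above, a minimal sketch of the kind of analytical query such dashboards are fed from, using the google-cloud-bigquery client library; the project, dataset, and column names are invented for illustration:

```python
from google.cloud import bigquery

# Uses application-default credentials; project/dataset/table are made up.
client = bigquery.Client()

sql = """
    SELECT DATE_TRUNC(order_date, MONTH) AS month,
           SUM(revenue) AS monthly_revenue
    FROM `my_project.sales.orders`
    GROUP BY month
    ORDER BY month
"""

# Run the query and stream the result rows.
for row in client.query(sql).result():
    print(row.month, row.monthly_revenue)
```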

EnterpriseMinds

Posted by Rani Galipalli
Bengaluru (Bangalore), Pune, Mumbai
6 - 8 yrs
₹25L - ₹28L / yr
ETL
Informatica
Data Warehouse (DWH)
ETL management
SQL
+1 more

Your key responsibilities

 

  • Create and maintain optimal data pipeline architecture, with experience building batch/real-time ETL data pipelines; build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources (a generic sketch follows this list)
  • Responsible for solution design, integration, data sourcing, transformation, database design, and implementation of complex data warehousing solutions
  • Responsible for development, support, maintenance, and implementation of a complex project module
  • Provide expertise in the area and advanced knowledge of applications programming; ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for complete reporting solutions
  • Prepare the high-level design (HLD) describing the application architecture
  • Prepare the low-level design (LLD) describing job design, job descriptions, and detailed job information
  • Prepare unit test cases and execute them
  • Provide technical guidance and mentoring to application development teams throughout all phases of the software development life cycle
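
As referenced in the first item above, a generic, hedged sketch of a small batch ETL step in Python (pandas + SQLAlchemy); the connection strings, tables, and columns are invented for illustration:

```python
import pandas as pd
from sqlalchemy import create_engine

# Invented connection strings for a source system and a warehouse.
source = create_engine("postgresql://user:***@source-db/sales")
target = create_engine("postgresql://user:***@warehouse-db/dwh")

# Extract: pull yesterday's orders from the operational store.
df = pd.read_sql(
    "SELECT * FROM orders WHERE order_date = CURRENT_DATE - 1", source
)

# Transform: derive a USD amount and drop rows failing a basic quality check.
df["amount_usd"] = df["amount"] * df["fx_rate"]
df = df.dropna(subset=["customer_id"])

# Load: append into the warehouse staging table.
df.to_sql("stg_orders", target, if_exists="append", index=False)
```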

Skills and attributes for success

 

  • Strong experience in SQL; proficient in writing and debugging complex, performant SQL against large data volumes.
  • Strong experience with Microsoft Azure database systems; experienced in Azure Data Factory.
  • Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
  • Should have sufficient experience with PowerShell scripting
  • Able to guide the team through the development, testing and implementation stages and review the completed work effectively
  • Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
  • Primary owner of delivery and timelines; review code written by other engineers.
  • Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
  • Must have an understanding of business intelligence development in the IT industry
  • Outstanding written and verbal communication skills
  • Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
  • Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
  • Should be able to orchestrate and automate pipelines
  • Good to have: knowledge of distributed systems such as Hadoop, Hive, Spark

 

To qualify for the role, you must have

 

  • Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
  • More than 6 years of experience in ETL development projects
  • Proven experience in delivering effective technical ETL strategies
  • Microsoft Azure project experience
  • Technologies: ETL (ADF), SQL, Azure components (must-have), Python (nice to have)

 

Ideally, you’ll also have

codersbrain

Posted by Tanuj Uppal
Remote only
5 - 8 yrs
₹9L - ₹14L / yr
ETL
Informatica
Data Warehouse (DWH)
API
Microsoft Windows Azure
+2 more
Responsible for the design of solutions using Azure Data Integration Services. This will include:
 
Solution design using Microsoft Azure services and related tools.
Design of enterprise data models and Data Warehouse solutions.
Specification of ETL pipelines, data integration and data migration design.
Design & implementation of Master data management solutions.
Specification of Data Quality Management methodology and supporting technology tools.
Working within a project management/agile delivery methodology in a leading role as part of a wider team.
Deal with other stakeholders/end users in the software development lifecycle – Scrum Masters, Product Owners, Data Architects, and testing teams.
 
Specific Technologies within the Microsoft Azure Cloud stack include:
 
API development and APIM registration of REST/SOAP APIs.
Azure Service Bus messaging and subscribing solutions (a short sketch follows this list).
Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Logic Apps, Azure Functions.
Azure Storage, Azure SQL Data Warehouse/Synapse, Azure Data Lake.
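
For illustration, a hedged sketch of the Service Bus send/receive pattern, assuming the azure-servicebus Python SDK; the connection string and queue name are placeholders:

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "Endpoint=sb://example.servicebus.windows.net/;..."  # placeholder
QUEUE = "orders"  # hypothetical queue name

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    # Publish a message to the queue.
    with client.get_queue_sender(queue_name=QUEUE) as sender:
        sender.send_messages(ServiceBusMessage('{"order_id": 42}'))

    # Receive and settle messages from the same queue.
    with client.get_queue_receiver(queue_name=QUEUE) as receiver:
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            print(str(msg))
            receiver.complete_message(msg)
```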