
7+ ETL Management Jobs in India

Apply to 7+ ETL Management Jobs on CutShort.io. Find your next job, effortlessly. Browse ETL Management Jobs and apply today!

Lightningrowth


Posted by Reagan Ahlquist
Remote only
3 - 6 yrs
$30K - $50K / yr
Python
RESTful APIs
SQL
English Proficiency
Facebook API
+3 more

Marketing Data Engineer (Remote)


Full-Time Contractor

Lightningrowth is a U.S.-based marketing company that specializes in Facebook lead generation for home-remodeling businesses. Although all ads run on Facebook, our clients use many different CRMs — which means we must manage, clean, and sync large volumes of lead data across multiple systems.

We’re hiring a Marketing Data Engineer to maintain and improve the Python scripts and data pipelines that keep everything running smoothly.

This is a remote role ideal for a mid-level engineer with strong Python, API, SQL, and communication skills.


What You’ll Do

  • Maintain and improve Python scripts for:
    • GoHighLevel (GHL) API
    • Facebook/Meta Marketing API
  • Build new API integrations for client CRMs and software tools
  • Extract, clean, and transform data before loading into BigQuery
  • Write and update SQL used for dashboards and reporting
  • Ensure data accuracy and monitor automated pipeline reliability
  • Help optimize automation flows (Make.com or similar)
  • Document your work clearly and communicate updates to the team
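The extract-clean-transform step above can be sketched with pandas. Everything here is illustrative: the field names and cleaning rules are assumptions rather than an actual client schema, and a real pipeline would load the resulting rows into BigQuery (e.g. via the `google-cloud-bigquery` client) instead of printing them:

```python
import pandas as pd

# Hypothetical raw leads as they might arrive from a Facebook lead form
# (field names are illustrative, not an actual client schema).
raw_leads = [
    {"email": " Jane@Example.com ", "phone": "555-0100", "name": "jane doe"},
    {"email": "jane@example.com", "phone": "555-0100", "name": "Jane Doe"},
    {"email": None, "phone": "555-0199", "name": "Bob Roe"},
]

def clean_leads(rows):
    """Normalize, deduplicate, and drop unusable rows before loading."""
    df = pd.DataFrame(rows)
    df["email"] = df["email"].str.strip().str.lower()
    df["name"] = df["name"].str.title()
    df = df.dropna(subset=["email"])           # rows without an email can't sync to a CRM
    df = df.drop_duplicates(subset=["email"])  # keep the first occurrence per email
    return df

cleaned = clean_leads(raw_leads)
print(cleaned.to_dict("records"))
```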

Required Skills

  • Strong Python (requests, pandas, JSON handling)
  • Hands-on experience with REST APIs (auth, pagination, rate limits)
  • Solid SQL skills (BigQuery experience preferred)
  • Experience with ETL / data pipelines
  • Ability to build API integrations from documentation
  • Good spoken and written English communication
  • Comfortable working independently in a fully remote setup
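The REST API skills listed above include handling pagination and rate limits. A minimal sketch of cursor-based pagination with exponential-backoff retries, using a simulated endpoint in place of a real `requests.get` call (the page shape and cursor field are assumptions, not any specific vendor's contract):

```python
import time

# Simulated paginated endpoint: returns (items, next_cursor). A real client
# would call requests.get(url, params={"cursor": cursor}) instead.
PAGES = {None: ([1, 2], "a"), "a": ([3, 4], "b"), "b": ([5], None)}

def fetch_page(cursor):
    return PAGES[cursor]

def fetch_all(fetch, max_retries=3, backoff=0.01):
    """Walk cursor-based pages, retrying transient failures with backoff."""
    items, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                batch, next_cursor = fetch(cursor)
                break
            except Exception:
                if attempt == max_retries - 1:
                    raise
                time.sleep(backoff * 2 ** attempt)  # exponential backoff
        items.extend(batch)
        cursor = next_cursor
        if cursor is None:  # no next cursor -> last page reached
            return items

print(fetch_all(fetch_page))  # -> [1, 2, 3, 4, 5]
```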

Nice to Have

  • Experience with GoHighLevel or CRM APIs
  • Familiarity with:
    • Google BigQuery
    • Google Cloud Functions / Cloud Run
    • Make.com automations
    • Looker Studio dashboards
  • Experience optimizing large datasets or API usage

Experience Level

3–6 years of hands-on data engineering, backend Python work, or API integrations.

Compensation

  • $2,500 – $4,000 USD per month (depending on experience)

How to Apply

Please include:

  • Your resume
  • Links to any Python/API/SQL samples (GitHub, snippets, etc.)
  • A short note on why you’re a good fit

Qualified candidates will complete a short Python + API + SQL test.

MyOperator - VoiceTree Technologies


Posted by Vijay Muthu
Remote only
3 - 5 yrs
₹12L - ₹20L / yr
Python
Django
MySQL
PostgreSQL
Microservices architecture
+26 more

About Us:

MyOperator and Heyo are India’s leading conversational platforms, empowering 40,000+ businesses with Call and WhatsApp-based engagement. We’re a product-led SaaS company scaling rapidly, and we’re looking for a skilled Software Developer to help build the next generation of scalable backend systems.


Role Overview:

We’re seeking a passionate Python Developer with strong experience in backend development and cloud infrastructure. This role involves building scalable microservices, integrating AI tools like LangChain/LLMs, and optimizing backend performance for high-growth B2B products.


Key Responsibilities:

  • Develop robust backend services using Python, Django, and FastAPI
  • Design and maintain a scalable microservices architecture
  • Integrate LangChain/LLMs into AI-powered features
  • Write clean, tested, and maintainable code with pytest
  • Manage and optimize databases (MySQL/Postgres)
  • Deploy and monitor services on AWS
  • Collaborate across teams to define APIs, data flows, and system architecture
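The "clean, tested, maintainable code with pytest" responsibility above can be illustrated with a small sketch: a service-layer function plus a pytest-style test. The business rule (normalizing phone numbers for a WhatsApp flow) is invented for illustration; the testing pattern is the point:

```python
def normalize_msisdn(raw: str, default_cc: str = "91") -> str:
    """Return a digits-only E.164-style number, adding a country code if missing."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) == 10:  # bare national number
        digits = default_cc + digits
    return "+" + digits

def test_normalize_msisdn():
    assert normalize_msisdn("98765 43210") == "+919876543210"
    assert normalize_msisdn("+91-98765-43210") == "+919876543210"

test_normalize_msisdn()  # pytest would discover and run this automatically
print("ok")
```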

Must-Have Skills:

  • Python and Django
  • MySQL or Postgres
  • Microservices architecture
  • AWS (EC2, RDS, Lambda, etc.)
  • Unit testing using pytest
  • LangChain or Large Language Models (LLM)
  • Strong grasp of Data Structures & Algorithms
  • AI coding assistant tools (e.g., ChatGPT, Gemini)

Good to Have:

  • MongoDB or ElasticSearch
  • Go or PHP
  • FastAPI
  • React, Bootstrap (basic frontend support)
  • ETL pipelines, Jenkins, Terraform

Why Join Us?

  • 100% Remote role with a collaborative team
  • Work on AI-first, high-scale SaaS products
  • Drive real impact in a fast-growing tech company
  • Ownership and growth from day one


MindCrew Technologies


Agency job
Pune
8 - 12 yrs
₹10L - ₹15L / yr
Data engineering
Data modeling
Snowflake schema
ETL
ETL architecture
+3 more

Job Title: Lead Data Engineer

📍 Location: Pune

🧾 Experience: 10+ Years

💰 Budget: Up to 1.7 LPM


Responsibilities

  • Collaborate with Data & ETL teams to review, optimize, and scale data architectures within Snowflake.
  • Design, develop, and maintain efficient ETL/ELT pipelines and robust data models.
  • Optimize SQL queries for performance and cost efficiency.
  • Ensure data quality, reliability, and security across pipelines and datasets.
  • Implement Snowflake best practices for performance, scaling, and governance.
  • Participate in code reviews, knowledge sharing, and mentoring within the data engineering team.
  • Support BI and analytics initiatives by enabling high-quality, well-modeled datasets.
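As a rough illustration of the data-modeling responsibilities above, here is a minimal star-schema sketch (one fact table joined to one dimension) using in-memory SQLite; the table and column names are made up, and in Snowflake only the platform features would differ, not the modeling idea:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'West'), (2, 'East');
    INSERT INTO fact_sales VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 70.0);
""")

# Typical BI query: aggregate facts, grouped by a dimension attribute.
rows = con.execute("""
    SELECT d.region, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer d USING (customer_id)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # -> [('East', 70.0), ('West', 150.0)]
```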


Enqubes


Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Remote only
7 - 10 yrs
₹10L - ₹15L / yr
SAP BODS
SAP HANA
HANA
ETL management
ETL
+3 more

Job Title: SAP BODS Developer

  • Experience: 7–10 Years
  • Location: Remote (India-based candidates only)
  • Employment Type: Permanent (Full-Time)
  • Salary Range: ₹20 – ₹25 LPA (Fixed CTC)


Required Skills & Experience:

- 7–10 years of hands-on experience as a SAP BODS Developer.

- Strong experience in S/4HANA implementation or upgrade projects with large-scale data migration.

- Proficient in ETL development, job optimization, and performance tuning using SAP BODS.

- Solid understanding of SAP data structures (FI, MM, SD, etc.) from a technical perspective.

- Skilled in SQL scripting, error resolution, and job monitoring.

- Comfortable working independently in a remote, spec-driven development environment.


Metric Vibes


Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Noida
4 - 8 yrs
₹10L - ₹15L / yr
PowerBI
JavaScript
RESTful APIs
Embedded software
SQL
+9 more

Job Title: Tableau BI Developer

Years of Experience: 4–8 years

Engagement: $12 per hour (full-time equivalent)

Working hours: 8 hours per day


Required Skills & Experience:

✅ 4–8 years of experience in BI development and data engineering

✅ Expertise in BigQuery and/or Snowflake for large-scale data processing

✅ Strong SQL skills with experience writing complex analytical queries

✅ Experience in creating dashboards in tools like Power BI, Looker, or similar

✅ Hands-on experience with ETL/ELT tools and data pipeline orchestration

✅ Familiarity with cloud platforms (GCP, AWS, or Azure)

✅ Strong understanding of data modeling, data warehousing, and analytics best practices

✅ Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders

EnterpriseMinds


Posted by Rani Galipalli
Bengaluru (Bangalore), Pune, Mumbai
6 - 8 yrs
₹25L - ₹28L / yr
ETL
Informatica
Data Warehouse (DWH)
ETL management
SQL
+1 more

Your key responsibilities

 

  • Create and maintain optimal data pipeline architecture; build batch/real-time ETL pipelines and the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources
  • The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
  • Responsible for development, support, maintenance, and implementation of a complex project module
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support.
  • Deliver complete reporting solutions.
  • Prepare the high-level design (HLD) describing the application architecture.
  • Prepare the low-level design (LLD) describing job design, job descriptions, and detailed job information.
  • Prepare and execute unit test cases.
  • Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle
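The batch ETL pipeline work described above often uses a high-water-mark pattern for incremental extraction: each run pulls only rows updated since the last successful load. A stdlib-only sketch, with an in-memory stand-in for the source table (all names illustrative, not this project's schema):

```python
# Simulated source table; a real pipeline would query a database instead.
SOURCE = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "updated_at": "2024-01-02T00:00:00"},
    {"id": 3, "updated_at": "2024-01-03T00:00:00"},
]

def extract_incremental(rows, watermark):
    """Return rows newer than the watermark, plus the advanced watermark.

    ISO-8601 timestamps compare correctly as strings, so no parsing is needed.
    """
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_mark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_mark

batch, mark = extract_incremental(SOURCE, "2024-01-01T12:00:00")
print(len(batch), mark)  # -> 2 2024-01-03T00:00:00
```

Persisting `mark` after each successful load is what keeps reruns idempotent and cheap.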

Skills and attributes for success

 

  • Strong SQL experience: proficient in writing and debugging complex, performant SQL against large data volumes.
  • Strong experience with Microsoft Azure database systems; experienced with Azure Data Factory.
  • Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
  • Should have sufficient experience with PowerShell scripting
  • Able to guide the team through the development, testing and implementation stages and review the completed work effectively
  • Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
  • Primary owner of delivery and timelines; review code written by other engineers.
  • Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
  • Must have understanding of business intelligence development in the IT industry
  • Outstanding written and verbal communication skills
  • Should be adept in SDLC process - requirement analysis, time estimation, design, development, testing and maintenance
  • Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
  • Should be able to orchestrate and automate pipelines
  • Good to have: knowledge of distributed systems such as Hadoop, Hive, Spark

 

To qualify for the role, you must have

 

  • Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
  • More than 6 years of experience in ETL development projects
  • Proven experience in delivering effective technical ETL strategies
  • Microsoft Azure project experience
  • Technologies: ETL- ADF, SQL, Azure components (must-have), Python (nice to have)

 

Ideally, you’ll also have

codersbrain


Posted by Tanuj Uppal
Remote only
5 - 8 yrs
₹9L - ₹14L / yr
ETL
Informatica
Data Warehouse (DWH)
API
Microsoft Windows Azure
+2 more
Responsible for the design of solutions using Azure Data Integration Services. This will include:

  • Solution design using Microsoft Azure services and related tools.
  • Design of enterprise data models and Data Warehouse solutions.
  • Specification of ETL pipelines, data integration, and data migration design.
  • Design and implementation of Master Data Management solutions.
  • Specification of Data Quality Management methodology and supporting technology tools.
  • Working within a project management/agile delivery methodology in a leading role as part of a wider team.
  • Dealing with other stakeholders/end users in the software development lifecycle: Scrum Masters, Product Owners, Data Architects, and testing teams.
 
Specific technologies within the Microsoft Azure cloud stack include:

  • API development and APIM registration of REST/SOAP APIs.
  • Azure Service Bus messaging and subscribing solutions.
  • Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Logic Apps, Azure Functions.
  • Azure Storage, Azure SQL Data Warehouse/Synapse, Azure Data Lake.
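The Master Data Management item above typically involves a survivorship step that merges duplicate records into one golden record. A stdlib-only sketch (the matching key and field rules are assumptions for illustration, not a specific MDM product's behavior):

```python
# Duplicate customer records sharing a match key; names are illustrative.
records = [
    {"key": "C1", "email": "a@x.com", "phone": None,       "updated": 1},
    {"key": "C1", "email": None,      "phone": "555-0100", "updated": 2},
]

def golden_record(dups):
    """Merge duplicates: for each field, the newest non-null value survives."""
    merged = {}
    for rec in sorted(dups, key=lambda r: r["updated"]):  # newest applied last
        for field, value in rec.items():
            if value is not None:
                merged[field] = value
    return merged

print(golden_record(records))
```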