ETL Management Jobs

4+ ETL Management Jobs in India

Apply to 4+ ETL management jobs on CutShort.io. Find your next job, effortlessly. Browse ETL management jobs and apply today!

Enqubes

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Remote only
7 - 10 yrs
₹10L - ₹15L / yr
SAP BODS
SAP HANA
HANA
ETL management
ETL
+3 more

Job Title: SAP BODS Developer

  • Experience: 7–10 Years
  • Location: Remote (India-based candidates only)
  • Employment Type: Permanent (Full-Time)
  • Salary Range: ₹20 – ₹25 LPA (Fixed CTC)


Required Skills & Experience:

- 7–10 years of hands-on experience as a SAP BODS Developer.

- Strong experience in S/4HANA implementation or upgrade projects with large-scale data migration.

- Proficient in ETL development, job optimization, and performance tuning using SAP BODS.

- Solid understanding of SAP data structures (FI, MM, SD, etc.) from a technical perspective.

- Skilled in SQL scripting, error resolution, and job monitoring (see the sketch after this list).

- Comfortable working independently in a remote, spec-driven development environment.
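To illustrate the SQL-scripting and job-monitoring bullet above: a minimal sketch of a post-load row-count reconciliation check against SAP HANA, assuming the SAP-provided hdbcli Python client. The connection details and table names are hypothetical placeholders, not part of the original posting.

```python
from hdbcli import dbapi  # SAP HANA Python client

# Hypothetical connection details and table names -- adjust for the actual landscape.
HANA = dict(address="hana.example.com", port=30015, user="MIG_USER", password="***")
SOURCE_TABLE = "STAGE.CUSTOMER_MASTER"   # staging table loaded by the BODS job
TARGET_TABLE = "S4H.KNA1"                # target table after migration

def row_count(cursor, table: str) -> int:
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def reconcile() -> None:
    """Compare staged vs. loaded row counts and flag a mismatch for error resolution."""
    conn = dbapi.connect(**HANA)
    try:
        cur = conn.cursor()
        src, tgt = row_count(cur, SOURCE_TABLE), row_count(cur, TARGET_TABLE)
        status = "OK" if src == tgt else "MISMATCH - check rejected rows in the BODS job log"
        print(f"{SOURCE_TABLE}: {src} rows | {TARGET_TABLE}: {tgt} rows -> {status}")
    finally:
        conn.close()

if __name__ == "__main__":
    reconcile()
```

A check like this would typically run as a post-load validation step after each BODS job, alongside the job monitor and error logs.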


Metric Vibes

Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Noida
4 - 8 yrs
₹10L - ₹15L / yr
PowerBI
Tableau
JavaScript
RESTful APIs
Embedded software
+7 more

Job Title: Tableau BI Developer

Years of Experience: 4–8 years

Engagement: $12 per hour (FTE)

Working hours: 8 hours per day


Required Skills and Experience:

● 4–8 years of hands-on experience developing solutions using Tableau Desktop and Tableau Server/Tableau Cloud.

● Proven experience with embedding Tableau dashboards into portals, apps, or third-party systems using the JavaScript API, REST API, and other embedding techniques (see the sketch after this list).

● Proficient in writing complex SQL queries and working with large datasets.

● Strong experience with at least one RDBMS (e.g., Snowflake, Redshift, SQL Server, PostgreSQL, etc.).

● Familiarity with web technologies including JavaScript, HTML, and CSS for embedded visual customization.

● Experience working with data pipelines and ETL processes.

● Solid understanding of data visualization principles and storytelling.

● Ability to work independently and manage multiple projects with tight deadlines.

● Strong verbal and written communication skills, including experience working with non-technical stakeholders.
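As a sketch of the embedding requirement above: one common pattern for embedding Tableau dashboards into a portal is trusted authentication, where the portal's backend requests a ticket from Tableau Server and builds the view URL handed to an iframe or the Tableau JavaScript/Embedding API. A minimal Python sketch, assuming Tableau Server is configured for trusted tickets; the server URL, username, workbook, and view names are hypothetical placeholders.

```python
import requests

TABLEAU_SERVER = "https://tableau.example.com"   # hypothetical server URL
PORTAL_USER = "analyst@example.com"              # hypothetical Tableau username

def get_trusted_ticket(username: str) -> str:
    """Request a trusted ticket; the portal's web server must be whitelisted on Tableau Server."""
    resp = requests.post(f"{TABLEAU_SERVER}/trusted", data={"username": username}, timeout=10)
    resp.raise_for_status()
    ticket = resp.text.strip()
    if ticket == "-1":
        raise RuntimeError("Tableau Server refused to issue a trusted ticket")
    return ticket

def build_embed_url(ticket: str, workbook: str, view: str) -> str:
    """URL the portal places in an iframe or passes to the Tableau JavaScript/Embedding API."""
    return f"{TABLEAU_SERVER}/trusted/{ticket}/views/{workbook}/{view}?:embed=yes"

if __name__ == "__main__":
    ticket = get_trusted_ticket(PORTAL_USER)
    print(build_embed_url(ticket, "SalesWorkbook", "OverviewDashboard"))
```

Front-end customization (filters, parameter actions, event listeners) would then be layered on top with the JavaScript API, per the bullet above.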


EnterpriseMinds

Posted by Rani Galipalli
Bengaluru (Bangalore), Pune, Mumbai
6 - 8 yrs
₹25L - ₹28L / yr
ETL
Informatica
Data Warehouse (DWH)
ETL management
SQL
+1 more

Your key responsibilities

 

  • Create and maintain optimal data pipeline architecture, including batch and real-time ETL data pipelines, and build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • The individual will be responsible for solution design, integration, data sourcing, transformation, database design and implementation of complex data warehousing solutions.
  • Responsible for development, support, maintenance, and implementation of a complex project module
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Resolve variety of high impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support, and deliver complete reporting solutions.
  • Prepare the high-level design (HLD) describing the overall application architecture.
  • Prepare the low-level design (LLD) covering job design, job descriptions, and detailed job-level information.
  • Prepare and execute unit test cases.
  • Provide technical guidance and mentoring to application development teams throughout all the phases of the software development life cycle

Skills and attributes for success

 

  • Strong experience in SQL: proficient in writing and debugging complex, performant queries against large data volumes.
  • Strong experience with Microsoft Azure data services; experienced in Azure Data Factory (see the sketch after this skills list).
  • Strong in Data Warehousing concepts. Experience with large-scale data warehousing architecture and data modelling.
  • Should have hands-on experience with PowerShell scripting
  • Able to guide the team through the development, testing and implementation stages and review the completed work effectively
  • Able to make quick decisions and solve technical problems to provide an efficient environment for project implementation
  • Primary owner of delivery and timelines; review code written by other engineers.
  • Maintain highest levels of development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, writing clean, modular and self-sustaining code, with repeatable quality and predictability
  • Must have an understanding of business intelligence development in the IT industry
  • Outstanding written and verbal communication skills
  • Should be adept in the SDLC process: requirement analysis, time estimation, design, development, testing, and maintenance
  • Hands-on experience in installing, configuring, operating, and monitoring CI/CD pipeline tools
  • Should be able to orchestrate and automate pipelines
  • Good to have: knowledge of distributed systems such as Hadoop, Hive, and Spark
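To make the Azure Data Factory expectation above concrete, here is a minimal sketch of triggering and monitoring an existing ADF pipeline run with the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, and pipeline names are hypothetical placeholders, not taken from the posting.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Hypothetical identifiers -- replace with real subscription and resource names.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-etl-dev"
PIPELINE_NAME = "pl_daily_load"

def run_and_monitor() -> str:
    """Trigger an existing ADF pipeline and poll until it finishes."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={})
    while True:
        status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        if status not in ("InProgress", "Queued"):
            return status  # "Succeeded", "Failed", or "Cancelled"
        time.sleep(30)

if __name__ == "__main__":
    print(f"Pipeline finished with status: {run_and_monitor()}")
```

The same polling loop is what a CI/CD or orchestration job would wrap to automate pipeline runs end to end, as described in the list above.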

 

To qualify for the role, you must have

 

  • Bachelor's Degree in Computer Science, Economics, Engineering, IT, Mathematics, or related field preferred
  • More than 6 years of experience in ETL development projects
  • Proven experience in delivering effective technical ETL strategies
  • Microsoft Azure project experience
  • Technologies: ETL (ADF), SQL, Azure components (must-have), Python (nice to have)

 

Ideally, you’ll also have

codersbrain

Posted by Tanuj Uppal
Remote only
5 - 8 yrs
₹9L - ₹14L / yr
ETL
Informatica
Data Warehouse (DWH)
API
Microsoft Windows Azure
+2 more
Responsible for the design of solutions using Azure Data Integration Services. This will include:
 
  • Solution design using Microsoft Azure services and related tools.
  • Design of enterprise data models and Data Warehouse solutions.
  • Specification of ETL pipelines, data integration and data migration design.
  • Design and implementation of Master Data Management solutions.
  • Specification of Data Quality Management methodology and supporting technology tools.
  • Working within a project management/agile delivery methodology in a leading role as part of a wider team.
  • Dealing with other stakeholders/end users in the software development lifecycle – Scrum Masters, Product Owners, Data Architects, and testing teams.
 
Specific Technologies within the Microsoft Azure Cloud stack include:
 
  • API development and APIM registration of REST/SOAP APIs.
  • Azure Service Bus messaging and subscribing solutions (see the sketch after this list).
  • Azure Databricks, Azure Cosmos DB, Azure Data Factory, Azure Logic Apps, Azure Functions.
  • Azure Storage, Azure SQL Data Warehouse/Synapse, Azure Data Lake.
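Illustrating the Azure Service Bus item above: a minimal sketch of publishing and consuming queue messages with the azure-servicebus Python SDK. The connection string and queue name are hypothetical placeholders.

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Hypothetical values -- supply a real namespace connection string and queue name.
CONNECTION_STR = "Endpoint=sb://example.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
QUEUE_NAME = "etl-events"

def publish(payload: str) -> None:
    """Send one message to the queue."""
    with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
        with client.get_queue_sender(QUEUE_NAME) as sender:
            sender.send_messages(ServiceBusMessage(payload))

def consume() -> None:
    """Receive pending messages and settle them."""
    with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
        with client.get_queue_receiver(QUEUE_NAME, max_wait_time=5) as receiver:
            for msg in receiver:
                print(str(msg))
                receiver.complete_message(msg)

if __name__ == "__main__":
    publish('{"event": "load_finished", "rows": 125000}')
    consume()
```

In an integration design like the one described, a publisher such as a Data Factory or Logic App step would emit these events and downstream subscribers (Functions, Databricks jobs) would react to them.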