3+ ETL Jobs in Kolkata | ETL Job openings in Kolkata

Senior ETL Developer (SAS)
We are seeking a skilled and experienced ETL Developer with strong SAS expertise to join
our growing Data Management team in Kolkata. The ideal candidate will be responsible for
designing, developing, implementing, and maintaining ETL processes to extract, transform,
and load data from various source systems into the bank's data warehouse and other BFSI
data repositories. This role requires a strong understanding of banking data warehousing
concepts, ETL methodologies, and proficiency in SAS programming for data manipulation
and analysis.
Responsibilities:
• Design, develop, and implement ETL solutions using industry best practices and
tools, with a strong focus on SAS.
• Develop and maintain SAS programs for data extraction, transformation, and loading.
• Work with source system owners and data analysts to understand data requirements
and translate them into ETL specifications.
• Build and maintain data pipelines for the banking database to ensure data quality,
integrity, and consistency.
• Perform data profiling, data cleansing, and data validation to ensure accuracy and
reliability of data.
• Troubleshoot and resolve the bank's ETL-related issues, including data quality problems
and performance bottlenecks.
• Optimize ETL processes for performance and scalability.
• Document ETL processes, data flows, and technical specifications.
• Collaborate with other team members, including data architects, data analysts, and
business users.
• Stay up to date with the latest SAS-related ETL technologies and best practices,
particularly within the banking and financial services domain.
• Ensure compliance with data governance policies and security standards.
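On the job, the profiling and validation responsibilities above would be carried out in SAS (DATA steps, PROC MEANS/FREQ, and the like). Purely as an illustrative sketch of the idea, here is a minimal profiling pass in Python; the record layout and column names are hypothetical:

```python
# Minimal sketch of a data profiling / validation step, assuming records
# arrive as Python dicts. Column names ("account_id", "balance") are
# hypothetical; real work in this role would use SAS, not Python.

def profile_records(records, required_fields):
    """Count nulls and type mismatches per required field."""
    report = {field: {"nulls": 0, "bad_type": 0} for field in required_fields}
    for row in records:
        for field, expected_type in required_fields.items():
            value = row.get(field)
            if value is None:
                report[field]["nulls"] += 1
            elif not isinstance(value, expected_type):
                report[field]["bad_type"] += 1
    return report

accounts = [
    {"account_id": "A001", "balance": 1520.50},
    {"account_id": "A002", "balance": None},     # null balance
    {"account_id": None, "balance": "n/a"},      # null key, wrong type
]
report = profile_records(accounts, {"account_id": str, "balance": float})
print(report)
```

A report like this would feed the cleansing step: rows with null keys are quarantined, and type mismatches are routed back to the source-system owner.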
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as an ETL Developer, preferably within the banking or financial
services industry.
• Strong proficiency in SAS programming for data manipulation and ETL processes.
• Experience with other ETL tools (e.g., Informatica PowerCenter, DataStage, Talend)
is a plus.
• Solid understanding of data warehousing concepts, including dimensional modeling
(star schema, snowflake schema).
• Experience working with relational databases (e.g., Oracle, SQL Server) and SQL.
• Familiarity with data quality principles and practices.
• Excellent analytical and problem-solving skills.
• Strong communication and interpersonal skills.
• Ability to work independently and as part of a team.
• Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
• Understanding of regulatory requirements in the banking sector (e.g., RBI guidelines)
is an advantage.
Preferred Skills:
• Experience with cloud-based data warehousing solutions (e.g., AWS Redshift, Azure
Synapse, Google BigQuery).
• Knowledge of big data technologies (e.g., Hadoop, Spark).
• Experience with agile development methodologies.
• Relevant certifications (e.g., SAS Certified Professional).
What We Offer:
• Competitive salary and benefits package.
• Opportunity to work with cutting-edge technologies in a dynamic environment.
• Exposure to the banking and financial services domain.
• Professional development and growth opportunities.
• A collaborative and supportive work culture.

We are a rapidly expanding global technology partner looking for a highly skilled Senior (Python) Data Engineer to join our client's exceptional Technology and Development team. The role is based in Kolkata. If you are passionate about demonstrating your expertise and thrive on collaborating with a group of talented engineers, then this role was made for you!
At the heart of technology innovation, our client specializes in delivering cutting-edge solutions to clients across a wide array of sectors. With a strategic focus on finance, banking, and corporate verticals, they have earned a stellar reputation for their commitment to excellence in every project they undertake.
We are searching for an experienced Senior Data Engineer to strengthen our client's global projects team: someone with a strong background in building Extract, Transform, Load (ETL) processes and a deep understanding of AWS serverless cloud environments.
As a vital member of the data engineering team, you will play a critical role in designing, developing, and maintaining data pipelines that facilitate data ingestion, transformation, and storage for our organization.
Your expertise will contribute to the foundation of our data infrastructure, enabling data-driven decision-making and analytics.
Key Responsibilities:
- ETL Pipeline Development: Design, develop, and maintain ETL processes using Python, AWS Glue, or other serverless technologies to ingest data from various sources (databases, APIs, files), transform it into a usable format, and load it into data warehouses or data lakes.
- AWS Serverless Expertise: Leverage AWS services such as AWS Lambda, AWS Step Functions, AWS Glue, AWS S3, and AWS Redshift to build serverless data pipelines that are scalable, reliable, and cost-effective.
- Data Modeling: Collaborate with data scientists and analysts to understand data requirements and design appropriate data models, ensuring data is structured optimally for analytical purposes.
- Data Quality Assurance: Implement data validation and quality checks within ETL pipelines to ensure data accuracy, completeness, and consistency.
- Performance Optimization: Continuously optimize ETL processes for efficiency, performance, and scalability, monitoring and troubleshooting any bottlenecks or issues that may arise.
- Documentation: Maintain comprehensive documentation of ETL processes, data lineage, and system architecture to ensure knowledge sharing and compliance with best practices.
- Security and Compliance: Implement data security measures, encryption, and compliance standards (e.g., GDPR, HIPAA) as required for sensitive data handling.
- Monitoring and Logging: Set up monitoring, alerting, and logging systems to proactively identify and resolve data pipeline issues.
- Collaboration: Work closely with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, to understand data requirements and deliver solutions.
- Continuous Learning: Stay current with industry trends, emerging technologies, and best practices in data engineering and cloud computing and apply them to enhance existing processes.
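As a minimal sketch of the pipeline step described above, shaped the way an AWS Lambda handler is typically structured (the source fields, quality checks, and omitted S3/Redshift load here are illustrative assumptions, not the client's actual code):

```python
# Illustrative ETL step shaped as an AWS Lambda-style handler. In a real
# serverless pipeline, extract would read from S3 and load would COPY into
# Redshift (e.g. via AWS Glue); here both ends are stubbed so the
# transform/validate logic is visible and locally runnable.

import json

def transform(record):
    """Normalise one raw record into a hypothetical warehouse schema."""
    return {
        "customer_id": record["id"],
        "amount_usd": round(float(record["amount"]), 2),
        "currency": record.get("currency", "USD").upper(),
    }

def handler(event, context=None):
    """Validate and transform a batch; reject rows that fail checks."""
    loaded, rejected = [], []
    for raw in event.get("records", []):
        try:
            row = transform(raw)
            if row["amount_usd"] < 0:      # simple data-quality rule
                raise ValueError("negative amount")
            loaded.append(row)
        except (KeyError, ValueError, TypeError):
            rejected.append(raw)          # quarantined for inspection
    # The load step (e.g. COPY into Redshift) would go here.
    return {"loaded": len(loaded), "rejected": len(rejected), "rows": loaded}

result = handler({"records": [
    {"id": "C1", "amount": "19.995"},
    {"id": "C2", "amount": "-5", "currency": "eur"},  # fails quality rule
    {"amount": "10"},                                  # missing id
]})
print(json.dumps(result))
```

Keeping the transform pure like this makes it unit-testable outside AWS, which is the usual way serverless ETL code stays maintainable.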
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Proven experience as a Data Engineer with a focus on ETL pipeline development.
- Strong proficiency in Python programming.
- In-depth knowledge of AWS serverless technologies and services.
- Familiarity with data warehousing concepts and tools (e.g., Redshift, Snowflake).
- Experience with version control systems (e.g., Git).
- Strong SQL skills for data extraction and transformation.
- Excellent problem-solving and troubleshooting abilities.
- Ability to work independently and collaboratively in a team environment.
- Effective communication skills for articulating technical concepts to non-technical stakeholders.
- Certifications such as AWS Certified Data Analytics - Specialty or AWS Certified DevOps Engineer are a plus.
Preferred Experience:
- Knowledge of data orchestration and workflow management tools.
- Familiarity with data visualization tools (e.g., Tableau, Power BI).
- Previous experience in industries with strict data compliance requirements (e.g., insurance, finance) is beneficial.
What You Can Expect:
- Innovation Abounds: Join a company that constantly pushes the boundaries of technology and encourages creative thinking. Your ideas and expertise will be valued and put to work in pioneering solutions.
- Collaborative Excellence: Be part of a team of engineers who are as passionate and skilled as you are. Together, you'll tackle challenging projects, learn from each other, and achieve remarkable results.
- Global Impact: Contribute to projects with a global reach and make a tangible difference. Your work will shape the future of technology in finance, banking, and corporate sectors.
They offer an exciting and professional environment with great career and growth opportunities. Their office is located in the heart of Salt Lake Sector V, offering a terrific workspace that's both accessible and inspiring. Their team members enjoy a professional work environment with regular team outings. Joining the team means becoming part of a vibrant and dynamic team where your skills will be valued, your creativity will be nurtured, and your contributions will make a difference. In this role, you can work alongside some of the brightest minds in the industry.
If you're ready to take your career to the next level and be part of a dynamic team that's driving innovation on a global scale, we want to hear from you.
Apply today for more information about this exciting opportunity.
Onsite Location: Kolkata, India (Salt Lake Sector V)
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake.
- Exposure to change data capture (CDC) technology.
- Lead and guide development of an Informatica-based ETL architecture.
- Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Lead complex ETL requirements and design.
- Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements.
- Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine size of effort based on requirements.
- Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
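Informatica CDC itself is configured in the tool rather than hand-coded, but the merge logic a CDC feed drives can be sketched in a few lines. The insert/update/delete operation codes (I/U/D) below are a common CDC convention, not a product API, and the record layout is hypothetical:

```python
# Illustrative sketch of change-data-capture apply logic: each change row
# carries an operation code and the changed record, keyed on "id".
# This is a conceptual stand-in for what Informatica CDC + PowerCenter
# would do against the target warehouse table.

def apply_cdc(target, changes):
    """Apply insert/update/delete change rows to a target keyed by 'id'."""
    for change in changes:
        op, row = change["op"], change["row"]
        if op in ("I", "U"):          # insert or update -> upsert
            target[row["id"]] = row
        elif op == "D":               # delete
            target.pop(row["id"], None)
    return target

warehouse = {1: {"id": 1, "name": "alice"}}
warehouse = apply_cdc(warehouse, [
    {"op": "U", "row": {"id": 1, "name": "alicia"}},
    {"op": "I", "row": {"id": 2, "name": "bob"}},
    {"op": "D", "row": {"id": 1}},
])
print(warehouse)
```

Because only changed rows flow through, CDC keeps the warehouse current without full-table reloads, which is what makes the stringent performance requirements above achievable.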