4+ IBM InfoSphere DataStage Jobs in India

Senior ETL Developer (SAS)
We are seeking a skilled and experienced ETL Developer with strong SAS expertise to join
our growing Data Management team in Kolkata. The ideal candidate will be responsible for
designing, developing, implementing, and maintaining ETL processes to extract, transform,
and load data from various source systems into the banking data warehouse and other BFSI
data repositories. This role requires a strong understanding of banking data warehousing
concepts and ETL methodologies, and proficiency in SAS programming for data manipulation
and analysis.
Responsibilities:
• Design, develop, and implement ETL solutions using industry best practices and
tools, with a strong focus on SAS.
• Develop and maintain SAS programs for data extraction, transformation, and loading (see the sketch after this list).
• Work with source system owners and data analysts to understand data requirements
and translate them into ETL specifications.
• Build and maintain data pipelines for the banking database to ensure data quality,
integrity, and consistency.
• Perform data profiling, data cleansing, and data validation to ensure accuracy and
reliability of data.
• Troubleshoot and resolve the bank’s ETL-related issues, including data quality problems
and performance bottlenecks.
• Optimize ETL processes for performance and scalability.
• Document ETL processes, data flows, and technical specifications.
• Collaborate with other team members, including data architects, data analysts, and
business users.
• Stay up to date with the latest SAS-related ETL technologies and best practices,
particularly within the banking and financial services domain.
• Ensure compliance with data governance policies and security standards.
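To make the SAS ETL responsibility above concrete, here is a minimal, hedged sketch of a transform-and-load step in ANSI SQL; every table and column name (stg_transactions, dim_account, dw_fact_transactions, and so on) is a hypothetical example, not something from this posting. In SAS, a statement like this could run via PROC SQL or be pushed to the database with SQL pass-through.

    -- Illustrative only: all names below are invented for this sketch.
    INSERT INTO dw_fact_transactions (txn_id, account_key, txn_date, amount_inr)
    SELECT
        s.txn_id,
        a.account_key,                       -- surrogate key from the account dimension
        CAST(s.txn_ts AS DATE) AS txn_date,  -- normalize timestamp to date
        s.amount_inr
    FROM stg_transactions AS s
    JOIN dim_account AS a
      ON a.account_no = s.account_no
    WHERE s.amount_inr IS NOT NULL           -- basic data-quality filter
      AND s.txn_ts >= DATE '2024-01-01';     -- incremental load window

The date literal is a placeholder; a production job would drive the load window from a parameter or a control table.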
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as an ETL Developer, preferably within the banking or financial
services industry.
• Strong proficiency in SAS programming for data manipulation and ETL processes.
• Experience with other ETL tools (e.g., Informatica PowerCenter, DataStage, Talend)
is a plus.
• Solid understanding of data warehousing concepts, including dimensional modeling
(star schema, snowflake schema); see the query sketch after this list.
• Experience working with relational databases (e.g., Oracle, SQL Server) and SQL.
• Familiarity with data quality principles and practices.
• Excellent analytical and problem-solving skills.
• Strong communication and interpersonal skills.
• Ability to work independently and as part of a team.
• Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
• Understanding of regulatory requirements in the banking sector (e.g., RBI guidelines)
is an advantage.
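As a hedged illustration of the dimensional-modeling concepts listed above, the sketch below joins a hypothetical fact table to two dimensions in a star schema and aggregates a measure; fact_loan_balance, dim_date, and dim_branch are invented names used only for this example.

    -- Star-schema query sketch; all table and column names are hypothetical.
    SELECT
        d.calendar_month,
        b.branch_name,
        SUM(f.outstanding_amount) AS total_outstanding
    FROM fact_loan_balance AS f
    JOIN dim_date   AS d ON d.date_key   = f.date_key
    JOIN dim_branch AS b ON b.branch_key = f.branch_key
    GROUP BY d.calendar_month, b.branch_name;

The pattern generalizes: fact tables hold measures and foreign keys, dimension tables hold descriptive attributes, and queries join them on surrogate keys.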
Preferred Skills:
• Experience with cloud-based data warehousing solutions (e.g., AWS Redshift, Azure
Synapse, Google BigQuery).
• Knowledge of big data technologies (e.g., Hadoop, Spark).
• Experience with agile development methodologies.
• Relevant certifications (e.g., SAS Certified Professional).
What We Offer:
• Competitive salary and benefits package.
• Opportunity to work with cutting-edge technologies in a dynamic environment.
• Exposure to the banking and financial services domain.
• Professional development and growth opportunities.
• A collaborative and supportive work culture.
Role: ETL DataStage Developer
Experience: 5 years
Location: Bangalore (WFH as of now)
Roles:
Design, develop, and schedule DataStage ETL jobs to extract data from disparate source systems and transform and load it into the EDW for data mart consumption, self-service analytics, and data visualization tools.
Provide hands-on solutions to business challenges and translate them into process and technical solutions.
Conduct code reviews to communicate high-level design approaches with team members and to validate that strategic business needs and architectural guidelines are met.
Evaluate and recommend technical feasibility and effort estimates of proposed technology solutions. Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
Coordinate Control-M scheduler jobs and dependencies. Recommend and implement ETL process performance tuning strategies and methodologies. Conduct and support data validation, unit testing, and QA integration activities.
Compose and update technical documentation to ensure compliance with department policies and standards. Create transformation queries and stored procedures for ETL processes, and build development automations (a stored-procedure sketch follows).
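As a rough sketch of what such a transformation stored procedure could look like: SQL Server-style T-SQL is assumed here since the posting names no specific database, and usp_load_daily_sales, together with every table and column, is a hypothetical name invented for this example.

    -- Hypothetical T-SQL sketch of an ETL transformation procedure.
    CREATE PROCEDURE usp_load_daily_sales @load_date DATE
    AS
    BEGIN
        -- Delete any rows from a previous run of the same date (idempotent reload)
        DELETE FROM dw_daily_sales WHERE sale_date = @load_date;

        -- Transform and load from staging
        INSERT INTO dw_daily_sales (sale_date, store_key, net_amount)
        SELECT s.sale_date,
               st.store_key,
               SUM(s.gross_amount - s.discount) AS net_amount
        FROM stg_sales AS s
        JOIN dim_store AS st ON st.store_code = s.store_code
        WHERE s.sale_date = @load_date
        GROUP BY s.sale_date, st.store_key;
    END;

The delete-then-insert pattern keeps the load rerunnable; depending on volume, a team might instead prefer MERGE or partition switching.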
Interested candidates can forward their profiles.
- Must have 4 to 7 years of experience in ETL Design and Development using Informatica Components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
- Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be proficient in writing complex SQL queries (see the sketch after this list).
- Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies such as Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka, and would get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems.
- Will play an active role in setting up a modern data platform based on Cloud and Big Data.
- Would be part of teams with rich experience in various aspects of distributed systems and computing.
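To illustrate the complex-SQL expectation above, here is a hedged sketch using a CTE and a window function to pick each customer's latest transaction; the transactions table and all of its columns are hypothetical names invented for this example.

    -- Hypothetical complex-SQL sketch: CTE plus a window function.
    WITH ranked_txns AS (
        SELECT
            customer_id,
            txn_id,
            txn_ts,
            amount,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY txn_ts DESC
            ) AS rn                          -- 1 = most recent per customer
        FROM transactions
    )
    SELECT customer_id, txn_id, txn_ts, amount
    FROM ranked_txns
    WHERE rn = 1;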
Datametica is Hiring for DataStage Developer
- Must have 3 to 8 years of experience in ETL Design and Development using IBM DataStage Components.
- Should have extensive knowledge in Unix shell scripting.
- Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
- Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
- Should be proficient in writing complex SQL queries.
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools such as Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers: Google, Microsoft, Amazon, and Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why Join Us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years have been the key factors in our success.
Benefits We Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com