4+ IBM InfoSphere DataStage Jobs in India
Job Title: Senior ETL Developer (DataStage and SQL)
Location: Pune
Overview:
We’re looking for a Senior ETL Developer with 5+ years of experience in ETL development, strong DataStage and SQL skills, and a track record in complex data integration projects.
Responsibilities:
- Develop and maintain ETL processes using IBM DataStage and SQL for data warehousing.
- Write advanced SQL queries to transform and validate data (a minimal sketch follows this list).
- Troubleshoot ETL jobs, optimize performance, and ensure data quality.
- Document ETL workflows and adhere to coding standards.
- Lead and mentor junior developers, providing technical guidance.
- Collaborate with architects and analysts to deliver scalable solutions.
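To give a flavor of the transform-and-validate SQL this role calls for, here is a minimal sketch; the staging table and column names are hypothetical, and in practice such logic would typically live inside DataStage stages or database views:

```sql
-- Minimal sketch (hypothetical table/column names): cleanse staging rows
-- and tag each one with a data-quality status instead of silently dropping it.
SELECT
    TRIM(UPPER(o.customer_code))           AS customer_code,
    CAST(o.order_date AS DATE)             AS order_date,
    COALESCE(o.quantity, 0) * o.unit_price AS line_amount,
    CASE
        WHEN o.quantity IS NULL OR o.quantity <= 0 THEN 'INVALID_QTY'
        WHEN o.unit_price < 0                      THEN 'INVALID_PRICE'
        ELSE 'OK'
    END                                    AS dq_status
FROM stg_orders o
WHERE o.order_date IS NOT NULL;
```

Tagging rows rather than filtering them keeps rejected records visible for the data-quality work the responsibilities above mention.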
Qualifications:
- 5+ years in ETL development; 5+ years with IBM DataStage.
- Advanced SQL skills and experience with relational databases.
- Strong understanding of data warehousing and data integration.
- Experience in performance tuning and ETL process optimization.
- Team player with leadership abilities and excellent problem-solving skills.
Role: ETL DataStage Developer
Experience: 5 years
Location: Bangalore (WFH as of now).
Responsibilities:
- Design, develop, and schedule DataStage ETL jobs to extract data from disparate source systems and transform and load it into the EDW for data mart consumption, self-service analytics, and data visualization tools.
- Provide hands-on solutions to business challenges and translate them into process and technical designs.
- Conduct code reviews to communicate high-level design approaches and validate that strategic business needs and architectural guidelines are met.
- Evaluate and recommend the technical feasibility and effort estimates of proposed technology solutions.
- Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
- Coordinate Control-M scheduler jobs and their dependencies.
- Recommend and implement ETL process performance tuning strategies and methodologies.
- Conduct and support data validation, unit testing, and QA integration activities.
- Compose and update technical documentation to ensure compliance with department policies and standards.
- Create transformation queries and stored procedures for ETL processes and development automation (a stored-procedure sketch follows this list).
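As an illustration of the stored-procedure work listed above, here is a minimal sketch in PostgreSQL-flavored SQL; the procedure, table, and column names are hypothetical, and the actual dialect would depend on the project's database platform:

```sql
-- Minimal sketch (hypothetical object names, PostgreSQL dialect):
-- move validated staging rows into the warehouse table, then clear staging.
CREATE OR REPLACE PROCEDURE load_orders_from_staging()
LANGUAGE plpgsql
AS $$
BEGIN
    INSERT INTO dw_orders (customer_code, order_date, line_amount)
    SELECT customer_code, order_date, quantity * unit_price
    FROM stg_orders
    WHERE quantity > 0 AND unit_price >= 0;

    -- Staging is cleared only after the insert succeeds; both statements
    -- run in the same transaction by default.
    TRUNCATE TABLE stg_orders;
END;
$$;
```

Such a procedure could then be invoked from a DataStage after-job routine or a Control-M task with `CALL load_orders_from_staging();`.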
Interested candidates can forward their profiles.
- Must have 4 to 7 years of experience in ETL design and development using Informatica components.
- Should have extensive knowledge of Unix shell scripting.
- Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts); see the schema sketch after this list.
- Research, develop, document, and modify ETL processes per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of ETL processes and jobs.
- Should be proficient in writing complex SQL queries.
- Selected candidates will be given training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka, and will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems.
- Will play an active role in setting up a modern data platform based on Cloud and Big Data.
- Will be part of teams with rich experience in various aspects of distributed systems and computing.
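To make the dimensional-modelling requirement above concrete, here is a minimal star-schema sketch; all table and column names are hypothetical:

```sql
-- Minimal star-schema sketch (hypothetical names): one fact table
-- keyed to two dimension tables by surrogate keys.
CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,
    customer_code VARCHAR(20),
    customer_name VARCHAR(100),
    region        VARCHAR(50)
);

CREATE TABLE dim_date (
    date_key   INT PRIMARY KEY,   -- e.g. 20240131
    full_date  DATE,
    month_name VARCHAR(10),
    year_num   INT
);

CREATE TABLE fact_sales (
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key     INT REFERENCES dim_date (date_key),
    quantity     INT,
    line_amount  DECIMAL(12, 2)
);

-- A typical analytical query joins the fact table to its dimensions:
SELECT d.year_num, c.region, SUM(f.line_amount) AS revenue
FROM fact_sales f
JOIN dim_customer c ON c.customer_key = f.customer_key
JOIN dim_date d     ON d.date_key = f.date_key
GROUP BY d.year_num, c.region;
```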
Datametica is Hiring for DataStage Developer
- Must have 3 to 8 years of experience in ETL design and development using IBM DataStage components.
- Should have extensive knowledge of Unix shell scripting.
- Understanding of DW principles (fact and dimension tables, dimensional modelling, and data warehousing concepts).
- Research, develop, document, and modify ETL processes per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of ETL processes and jobs.
- Should be proficient in writing complex SQL queries (a window-function sketch follows this list).
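As one example of the complex SQL this posting asks for, here is a minimal sketch that keeps only the latest row per customer using a window function; the table and column names are hypothetical:

```sql
-- Minimal sketch (hypothetical names): pick the most recent order per
-- customer with ROW_NUMBER(), a common deduplication pattern in ETL.
SELECT customer_code, order_date, line_amount
FROM (
    SELECT
        customer_code,
        order_date,
        line_amount,
        ROW_NUMBER() OVER (
            PARTITION BY customer_code
            ORDER BY order_date DESC
        ) AS rn
    FROM dw_orders
) latest
WHERE rn = 1;
```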
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools like Informatica, DataStage, and Ab Initio, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, and Snowflake.
We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years have been key to our success.
Benefits We Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com