11+ IBM InfoSphere DataStage Jobs in Hyderabad
Apply to 11+ IBM InfoSphere DataStage Jobs in Hyderabad on CutShort.io. Explore the latest IBM InfoSphere DataStage Job opportunities across top companies like Google, Amazon & Adobe.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI/ML, GenAI/LLM, and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
What You Will Do:
As a Data Governance Developer at Kanerika, you will be responsible for building and managing metadata, lineage, and compliance frameworks across the organization's data ecosystem.
Required Qualifications:
- 4 to 6 years of experience in data governance or data management.
- Strong experience in Microsoft Purview and Informatica governance tools.
- Proficient in tracking and visualizing data lineage across systems.
- Familiar with Azure Data Factory, Talend, dbt, and other integration tools.
- Understanding of data regulations: GDPR, CCPA, SOX, HIPAA.
- Ability to translate technical data governance concepts for business stakeholders.
Tools & Technologies:
- Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM Information Governance Catalog
- Experience in Microsoft Purview areas:
1. Label creation and policy management
2. Publish/Auto-labeling
3. Data Loss Prevention & Compliance handling
4. Compliance Manager, Communication Compliance, Insider Risk Management
5. Records Management, Unified Catalog, Information Barriers
6. eDiscovery, Data Map, Lifecycle Management, Compliance Alerts, Audit
7. DSPM, Data Policy
Key Responsibilities:
- Set up and manage Microsoft Purview accounts, collections, and access controls (RBAC).
- Integrate Purview with data sources: Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake.
- Schedule and monitor metadata scanning and classification jobs.
- Implement and maintain collection hierarchies aligned with data ownership.
- Design metadata ingestion workflows for technical, business, and operational metadata.
- Enrich data assets with business context: descriptions, glossary terms, tags.
- Synchronize metadata across tools using REST APIs, PowerShell, or ADF.
- Validate end-to-end lineage for datasets and reports (ADF → Synapse → Power BI).
- Resolve lineage gaps or failures using mapping corrections or scripts.
- Perform impact analysis to support downstream data consumers.
- Create custom classification rules for sensitive data (PII, PCI, PHI).
- Apply and manage Microsoft Purview sensitivity labels and policies.
- Integrate with Microsoft Information Protection (MIP) for DLP.
- Manage business glossary in collaboration with domain owners and stewards.
- Implement approval workflows and term governance.
- Conduct audits for glossary and metadata quality and consistency.
- Automate Purview operations using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Build pipelines for dynamic source registration and scanning.
- Automate tagging, lineage, and glossary term mapping.
- Enable operational insights using Power BI, Synapse Link, Azure Monitor, and governance APIs.
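The automation responsibilities above (scheduled scans, dynamic source registration, REST API synchronization) can be sketched with a small Python helper that starts a Purview scan run over the scanning REST API. Treat this as an illustrative sketch: the account name, data-source and scan names, token acquisition, and the `api-version` value are assumptions that may differ in your tenant.

```python
import uuid

# Hypothetical account name and assumed scanning API version -- placeholders,
# not working values for any real tenant.
PURVIEW_ACCOUNT = "contoso-purview"
API_VERSION = "2022-02-01-preview"

def scan_run_url(account: str, datasource: str, scan: str, run_id: str) -> str:
    """Build the URL for starting a scan run on a registered data source."""
    return (
        f"https://{account}.purview.azure.com/scan"
        f"/datasources/{datasource}/scans/{scan}/runs/{run_id}"
        f"?api-version={API_VERSION}"
    )

def trigger_scan(datasource: str, scan: str, token: str) -> str:
    """PUT an empty body to start the scan; returns the generated run id."""
    import requests  # external dependency: pip install requests
    run_id = str(uuid.uuid4())
    url = scan_run_url(PURVIEW_ACCOUNT, datasource, scan, run_id)
    resp = requests.put(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return run_id
```

In practice this kind of helper would be wrapped in an Azure Function or Logic App on a schedule, with the bearer token obtained from a service principal.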
We are looking for a highly motivated Sales Executive to maintain and develop B2B customer relationships within the food ingredients segment, including Bakery, Protein, Snacks, Premix, Beverages, RTD, RTE, QSR, and online food businesses. The candidate will drive introductions of new ingredients and concepts to existing and new customers, ultimately growing the business in line with company sales policies.
Key Responsibilities:
- Manage sales operations by exploring customer needs and requirements.
- Align sales strategy with marketing approach, including product/market segmentation, pricing, positioning, and business drivers.
- Continuously monitor market trends and competitors to provide value-added solutions.
- Translate sales budgets into actionable objectives, focusing on customer centricity.
- Take direct responsibility for sales targets and ensuring a superior customer experience.
- Drive new business development through concept selling and build a robust project pipeline for ingredients.
- Collaborate cross-functionally across the organization to create distinctive value.
Requirements
- Strong understanding of B2B sales in the food ingredients or related industry.
- Excellent communication, negotiation, and presentation skills.
- Ability to work independently and in a team, manage multiple accounts, and meet targets.
- Market awareness and strategic thinking to compete effectively.
Additional Notes:
- Freshers may apply for the Mumbai and New Delhi locations.
- Male candidates only.
Benefits
- Learning and professional development opportunities
- Exposure to on-ground marketing and customer interaction
- Career growth and advancement
📍 Location: Hyderabad
🏗️ Role: Planning & QS Engineer
🏘️ Project: 110 Triplex Villas (3.35 Lakh SFT)
🆕 Requirement: New | 📑 Reporting to: Project Manager
🎓 Education: BE Civil
🧪 Experience: 6+ yrs (QS, Billing & Planning)
✨ Key Skills:
🗓️ Planning: Master schedule prep, weekly tracking, slippage recovery
📂 Document Control: Drawings, DBR, MOM, approvals, review meetings, reports
📊 QS & Billing: Valuation, vendor bill certification, cost variation, budget tracking, EVA
⭐ Other: Strong communication, residential project exp, villa project preferred
Design and develop scalable web applications using MEAN/MERN stack.
Build & optimize AI/LLM workflows using LangChain or LangGraph.
Implement vector storage & semantic search using FAISS / Pinecone / Chroma / Milvus.
Build APIs, microservices, and integration layers.
Optimize application performance and ensure code quality.
Collaborate with cross-functional teams (product, design, backend, DevOps).
Must-Have Skills
- Strong experience in Node.js, Express.js, MongoDB, and Angular/React.
- Hands-on experience in LLM apps, RAG pipelines, Vector Databases.
- Practical knowledge of LangChain / LangGraph.
- Experience with REST APIs, authentication, and integrations.
- Solid understanding of Git, CI/CD, and cloud platforms (AWS/Azure/GCP).
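The RAG and vector-search skills above reduce to one core step: embed documents, embed the query, and return the nearest neighbours. Below is a dependency-light Python sketch of that retrieval step; the hashing "embedder" is a stand-in for a real embedding model, and the brute-force cosine search is what FAISS, Pinecone, Chroma, or Milvus replace at scale.

```python
import hashlib
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hash each token into a fixed-size bucket vector.
    Illustrative only -- a real pipeline would use a trained model."""
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Brute-force cosine-similarity search over the document set."""
    q = embed(query)
    scored = sorted(
        docs,
        key=lambda d: -sum(a * b for a, b in zip(q, embed(d))),
    )
    return scored[:k]

docs = [
    "Node.js backend services with Express",
    "Vector databases power semantic search",
    "Angular components for the dashboard UI",
]
print(top_k("semantic search with vector stores", docs, k=1))
```

In a full RAG pipeline the retrieved documents would then be stuffed into the LLM prompt as context; LangChain/LangGraph mainly orchestrate that chaining.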
Backend Architect:
Technology: Node.js, DynamoDB / MongoDB
Roles:
- Design and implement backend services.
- Redesign existing architecture where needed.
- Design and implement applications using MVC and microservice patterns.
- 9+ years of experience developing service-based applications using Node.js.
- Expert-level skills in developing web applications using JavaScript, CSS and HTML5.
- Experience working on teams that practice BDD (Behavior-Driven Development).
- Understanding of micro-service architecture and RESTful API integration patterns.
- Experience using Node.js for automation and leveraging NPM for package management.
- Solid object-oriented design experience, including creating and leveraging design patterns.
- Experience working in a DevOps/Continuous Delivery environment and its associated toolsets (e.g., Jenkins, Puppet)
Desired/Preferred Qualifications:
- Bachelor's degree or equivalent experience
- Strong problem solving and conceptual thinking abilities
- Desire to work in a collaborative, fast-paced, start-up like environment
- Experience leveraging Node.js frameworks such as Express.
- Experience with distributed source control management, e.g., Git
Must have very good experience in Java development.
Must have a very good understanding of data structures & algorithms.
Must have strong coding and problem-solving skills.
Must be strong in multithreading, the Collections framework, and OOP concepts.
Experience with Java 8 features is a plus.
We are seeking a highly skilled and experienced Offshore Data Engineer. The role involves designing, implementing, and testing data pipelines and products.
Qualifications & Experience:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
5+ years of experience in data engineering, with expertise in data architecture and pipeline development.
☁️ Proven experience with GCP services, BigQuery, Databricks, Airflow, Spark, and dbt.
Hands-on experience with ETL processes, SQL, PostgreSQL, MySQL, MongoDB, and Cassandra.
Strong proficiency in Python and data modelling.
Experience in testing and validation of data pipelines.
Preferred: Experience with eCommerce systems, data visualization tools (Tableau, Looker), and cloud certifications.
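The testing-and-validation requirement above often starts with simple row-level checks before data lands in the warehouse. A minimal, library-free Python sketch follows; the schema (order_id, amount, country) and its rules are invented for illustration, not taken from any specific system.

```python
# Row-level validation for a pipeline stage. Each rule maps a field name to
# a predicate the value must satisfy; the schema here is a made-up example.
RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country": lambda v: isinstance(v, str) and len(v) == 2,
}

def validate(rows):
    """Split rows into (good, bad); each bad entry carries its failing fields."""
    good, bad = [], []
    for row in rows:
        failures = [f for f, ok in RULES.items() if not ok(row.get(f))]
        (bad if failures else good).append((row, failures))
    return [r for r, _ in good], bad

rows = [
    {"order_id": 1, "amount": 9.5, "country": "IN"},
    {"order_id": -3, "amount": 5.0, "country": "India"},
]
good, bad = validate(rows)
print(len(good), len(bad))  # prints: 1 1
```

In a real pipeline the same pattern would run as a task in Airflow or a dbt test, routing bad rows to a quarantine table instead of failing the whole load.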
If you meet the above criteria and are interested, please share your updated CV along with the following details:
Total Experience:
Current CTC:
Expected CTC:
Current Location:
Preferred Location:
Notice Period / Last Working Day (if serving notice):
⚠️ Kindly share your details only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
About Monarch:
At Monarch, we’re leading the digital transformation of farming. Monarch Tractor augments both muscle and mind with fully loaded hardware, software, and service machinery that will spur future generations of farming technologies.
With our farmer-first mentality, we are building a smart tractor that will enhance (not replace) the existing farm ecosystem, alleviate labor availability and cost issues, and provide an avenue for competitive organic and beyond-organic farming by providing mechanical solutions to replace harmful chemical solutions. Despite all the cutting-edge technology we will incorporate, our tractor will still plow, till, and haul better than any other tractor in its class. We have all the necessary ingredients to develop, build and scale the Monarch Tractor and digitally transform farming around the world.
Job Description
DevOps:
- Implementing various development, testing and automation tools
- Setting up tools and required infrastructure
- Configuring and managing databases such as MySQL
- Defining and setting development, test, release, update, and support processes for DevOps operation
- Identify sources of manual work related to new features from inception to CI/CD pipeline and automate as much as possible.
- Monitoring processes across the entire lifecycle for adherence, and updating or creating new processes to drive improvement and minimize waste
Security:
- Identifying and deploying cybersecurity measures by continuously performing vulnerability assessment and risk management
- Incident management and root cause analysis
- Disaster Recovery
General:
- Coordination and communication within the team and with customers
- Thrive in a fast-paced environment and be able to own a project end to end with minimal hand-holding.
Required:
- Bachelor’s or Master’s degree in Engineering (ECE or CSE preferred)
- A minimum of 5 years’ work experience in the relevant field on Linux-based infrastructure
- Work experience with AWS (EC2, RDS, ELB, EBS, S3, VPC, Glacier, IAM, CloudWatch, KMS, and Cognito)
- Security suite implementation experience
- Bash and Python scripting
- Work experience in setting up VPN and VNC accounts
- CI/CD tools: Jenkins, GitLab CI/CD, and Spinnaker
- Automation Tools: Ansible, Docker, Kubernetes
- Version Control: Git / GitHub
- Knowledge of AWS IoT Core is a plus
What you will get:
At Monarch Tractor, you’ll play a key role on a capable, dedicated, high-performing team of rock stars. Our compensation package includes a competitive salary, excellent health, dental and vision benefits, and company equity commensurate with the role you’ll play in our success.
The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new software products and enhancements to existing products. You should excel at working with large-scale applications and frameworks and have outstanding communication and leadership skills.
Responsibilities
- Writing clean, high-quality, high-performance, maintainable code
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
- Support test and deployment of new products and features
- Participate in code reviews
Qualifications
- 5+ years of relevant work experience
- Mandatory experience in building scalable microservices on the Node.js platform
- Expertise in Object Oriented Design, Database Design, Service architecture
- Experience with Agile or Scrum software development methodologies
- Ability to multi-task, organize, and prioritize work
Job Description
- Software engineering experience focused on web development.
- Hands on experience in architecting large scale frontend applications.
- Detailed hands-on experience with cutting-edge web technologies (HTML5, CSS, JavaScript, TypeScript), application servers, and web applications
- Experience developing UIs with JavaScript frameworks (e.g., React, Angular, Vue, Bootstrap)





