11+ Data Transformation Services Jobs in Hyderabad
Apply to 11+ Data Transformation Services Jobs in Hyderabad on CutShort.io. Explore the latest Data Transformation Services Job opportunities across top companies like Google, Amazon & Adobe.
Role & Responsibilities:
We are looking for a strong Data Engineer to join our growing team. The ideal candidate brings solid ETL fundamentals, hands-on pipeline experience, and cloud platform proficiency — with a preference for GCP / BigQuery expertise.
Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows
- Work with Dataform or dbt to implement transformation logic and data models (a minimal sketch follows this list)
- Develop and optimize data solutions on GCP (BigQuery, GCS) or AWS/Azure
- Support data migration initiatives and data mesh architecture patterns
- Collaborate with analysts, scientists, and business stakeholders to deliver reliable data products
- Apply data governance and quality best practices across the data lifecycle
- Troubleshoot pipeline issues and drive proactive monitoring and resolution
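To make the Dataform/dbt and BigQuery expectations concrete, here is a minimal, hedged sketch of an ELT transformation step run with the google-cloud-bigquery Python client; the project, dataset, and table names are illustrative placeholders, not part of this role's actual stack.

```python
# Minimal ELT sketch: run a transformation query in BigQuery and materialize
# the result into a reporting table. All project/dataset/table names are
# hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # placeholder project

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE reporting.daily_orders AS
SELECT
  order_date,
  customer_id,
  SUM(order_amount) AS total_amount,
  COUNT(*)          AS order_count
FROM raw.orders
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY order_date, customer_id
"""

def run_daily_transform() -> None:
    """Execute the transformation and block until the query job completes."""
    job = client.query(TRANSFORM_SQL)
    job.result()
    print(f"Transform finished, job id: {job.job_id}")

if __name__ == "__main__":
    run_daily_transform()
```

In a Dataform or dbt project, the same SELECT would typically live in a model file and the tool would handle materialization, dependencies, and scheduling.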
Ideal Candidate:
- Strong Data Engineer Profile
- Must have 6+ years of hands-on experience in Data Engineering, with strong ownership of end-to-end data pipeline development.
- Must have strong experience in ETL/ELT pipeline design, transformation logic, and data workflow orchestration.
- Must have hands-on experience with any one of the following: Dataform, dbt, or BigQuery, with practical exposure to data transformation, modeling, or cloud data warehousing.
- Must have working experience on any cloud platform: GCP (preferred), AWS, or Azure, including object storage (GCS, S3, ADLS).
- Must have strong SQL skills with experience in writing complex queries and optimizing performance.
- Must have programming experience in Python and/or SQL for data processing.
- Must have experience in building and maintaining scalable data pipelines and troubleshooting data issues.
- Exposure to data migration projects and/or data mesh architecture concepts.
- Experience with Spark / PySpark or large-scale data processing frameworks.
- Experience working in product-based companies or data-driven environments.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
NOTE:
- An interview drive is scheduled for 28th and 29th March 2026; shortlisted candidates are expected to be available on these interview dates. Only immediate joiners will be considered.
Key Skills:
• Hands-on experience with AWS services such as EC2, S3, Lambda, API Gateway, RDS, or DynamoDB ☁️
• Basic understanding of AI/ML concepts and experience with Python-based ML libraries (NumPy, Pandas, Scikit-learn, etc.); a toy sketch follows this list 🤖
• Experience in Python / Node.js / Java for backend development 💻
• Understanding of REST APIs and microservices architecture
• Familiarity with Git, CI/CD pipelines, and DevOps fundamentals
• Knowledge of Docker / containerization (preferred) 🐳
• Basic understanding of cloud security, IAM roles, and policies 🔐
• Experience in using AI tools (e.g., ChatGPT, GitHub Copilot, or similar tools) for development, debugging, documentation, and productivity in day-to-day tasks ⚡
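As a rough illustration of the Python ML-library familiarity asked for above, here is a toy scikit-learn sketch on made-up data; it only shows the basic fit/score workflow and is not tied to any specific project at the company.

```python
# Toy sketch of the NumPy / Pandas / scikit-learn basics mentioned above.
# The data is synthetic and purely illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical tabular data
df = pd.DataFrame({
    "feature_a": np.random.rand(100),
    "feature_b": np.random.rand(100),
})
df["label"] = (df["feature_a"] + df["feature_b"] > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["feature_a", "feature_b"]], df["label"], test_size=0.2, random_state=42
)

model = LogisticRegression().fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```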
Roles & Responsibilities:
• Develop and maintain cloud-based applications on AWS (a minimal Lambda sketch follows this list) ☁️
• Build and integrate APIs and backend services
• Assist in deploying, monitoring, and managing applications on AWS infrastructure
• Work with the team to integrate AI/ML models or AI-powered services into applications 🤖
• Utilize AI tools for coding assistance, debugging, automation, and improving development efficiency
• Optimize applications for performance, scalability, and reliability
• Collaborate with cross-functional teams for design, development, and deployment
• Troubleshoot and resolve cloud or application-related issues
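As a hedged sketch of the kind of AWS application work listed above, the snippet below shows a minimal Python Lambda handler that stores an API Gateway payload in DynamoDB; the table name and payload fields are hypothetical placeholders.

```python
# Hedged sketch of a small AWS Lambda handler behind API Gateway that persists
# the request body in DynamoDB. Table name and payload shape are placeholders.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Orders")  # hypothetical table

def lambda_handler(event, context):
    """Parse the API Gateway request body and store it as a DynamoDB item."""
    body = json.loads(event.get("body") or "{}")
    order_id = body.get("orderId", "unknown")
    table.put_item(Item={"orderId": order_id, "payload": body})
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "stored", "orderId": order_id}),
    }
```

In practice the function would be deployed through the CI/CD pipeline mentioned in the key skills, with an IAM role scoped to just the target table.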
AWS Certification is mandatory
Education Qualification:
B.Tech/M.Tech in CSE/IT/AI/ML/ECE
Strong Full-Stack Developer Profile
Mandatory (Experience 1) - Must have a minimum of 5+ YOE in Software Development.
Mandatory (Experience 2) - Must have 4+ YOE in backend development using Python (a minimal backend sketch follows this list).
Mandatory (Experience 3) - Must have good experience in frontend using React JS with knowledge of HTML, CSS, and JavaScript.
Mandatory (Experience 4) - Must have experience with any database - MySQL / PostgreSQL / Oracle / SQL Server / DB2 / MongoDB / Neo4j
Preferred
Preferred (Core Skill 1) - Expertise with any CI/CD tool, e.g. Jenkins, GitLab CI/CD, CircleCI, Google Cloud Build, AWS CodePipeline, Azure CI/CD, etc.
Preferred (Core Skill 2) - Expertise with any one cloud platform (Azure / AWS / Google Cloud)
Preferred (Company) - Product Company
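To illustrate the mandatory Python backend experience referenced above, here is a minimal sketch of a JSON endpoint a React frontend could call; Flask is only an illustrative choice, since the posting does not name a framework, and the in-memory data stands in for one of the databases listed.

```python
# Minimal sketch of a Python backend endpoint for a React frontend.
# Flask and the in-memory "USERS" store are illustrative assumptions only.
from flask import Flask, jsonify

app = Flask(__name__)

# In-memory stand-in for MySQL/PostgreSQL/etc. from the requirements above
USERS = {1: {"id": 1, "name": "Asha"}}

@app.get("/api/users/<int:user_id>")
def get_user(user_id: int):
    """Return a single user as JSON, or 404 if it does not exist."""
    user = USERS.get(user_id)
    if user is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(user)

if __name__ == "__main__":
    app.run(debug=True)
```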
Certification is mandatory – PD1 /PD2.
Lightning web component Development and debugging
Worked on XML, JavaScript, CSS2 & CSS3, HTML5 & jQuery
Integration and Dataloader experience
Apex Scripting and debugging
Salesforce Trigger
Flow design
Experience with MuleSoft integration
Integration via REST API / Lightning Message Service (a Python REST call sketch follows this list)
Sales Cloud knowledge – preferred
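As a hedged illustration of the REST API integration item above, the sketch below calls the Salesforce REST query endpoint from Python with the requests library; the instance URL, API version, and token handling are placeholders (a real integration would obtain the token via OAuth / a connected app, and might go through Apex or MuleSoft instead).

```python
# Hedged sketch of calling the Salesforce REST API from Python.
# Instance URL, API version, and access token are placeholders.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"   # hypothetical org
ACCESS_TOKEN = "<oauth-access-token>"                # obtained via OAuth separately

def query_accounts():
    """Run a simple SOQL query through the Salesforce REST API."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",
        params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

if __name__ == "__main__":
    for record in query_accounts():
        print(record["Id"], record["Name"])
```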

About NxtWave:
NxtWave is one of India’s fastest-growing edtech startups, transforming the way students learn and build careers in tech. With a strong community of learners across the country, we’re building cutting-edge products that make industry-ready skills accessible and effective at scale.
What will you do:
- Build and ship full-stack features end-to-end (frontend, backend, data).
- Own your code – from design to deployment with CI/CD pipelines.
- Make key architectural decisions and implement scalable systems.
- Lead code reviews, enforce clean code practices, and mentor SDE-1s.
- Optimize performance across frontend (Lighthouse) and backend (tracing, metrics).
- Ensure secure, accessible, and SEO-friendly applications.
- Collaborate with Product, Design, and Ops to deliver fast and effectively.
- Work in a fast-paced, high-impact environment with rapid release cycles.
What we are expecting:
- 3–5 years of experience building production-grade full-stack applications.
- Proficiency in React (or Angular/Vue), TypeScript, and Node.js / NestJS / Django / Spring Boot (a minimal Django sketch follows this list).
- Strong understanding of REST/GraphQL APIs, relational & NoSQL databases.
- Experience with Docker, AWS (Lambda, EC2, S3, API Gateway), Redis, Elasticsearch.
- Solid testing experience – unit, integration, and E2E (Jest, Cypress, Playwright).
- Strong problem-solving, communication, and team collaboration skills.
- Passion for learning, ownership, and building great software.
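As a small, hedged example of the backend/REST expectations above, here is a minimal Django endpoint (Django being one of the frameworks listed); the URL, view, and its use as a health check are illustrative assumptions only.

```python
# Minimal Django sketch: a JSON health-check endpoint wired into urlpatterns.
# Everything here (names, route) is illustrative, not from the posting.
from django.http import JsonResponse
from django.urls import path

def health(request):
    """Simple endpoint a load balancer or CI/CD smoke test could poll."""
    return JsonResponse({"status": "ok"})

urlpatterns = [
    path("health/", health, name="health"),
]
```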
Location: Hyderabad (In-office)
Apply here: https://forms.gle/QeoNC8LmWY6pwckX9
About Quadratyx:
We are a global product-centric insight & automation services company. We help the world’s organizations make better & faster decisions using the power of insight & intelligent automation. We build and operationalize their next-gen strategy through Big Data, Artificial Intelligence, Machine Learning, Unstructured Data Processing and Advanced Analytics. Quadratyx has more extensive experience in data sciences & analytics than most other companies in India.
We firmly believe in Excellence Everywhere.
Job Description
Purpose of the Job/ Role:
• As a Technical Lead, your work is a combination of hands-on contribution, customer engagement and technical team management. Overall, you’ll design, architect, deploy and maintain big data solutions.
Key Requisites:
• Expertise in Data structures and algorithms.
• Technical management across the full life cycle of big data (Hadoop) projects from requirement gathering and analysis to platform selection, design of the architecture and deployment.
• Scaling of cloud-based infrastructure.
• Collaborating with business consultants, data scientists, engineers and developers to develop data solutions.
• Experience leading and mentoring a team of data engineers.
• Hands-on experience in test-driven development (TDD).
• Expertise in NoSQL databases such as MongoDB and Cassandra (MongoDB preferred), and strong knowledge of relational databases.
• Good knowledge of Kafka and Spark Streaming internal architecture (a streaming sketch follows this list).
• Good knowledge of any Application Servers.
• Extensive knowledge of big data platforms such as Hadoop and Hortonworks.
• Knowledge of data ingestion and integration on cloud services such as AWS, Google Cloud, and Azure.
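To give a concrete flavour of the Kafka and Spark Streaming requirement noted above, here is a hedged PySpark Structured Streaming sketch; the broker address, topic name, and console sink are placeholders chosen for illustration, not part of any specific project here.

```python
# Hedged sketch of a Spark Structured Streaming job reading from Kafka.
# Broker address and topic are placeholders; a real job would write to a
# durable sink (HDFS, BigQuery, etc.) instead of the console.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Read a Kafka topic as a streaming DataFrame
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast the value to a string for processing
parsed = events.select(col("value").cast("string").alias("raw_event"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```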
Skills/ Competencies Required
Technical Skills
• Strong expertise (9 or more out of 10) in at least one modern programming language, such as Python or Java.
• Clear end-to-end experience in designing, programming, and implementing large software systems.
• Passion and analytical abilities to solve complex problems.
Soft Skills
• Always speaking your mind freely.
• Communicating ideas clearly in talking and writing, integrity to never copy or plagiarize intellectual property of others.
• Exercising discretion and independent judgment where needed in performing duties; not needing micro-management, maintaining high professional standards.
Academic Qualifications & Experience Required
Required Educational Qualification & Relevant Experience
• Bachelor’s or Master’s in Computer Science, Computer Engineering, or related discipline from a well-known institute.
• Minimum 7-10 years of work experience as a developer in an IT organization (preferably with an Analytics / Big Data / Data Science / AI background).
Job Description:
- Prospecting, generating, qualifying, and following up on leads and appointment setting for the external sales team
- Direct email marketing/cold calling to key clients and prospects
- Collaboratively work with the sales, and marketing team to develop lead generation strategies to generate lead opportunities with prospective customers
- Initiate lead/demand generation strategies that include inbound/outbound sales and marketing campaigns and initiatives
- Research, track, maintain, and update leads
- Develop a strong knowledge of the company’s products and services to facilitate the sales process
- Achieving sales lead generation and appointment quota
Key Responsibilities:
- Inside Sales
- Lead Generation
- Executive Email marketing and Cold calling
- International sales experience preferred
- Relevant IT/ERP sales experience preferred
Requirements:
- Minimum 2 years of Sales experience in the US market
- Bachelor's or relevant degree
- Need to work in US timings
- Immediate availability
1. You can rock with your expertise in fundamental front-end languages such as HTML, CSS, and JavaScript.
2. You can make value additions aided by your familiarity with advanced JavaScript libraries and frameworks such as React.
3. You have top-notch ability in server-side technologies such as Node.js and microservices.
4. Database technology such as MySQL also falls in your comfort zone.
5. You can enhance our world with your cloud experience, preferably AWS (EC2, RDS, S3, Lambda).
6. You take pride in your knowledge of code versioning tools such as Git or SVN.
7. You have handled third-party integrations such as payment gateways and plugins.
Responsibilities:
1. You’ll develop high-quality front-end architecture.
2. You’ll build solid back-end Microservices.
3. You’ll design and develop APIs and API documentation
4. You’ll help the team in designing and normalizing databases
5. You’ll ensure cross-platform optimization for web and mobile phones.
6. You’ll proactively ensure responsiveness of applications.
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier architecture.
- Should have experience in configuring Informatica Data Director in reference to the Data Governance of users, IT Managers, and Data Stewards.
- Should have good knowledge in developing complex PL/SQL queries.
- Should have working experience with UNIX and shell scripting to run the Informatica workflows and to control the ETL flow (a scripting sketch follows this list).
- Should know about Informatica Server installation and have knowledge of the Administration Console.
- Working experience with Informatica Developer along with Administration is an added advantage.
- Working experience with Amazon Web Services (AWS) is an added advantage, particularly S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
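As a hedged illustration of the UNIX/shell-scripting item above (running Informatica workflows and controlling the ETL flow), the sketch below wraps the PowerCenter pmcmd CLI from Python; the service, domain, folder, credentials, and exact flag usage are placeholders and may differ by PowerCenter version and environment.

```python
# Hedged sketch of kicking off an Informatica PowerCenter workflow via pmcmd.
# All names and credentials are placeholders; flags may vary by installation.
import subprocess

def start_workflow(folder: str, workflow: str) -> int:
    """Start a workflow via pmcmd and return the command's exit code."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IntSvc_Dev",    # Integration Service name (placeholder)
        "-d", "Domain_Dev",     # domain name (placeholder)
        "-u", "etl_user",       # credentials would normally come from a vault
        "-p", "********",
        "-f", folder,
        "-wait",                # block until the workflow completes
        workflow,
    ]
    result = subprocess.run(cmd)
    return result.returncode

if __name__ == "__main__":
    exit_code = start_workflow("C360_FOLDER", "wf_load_customer_360")
    print("pmcmd exit code:", exit_code)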
