11+ Data Transformation Services Jobs in Pune
Apply to 11+ Data Transformation Services job openings in Pune on CutShort.io, covering the latest opportunities across top companies such as Google, Amazon, and Adobe.
ROLES AND RESPONSIBILITIES:
We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge and basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.
Key Responsibilities-
- Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
- Write basic SQL queries and transformations, and assist with Databricks notebook tasks.
- Help troubleshoot data issues and contribute to ensuring pipeline reliability.
- Work with senior engineers and analysts to understand data requirements and deliver small tasks.
- Assist in maintaining documentation, data dictionaries, and process notes.
- Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
- Support basic tasks related to Power BI data preparation or integrations as needed.
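The day-to-day SQL work described above can be sketched in miniature. This is an illustrative example only - the table, columns, and values are invented, and Python's built-in SQLite stands in for a warehouse engine such as Databricks SQL:

```python
import sqlite3

# Minimal ELT-style sketch: load raw rows, then transform with SQL.
# The table, columns, and values are illustrative; SQLite stands in
# for a warehouse engine here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "paid"), (2, 80.0, "cancelled"), (3, 45.5, "paid")],
)

# Transform step: aggregate only completed orders.
rows = conn.execute(
    """
    SELECT status, COUNT(*) AS n, SUM(amount) AS total
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY status
    """
).fetchall()
print(rows)  # [('paid', 2, 165.5)]
```

The same filter-then-aggregate pattern carries over directly to a Databricks notebook, where the query would run against managed tables instead of an in-memory database.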
IDEAL CANDIDATE:
- Foundational SQL skills with the ability to write and understand basic queries.
- Basic exposure to Databricks, data transformation concepts, or similar data tools.
- Understanding of ETL/ELT concepts, data structures, and analytical workflows.
- Eagerness to learn modern data engineering tools, technologies, and best practices.
- Strong problem-solving attitude and willingness to work under guidance.
- Good communication and collaboration skills to work with senior engineers and analysts.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified non-banking financial companies, and among Asia’s top 10 large workplaces. If you have the drive to get ahead, we can help you find an opportunity at any of our 500+ locations across India.
Job Summary:
We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.
Key Responsibilities:
- Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
- Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
- Develop and manage infrastructure using Terraform or CloudFormation.
- Manage and orchestrate containers using Docker and Kubernetes (EKS).
- Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
- Write robust automation scripts using Python and Shell scripting.
- Monitor system performance and availability, and ensure high uptime and reliability.
- Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
- Maintain clear documentation and provide technical support to stakeholders and clients.
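As a hedged illustration of the kind of automation script mentioned above, the sketch below shows a retry-with-exponential-backoff pattern common in deployment and health-check tooling. `check_service` is a hypothetical stand-in, not a real API; an actual script might probe an HTTP endpoint or shell out to a CLI instead:

```python
import time

# Illustrative automation sketch: retry a flaky operation with exponential
# backoff, a pattern common in deployment and health-check scripts.
def retry(fn, attempts=3, base_delay=0.01):
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # back off: 0.01s, 0.02s, ...

# check_service is a hypothetical probe used only for this demo.
calls = {"n": 0}

def check_service():
    calls["n"] += 1
    if calls["n"] < 3:  # fail twice, then succeed
        raise RuntimeError("service not ready")
    return "healthy"

print(retry(check_service))  # healthy
```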
Required Skills:
- Minimum of 4 years of experience in a DevOps or related role.
- Proven experience in client-facing engagements and communication.
- Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
- Proficiency in Infrastructure as Code using Terraform or CloudFormation.
- Hands-on experience with Docker and Kubernetes (EKS).
- Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub Actions.
- Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
- Proficient in Python and Shell scripting.
Preferred Qualifications:
- AWS certifications (e.g., AWS Certified DevOps Engineer) are a plus.
- Experience working in Agile/Scrum environments.
- Strong problem-solving and analytical skills.
Role Overview
You will play a crucial role in performance tuning of the product, focusing on response time, load, and scalability. The role demands hands-on expertise in performance testing tools and strong troubleshooting skills to collaborate effectively with development and architecture teams.
Key Responsibilities
- Design, develop, and execute performance test scripts using tools such as JMeter, LoadRunner, or RPT.
- Conduct multi-user scenario scripting and load/stress testing.
- Analyze performance test results and provide bottleneck analysis and recommendations.
- Collaborate with developers and architects to optimize performance across response time, scalability, and throughput.
- Monitor system health during performance testing and troubleshoot performance issues.
- Document test strategy, results, and provide actionable insights to improve product performance.
- Contribute to performance tuning and capacity planning for cloud-based applications (added advantage).
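To make the analysis step concrete, here is a minimal sketch of summarising latency samples the way a results review might. The numbers are made up rather than real JMeter/LoadRunner output, and the percentile uses a simple nearest-rank method:

```python
import math
import statistics

# Made-up response-time samples, in milliseconds.
samples_ms = [120, 135, 110, 480, 125, 130, 118, 122, 500, 128]

mean = statistics.mean(samples_ms)                                # average latency
p95 = sorted(samples_ms)[math.ceil(0.95 * len(samples_ms)) - 1]   # nearest-rank p95
outliers = [s for s in samples_ms if s > 2 * mean]                # crude bottleneck flag

print(f"mean={mean:.1f}ms p95={p95}ms outliers={outliers}")
# mean=196.8ms p95=500ms outliers=[480, 500]
```

In a real review, the gap between the mean and the p95 (here roughly 197 ms vs 500 ms) is the kind of signal that prompts deeper bottleneck analysis.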
Required Skills & Experience
- 6–8 years of overall engineering product testing experience.
- At least 2 years in automation testing.
- At least 2 years in performance test analysis.
- Hands-on expertise in JMeter, RPT, or LoadRunner (mandatory).
- Strong background in performance engineering with the ability to troubleshoot and solve technical issues.
- Experience in cloud-based applications (preferred).
- Knowledge of C++ coding will be an added advantage.
- Excellent skills in performance monitoring, profiling, and analysis.
- Strong communication skills for technical discussions with development and architecture teams.
Job Title: IAC SRE Engineer
Location: Pune, Mumbai, Bangalore
Experience Required: 4 Years
Role Overview:
We are looking for experienced IAC Engineers with a strong background in Akamai, Data Structures & Algorithms (DSA), Java, and DevSecOps. The ideal candidate should have hands-on development experience, be proficient in writing Infrastructure as Code using Terraform, and demonstrate strong problem-solving skills.
Core Skills:
- Akamai – Strong experience in CDN, caching, and performance optimization.
- Data Structures & Algorithms (DSA) – Strong problem-solving and coding abilities.
- Java – Solid programming background and experience in development.
- DevSecOps – Understanding of integrating security in CI/CD pipelines and infrastructure.
Good to Have:
- WAF (Web Application Firewall) – Knowledge of WAF is a plus, though not mandatory.
Additional Skills:
- Experience with SRE (Site Reliability Engineering) practices is beneficial.
- Strong hands-on with Terraform for managing cloud infrastructure.
Publicis Sapient Overview:
As Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python, along with experience in data ingestion, integration, and wrangling; computation and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role focuses on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
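As a toy sketch of the batch-ingestion bullet above, the following normalises the same entity arriving from two heterogeneous sources (CSV and JSON) into one record shape. The field names and values are invented for illustration:

```python
import csv
import io
import json

# Two heterogeneous sources carrying the same entity; contents are invented.
csv_source = "user_id,amount\n1,10.5\n2,7.0\n"
json_source = '[{"user_id": 3, "amount": 2.5}]'

# Normalise both sources into one record shape.
records = []
for row in csv.DictReader(io.StringIO(csv_source)):
    records.append({"user_id": int(row["user_id"]), "amount": float(row["amount"])})
for obj in json.loads(json_source):
    records.append({"user_id": int(obj["user_id"]), "amount": float(obj["amount"])})

total = sum(r["amount"] for r in records)
print(len(records), total)  # 3 20.0
```

A production pipeline would do the same shape-normalisation step with Spark or a similar framework, but the principle - map every source into one canonical schema before aggregation - is the same.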
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.
2. Minimum 2.5 years of experience in Big Data technologies, with working exposure to related data services on at least one cloud platform (AWS / Azure / GCP).
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
6. Well-versed, working knowledge of data-platform-related services on at least one cloud platform, including IAM and data security.
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks such as ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD – infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.
6. Cloud data specialty and other related Big Data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Experience:
- 4-10 years of full-stack application development experience is a must.
- Deep understanding of client-side development, coding in HTML5, CSS3, JavaScript (ES6), and jQuery.
- Strong experience using client-side frameworks such as React, Next.js, and Redux, along with Node.js.
- Top-notch programming skills and the ability to write code for robust software design.
- Experience in responsive design, cross-browser compatibility, and website performance.
- Strong object-oriented design skills.
- Aggressive problem diagnosis and creative problem-solving skills.
- Good understanding of how browsers and the DOM work.
- Experience with Git/Bitbucket.
- Experience working with services and RESTful APIs.
Responsibilities :
- Work with a cross-functional software development team on highly visible strategic projects as an expert-level individual contributor on assigned coding tasks.
- Develop highly responsive, web-based UIs within a flexible and well-structured frontend architecture.
- Work with remote data via REST and JSON.
- Work with third-party libraries and APIs.
- Produce well-designed, efficient code in a timely manner to meet delivery timescales/milestones.
- Mentor other developers on software best practices, provide technical guidance, and help grow their software development skill sets.
Responsibilities
- Use the latest processes and technology to build scalable, distributed, and fault-tolerant site/software systems.
- Research and evaluate new methodologies and technologies that improve the quality, reliability, and performance of the frontend engineering team's site/software development systems and processes.
- Apply usability principles and procedures and optimize builds. Promote reuse by documenting common frontend components that save the business time in executing future projects.
- Emphasize a consumer-focused approach in building sites/software, and work with both senior and junior frontend teams to prioritize and estimate new features and improvements.
Requirements/Qualification
- Minimum 6 years of experience in software engineering, with at least 3 years of frontend experience working in React with Redux.
- Highly proficient in CSS3, HTML5, and JavaScript; also proficient in tools like Bootstrap and Webpack for building products across multiple screen resolutions and browsers.
- Familiar with cross-browser compatibility issues, with demonstrated design and user interface/user experience skills.
- Excellent time management, multi-tasking, and communication skills.
Purpose of the Role
To design and implement the Peak AI System - a new system of intelligence that allows companies to quickly harness the power of AI.
The Opportunity
Peak is a Decision Intelligence company - we are on a mission to help organisations use AI to make great commercial decisions, all the time. Just as importantly, we are also focused on building an amazing company - one where we truly value our people and culture, and strive to make it an amazing and diverse place to work. Our recent Best Companies award and 3-star accreditation for being one of the top companies to work for is a testament to this.
We have ambitious plans over the coming years; to launch and lead a new category of technology (Decision Intelligence), expand our operations and create the best working culture possible. This is a great time to join Peak and the Engineering team, as we start the next stage of our global growth.
The Role
Based in Jaipur or Pune, you will be working in a collaborative team on cutting-edge technologies in a supportive and dynamic environment. Ultimately you are responsible for building the CODI and onboarding new clients - this involves:
- Developing a good understanding of the solutions which Peak delivers, and how these link to Peak’s overall strategy.
- Making suggestions towards shaping the strategy for a feature and engineering design.
- Managing your own workload and usually delivering unsupervised; accountable for your own workstream or the work of a small team.
- Understanding Engineering priorities and focusing on them, helping others remain focused too.
- Acting as the Lead Engineer on a project, helping ensure others follow Peak processes such as release and version control.
- Being an active member of the team through useful contributions to projects and in team meetings.
- Supervising others; deputising for a Lead and/or supporting them with tasks; mentoring new joiners/interns and Masters students; sharing knowledge and learnings with the team.
Required Skills and Experience
We are building a team of world-class engineers in Jaipur / Pune; essentially, we are looking for bright, talented engineers who want to work at the cutting edge of practical AI.
- Strong, proven professional programming experience.
- Strong command of Algorithms, Data structures, Design patterns, and Product Architectural Design.
- Good understanding of DevOps, cloud technologies, CI/CD, serverless, and Docker, preferably on AWS.
- Proven track record and expertise in at least one of DevOps, frontend, or backend.
- Excellent coding and debugging skills in any language, with command of at least one programming paradigm; JavaScript, Python, or Go preferred.
- Experience with at least one database system, whether RDBMS or NoSQL.
- Ability to document requirements and specifications.
- A naturally inquisitive and problem-solving mindset.
- Strong experience in using Agile or Scrum techniques to build quality software.
- Advantage: experience with React.js, AWS, Node.js, Golang, Apache Spark, ETL tools, or data integration systems; AWS certification; having worked at a product company and built a product from scratch; good communication skills; open-source contributions; and proven competitive coding experience.
As well as doing great work, we have created an award-winning, fun, and exciting workplace that people love to be part of. We are looking for people to join us who share our values and are:
- Open - always up for new ideas, and able to take and give feedback in a positive way.
- Driven - sets high goals, doesn’t give up, and makes sacrifices to ensure the job gets done on time and meets or exceeds expectations.
- Curious - aware of new technologies and uses them to make improvements in the Engineering ecosystem.
- Smart - innovative and thinks out of the box; in difficult situations, finds a way to succeed no matter what.
- Responsible - takes ownership of tasks given and has a strong work ethic.
About Peak
In an age when becoming AI- and data-driven is one of the most important things businesses must do, it can also be one of the most challenging. That’s where Peak comes in: our CODI system sits at the heart of our clients’ businesses, enabling the rapid unification, modelling and - most importantly - use of data, helping decision makers make great commercial decisions, powered by AI. All supported by our world-class data science team.
Founded in 2014, Peak has grown rapidly, in line with the world’s fastest-growing SaaS companies, winning numerous awards and attracting significant funding to support the company’s ongoing investment in machine learning and AI technologies. All to further our mission to become the world’s leading AI System business.
Headquartered in Manchester, Peak also has offices in London, Edinburgh, Jaipur and Brisbane. Our clients include some of the world's leading retailers, manufacturers and well-known brands alongside highly innovative and tech-savvy businesses. Peak is an Amazon Web Services (AWS) Partner, and holds Machine Learning Competency and Retail Competency status.





