
10+ Data engineering Jobs in Pune | Data engineering Job openings in Pune

Apply to 10+ Data engineering Jobs in Pune on Cutshort. Explore the latest Data engineering Job opportunities across top companies like Google, Amazon & Adobe.

Sahaj AI Software


1 video
6 recruiters
Posted by Soumya Tripathy
Bengaluru (Bangalore), Chennai, Pune
7 - 30 yrs
Best in industry
Machine Learning (ML)
Operations research
Applied mathematics
Applied Statistics
Data mining
+2 more

Job Description

Data scientist with a strong background in data mining, machine learning, recommendation systems, and statistics. Should possess the signature strengths of a qualified mathematician: the ability to apply concepts of mathematics and applied statistics, with specialisation in one or more of NLP, computer vision, speech, or data mining, to develop models that provide effective solutions. A strong data engineering background with hands-on coding capability is needed to own and deliver outcomes.

A Master’s or PhD degree in a highly quantitative field (Computer Science, Machine Learning, Operations Research, Statistics, Mathematics, etc.) or equivalent experience, plus 7+ years of industry experience in predictive modelling, data science and analysis, with prior experience in an ML or data scientist role and a track record of building ML or DL models.

Responsibilities and skills:

  • Work with our customers to deliver a ML / DL project from beginning to end, including understanding the business need, aggregating data, exploring data, building & validating predictive models, and deploying completed models to deliver business impact to the organisation.
  • Select features, and build and optimise classifiers using ML techniques.
  • Mine data using state-of-the-art methods; create text-mining pipelines to clean and process large unstructured datasets, revealing high-quality information and hidden insights using machine learning techniques.
  • Should be able to appreciate and work on:

Computer vision problems: for example, extracting rich information from images to categorise and process visual data, developing machine learning algorithms for object and image classification, and experience using DBSCAN, PCA, Random Forests and multinomial logistic regression to select the best features for classifying objects.


Deep understanding of NLP, including fundamentals of information retrieval, deep learning approaches, transformers, attention models, text summarisation, attribute extraction, etc. Experience in one or more of the following areas is preferable: recommender systems, moderation of user-generated content, sentiment analysis, etc.


Experience working in these areas: speech recognition, speech-to-text and vice versa, NLP and IR, text summarisation, and statistical and deep learning approaches to text processing.
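As a flavour of the text-mining pipeline work described above, here is a minimal cleaning-step sketch using only the Python standard library; the normalisation rules and stopword list are illustrative assumptions, not any company's actual pipeline.

```python
import re

def clean_text(raw: str) -> list[str]:
    """Normalise a raw document into a list of lowercase tokens."""
    text = raw.lower()                          # case-fold
    text = re.sub(r"<[^>]+>", " ", text)        # strip HTML remnants
    text = re.sub(r"https?://\S+", " ", text)   # drop URLs
    text = re.sub(r"[^a-z0-9\s]", " ", text)    # keep only alphanumerics
    return text.split()                         # whitespace tokenise

# Tiny illustrative stopword list (a real pipeline would use a fuller one).
STOPWORDS = {"the", "a", "an", "is", "of", "to"}

def pipeline(raw: str) -> list[str]:
    """Cleaning followed by stopword removal."""
    return [t for t in clean_text(raw) if t not in STOPWORDS]

print(pipeline("<p>The model is trained on http://example.com data!</p>"))
# → ['model', 'trained', 'on', 'data']
```

In practice each step would be a swappable stage (stemming, entity extraction, etc.) feeding a downstream model.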

  • Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, decision forests, etc. Familiarity with deep learning frameworks like MXNet, Caffe2, Keras and TensorFlow is expected.
  • Experience in working with GPUs to develop models and in handling terabyte-scale datasets.
  • Experience with common data science toolkits, such as R, Weka, NumPy, MATLAB, mlr, MLlib, scikit-learn, caret, etc.; excellence in at least one of these is highly desirable.
  • Should be able to work hands-on in Python, R, etc., and collaborate closely with engineering teams to iteratively analyse data using Scala, Spark, Hadoop, Kafka, Storm, etc.
  • Experience with NoSQL databases and familiarity with data visualisation tools will be of great advantage.
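The k-NN algorithm listed above can be sketched from first principles as a toy illustration (a real project would use scikit-learn or similar; the training points below are invented):

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point.
    dists = [(math.dist(p, query), y) for p, y in zip(train, labels)]
    dists.sort(key=lambda t: t[0])
    # Majority vote among the k closest labels.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Two toy clusters: class "a" near the origin, class "b" near (5, 5).
train = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(train, labels, (0.5, 0.5)))  # → a
print(knn_predict(train, labels, (5.5, 5.5)))  # → b
```

The same majority-vote structure underlies the library implementations; they add indexing (k-d trees, ball trees) so the neighbour search scales.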

Read more
Sahaj AI Software


1 video
6 recruiters
Posted by Priya R
Chennai, Bengaluru (Bangalore), Pune
8 - 14 yrs
Best in industry
Data engineering
Python
Scala
Apache Spark
+3 more

About Us

Sahaj Software is an artisanal software engineering firm built on the values of trust, respect, curiosity, and craftsmanship, delivering purpose-built solutions to drive data-led transformation for organisations. Our emphasis is on craft: we create purpose-built solutions, leveraging Data Engineering, Platform Engineering and Data Science with a razor-sharp focus to solve complex business and technology challenges and give customers a competitive edge.

About The Role

As a Data Engineer, you’ll feel at home if you are hands-on, grounded, opinionated and passionate about delivering comprehensive data solutions that align with modern data architecture approaches. Your work will range from building a full data platform to building data pipelines or helping with data architecture and strategy. This role is ideal for those looking to have a large impact and huge scope for growth, while still being hands-on with technology. We aim to allow growth without becoming “post-technical”.


  • Collaborate with Data Scientists and Engineers to deliver production-quality AI and Machine Learning systems
  • Build frameworks and supporting tooling for data ingestion from a complex variety of sources
  • Consult with our clients on data strategy, modernising their data infrastructure, architecture and technology
  • Model their data for increased visibility and performance
  • You will be given ownership of your work, and are encouraged to propose alternatives and make a case for doing things differently; our clients trust us and we manage ourselves.
  • You will work in short sprints to deliver working software
  • You will be working with other data engineers in Sahaj and work on building Data Engineering capability across the organisation


Skills you’ll need

  • Demonstrated experience as a Senior Data Engineer in complex enterprise environments
  • Deep understanding of technology fundamentals and experience with languages like Python, or functional programming languages like Scala
  • Demonstrated experience in the design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake
  • Strong skills in building data products by integrating large sets of data from hundreds of internal and external sources
  • A nuanced understanding of code quality, maintainability and practices like Test Driven Development
  • Ability to deliver an application end to end; having an opinion on how your code should be built, packaged and deployed using CI/CD
  • Understanding of Cloud platforms, DevOps, GitOps, and Containers
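As a flavour of the "frameworks and supporting tooling for data ingestion from a complex variety of sources" mentioned above, here is a minimal sketch of a pluggable source-registry pattern in Python; the format names and record shapes are invented for illustration, not Sahaj's actual tooling.

```python
import csv
import io
import json

READERS = {}

def register(fmt):
    """Decorator: register a reader function for a named source format."""
    def wrap(fn):
        READERS[fmt] = fn
        return fn
    return wrap

@register("csv")
def read_csv(payload: str):
    return list(csv.DictReader(io.StringIO(payload)))

@register("jsonl")
def read_jsonl(payload: str):
    return [json.loads(line) for line in payload.splitlines() if line.strip()]

def ingest(fmt: str, payload: str):
    """Dispatch to the reader registered for this format."""
    return READERS[fmt](payload)

rows = ingest("csv", "id,name\n1,ada\n2,grace\n")
rows += ingest("jsonl", '{"id": "3", "name": "edsger"}')
print(rows)
```

New sources then only need a decorated reader, which keeps the pipeline core untouched as the variety of inputs grows.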

What will you experience as a culture at Sahaj?

At Sahaj, the collective stands for a shared purpose where everyone owns the dreams, ideas, ideologies, successes, and failures of the organisation: a synergy rooted in the ethos of honesty, respect, trust, and equitability. At Sahaj, you will experience:

  • Creativity
  • Ownership
  • Curiosity
  • Craftsmanship
  • A culture of trust, respect and transparency
  • Opportunity to collaborate with some of the finest minds in the industry
  • Work across multiple domains

What are the benefits of being at Sahaj?

  • Unlimited leaves
  • Life insurance & private health insurance paid by Sahaj
  • Stock options
  • No hierarchy
  • Open salaries

Read more


Agency job
via Bohiyaanam Talent Solutions LLP by TrishaDutt Tekgminus
Pune, Mumbai, Bengaluru (Bangalore), Indore, Kolkata
6 - 7 yrs
₹12L - ₹18L / yr
Data engineering

We are looking for a Mulesoft Developer for a reputed MNC.


Experience: 6+ Years

Relevant experience: 4 Years

Location: Pune, Mumbai, Bangalore, Indore, Kolkata





Read more
Posted by Rohit S
Chennai, Pune, Bengaluru (Bangalore), Gurugram
11 - 16 yrs
₹20L - ₹32L / yr
Data Warehouse (DWH)
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Data engineering
Data migration
+1 more
• Engages with leadership of Tredence’s clients to identify critical business problems, define the need for data engineering solutions, and build strategy and roadmap
• Possesses wide exposure to the complete lifecycle of data, from creation to consumption
• Has built repeatable tools / data models to solve specific business problems
• Should have hands-on experience on projects (either as a consultant or within a company) that required them to:
  o Provide consultation to senior client personnel
  o Implement and enhance data warehouses or data lakes
  o Work with business teams, or as part of the team, on process re-engineering driven by data analytics/insights
• Should have a deep appreciation of how data can be used in decision-making
• Should have a perspective on newer ways of solving business problems, e.g. external data, innovative techniques, newer technology
• Must have a solution-creation mindset, with the ability to design and enhance scalable data platforms to address business needs
• Working experience with data engineering tools for one or more cloud platforms: Snowflake, AWS/Azure/GCP
• Engage with technology teams from Tredence and clients to create last-mile connectivity for the solutions
  o Should have experience working with technology teams
• Demonstrated thought leadership: articles/white papers/interviews
Mandatory Skills: Program Management, Data Warehouse, Data Lake, Analytics, Cloud Platform
Read more
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Data modeling
Data engineering
+1 more


• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ years total in IT, including 2+ years working as a Snowflake Data Architect and 4+ years in data warehouse, ETL and BI projects.
• Must have at least two end-to-end implementations of the Snowflake cloud data warehouse, and three end-to-end data warehouse implementations on-premises, preferably on Oracle.

• Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, Zero copy clone, time travel and understand how to use these features
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication
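To illustrate the zero-copy clone and Time Travel features listed above, here is a hedged sketch of the Snowflake SQL involved, wrapped in small Python helpers (the table names are invented; the statement shapes follow Snowflake's documented syntax):

```python
def clone_sql(src: str, dst: str) -> str:
    # Zero-copy clone: a metadata-only copy, so no data is duplicated
    # until either table diverges.
    return f"CREATE TABLE {dst} CLONE {src};"

def time_travel_sql(table: str, minutes: int) -> str:
    # Time Travel: query the table as it looked `minutes` ago,
    # using a negative offset in seconds.
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes});"

print(clone_sql("sales", "sales_backup"))
print(time_travel_sql("sales", 30))
```

In a real project these statements would be issued through the Snowflake connector or an orchestration tool, with the retention window sized via the table's `DATA_RETENTION_TIME_IN_DAYS`.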

Nice to have Skills/Qualifications:
• Bachelor's and/or master’s degree in Computer Science, or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.


About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

Read more
Celebal Technologies


2 recruiters
Posted by Payal Hasnani
Jaipur, Noida, Gurugram, Delhi, Ghaziabad, Faridabad, Pune, Mumbai
5 - 15 yrs
₹7L - ₹25L / yr
Data engineering
Big Data
+4 more
Job Responsibilities:

• Project Planning and Management
o Take end-to-end ownership of multiple projects / project tracks
o Create and maintain project plans and other related documentation for project objectives, scope, schedule and delivery milestones
o Lead and participate across all phases of software engineering, from requirements gathering to go-live
o Lead internal team meetings on solution architecture, effort estimation, manpower planning and resource (software/hardware/licensing) planning
o Manage RIDA (Risks, Impediments, Dependencies, Assumptions) for projects by developing effective mitigation plans
• Team Management
o Act as the Scrum Master
o Conduct Scrum ceremonies like Sprint Planning, Daily Standup and Sprint Retrospective
o Set clear objectives for the project and roles/responsibilities for each team member
o Train and mentor the team on their job responsibilities and Scrum principles
o Make the team accountable for their tasks and help the team achieve them
o Identify requirements and come up with a skill-development plan for all team members
• Communication
o Be the single point of contact for the client for day-to-day communication
o Periodically communicate project status to all stakeholders (internal/external)
• Process Management and Improvement
o Create and document processes across all disciplines of software engineering
o Identify gaps and continuously improve processes within the team
o Encourage team members to contribute towards process improvement
o Develop a culture of quality and efficiency within the team

Must have:
• Minimum 8 years of experience (hands-on as well as leadership) in software / data engineering across multiple job functions like Business Analysis, Development, Solutioning, QA, DevOps and Project Management
• Hands-on as well as leadership experience in Big Data Engineering projects
• Experience developing or managing cloud solutions using Azure or another cloud provider
• Demonstrable knowledge of Hadoop, Hive, Spark, NoSQL DBs, SQL, Data Warehousing, ETL/ELT and DevOps tools
• Strong project management and communication skills
• Strong analytical and problem-solving skills
• Strong systems level critical thinking skills
• Strong collaboration and influencing skills

Good to have:
• Knowledge of PySpark, Azure Data Factory, Azure Data Lake Storage, Synapse Dedicated SQL Pool, Databricks, Power BI, Machine Learning, Cloud Infrastructure
• Background in BFSI with focus on core banking
• Willingness to travel

Work Environment
• Customer Office (Mumbai) / Remote Work

• UG: B. Tech - Computers / B. E. – Computers / BCA / B.Sc. Computer Science
Read more
Mobile Programming India Pvt Ltd


1 video
17 recruiters
Posted by Pawan Tiwari
Remote, Bengaluru (Bangalore), Chennai, Pune, Gurugram, Mohali, Dehradun
4 - 7 yrs
₹10L - ₹15L / yr
Data engineering
Data Engineer
Django
Python

We are looking for a Data Engineer for our own organization.

Notice Period: 15-30 days
CTC: up to ₹15 LPA


Preferred Technical Expertise 

  1. Expertise in Python programming.
  2. Proficiency with the Pandas/NumPy libraries.
  3. Experience with the Django framework and API development.
  4. Proficiency in writing complex queries using SQL.
  5. Hands-on experience with Apache Airflow.
  6. Experience with source code versioning tools such as Git, Bitbucket, etc.
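As an example of the "complex queries using SQL" item above, here is a small self-contained sketch using SQLite from Python's standard library; the schema and data are invented for illustration. It joins two tables and aggregates to find each user's order total:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (user_id INTEGER, amount REAL);
    INSERT INTO users  VALUES (1, 'ada'), (2, 'grace');
    INSERT INTO orders VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Join + aggregate: total order amount per user, highest first.
query = """
    SELECT u.name, SUM(o.amount) AS total
    FROM users u
    JOIN orders o ON o.user_id = u.id
    GROUP BY u.id
    ORDER BY total DESC;
"""
totals = conn.execute(query).fetchall()
print(totals)  # → [('ada', 15.0), ('grace', 7.5)]
```

The same pattern (joins, grouping, ordering) carries over to production warehouses; only the connection object changes.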

 Good to have Skills:

  1. Create and maintain Optimal Data Pipeline Architecture
  2. Experienced in handling large structured data.
  3. Demonstrated ability in solutions covering data ingestion, data cleansing, ETL, Data mart creation and exposing data for consumers.
  4. Experience with any cloud platform (GCP is a plus)
  5. Experience with JQuery, HTML, Javascript, CSS is a plus.
If interested, kindly share your CV.
Read more

at DataMetica

1 video
7 recruiters
Posted by Nikita Aher
Pune, Hyderabad
7 - 12 yrs
₹12L - ₹33L / yr
Big Data
Apache Spark
Apache Hive
+3 more

Job description

Role: Lead Architect (Spark, Scala, Big Data/Hadoop, Java)

Primary Location : India-Pune, Hyderabad

Experience : 7 - 12 Years

Management Level: 7

Joining Time: Immediate Joiners are preferred

  • Attend requirements-gathering workshops, estimation discussions, design meetings and status review meetings
  • Experience of solution design and solution architecture for building and implementing Big Data projects on-premises and in the cloud
  • Align architecture with business requirements and stabilize the developed solution
  • Ability to build prototypes to demonstrate the technical feasibility of your vision
  • Professional experience facilitating and leading solution design, architecture and delivery planning activities for data-intensive and high-throughput platforms and applications
  • Able to benchmark systems, analyse system bottlenecks and propose solutions to eliminate them
  • Able to help programmers and project managers in the design, planning and governance of implementing projects of any kind
  • Develop, construct, test and maintain architectures, and run sprints for development and rollout of functionalities
  • Data analysis and code development experience, ideally in Big Data: Spark, Hive, Hadoop, Java, Python, PySpark
  • Execute projects of various types, i.e. design, development, implementation and migration of functional analytics models/business logic across architecture approaches
  • Work closely with business analysts to understand core business problems and deliver efficient IT solutions for the product
  • Deploy sophisticated analytics programs using cloud applications

Perks and Benefits we Provide!

  • Working with Highly Technical and Passionate, mission-driven people
  • Subsidized Meals & Snacks
  • Flexible Schedule
  • Approachable leadership
  • Access to various learning tools and programs
  • Pet Friendly
  • Certification Reimbursement Policy

Read more

at 1CH

1 recruiter
Posted by Sathish Sukumar
Chennai, Bengaluru (Bangalore), Hyderabad, NCR (Delhi | Gurgaon | Noida), Mumbai, Pune
4 - 15 yrs
₹10L - ₹25L / yr
Data engineering
Data engineer
+3 more
  • Expertise in designing and implementing enterprise scale database (OLTP) and Data warehouse solutions.
  • Hands-on experience in implementing Azure SQL Database, Azure SQL Data Warehouse (Azure Synapse Analytics) and big data processing using Azure Databricks and Azure HDInsight.
  • Expert in writing T-SQL programming for complex stored procedures, functions, views and query optimization.
  • Should be aware of Database development for both on-premise and SAAS Applications using SQL Server and PostgreSQL.
  • Experience in ETL and ELT implementations using Azure Data Factory V2 and SSIS.
  • Experience and expertise in building machine learning models using Logistic and linear regression, Decision tree  and Random forest Algorithms.
  • PolyBase queries for exporting and importing data into Azure Data Lake.
  • Building data models both tabular and multidimensional using SQL Server data tools.
  • Writing data preparation, cleaning and processing steps using Python, Scala, and R.
  • Programming experience using python libraries NumPy, Pandas and Matplotlib.
  • Implementing NoSQL databases and writing queries using Cypher.
  • Designing end user visualizations using Power BI, QlikView and Tableau.
  • Experience working with all versions of SQL Server 2005/2008/2008R2/2012/2014/2016/2017/2019
  • Experience using the expression languages MDX and DAX.
  • Experience in migrating on-premise SQL server database to Microsoft Azure.
  • Hands on experience in using Azure blob storage, Azure Data Lake Storage Gen1 and Azure Data Lake Storage Gen2.
  • Performance tuning complex SQL queries, hands on experience using SQL Extended events.
  • Data modeling using Power BI for Adhoc reporting.
  • Raw data load automation using T-SQL and SSIS
  • Expert in migrating existing on-premise database to SQL Azure.
  • Experience in using U-SQL for Azure Data Lake Analytics.
  • Hands on experience in generating SSRS reports using MDX.
  • Experience in designing predictive models using Python and SQL Server.
  • Developing machine learning models using Azure Databricks and SQL Server
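As a toy illustration of the linear-regression modelling mentioned above, here is a from-scratch ordinary-least-squares sketch (a real project would use scikit-learn, Azure Databricks or SQL Server ML services; the data points are invented):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form, one feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n            # feature and target means
    # Slope: covariance of x and y divided by variance of x.
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx                               # intercept from the means
    return a, b

# Points lying exactly on y = 2x + 1, so OLS recovers the coefficients.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # → 2.0 1.0
```

The closed form only works for a single feature; multiple features require the matrix normal equations or an iterative solver.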
Read more

at MedCords

6 recruiters
Posted by Monika Goel
Pune, Kota
3 - 10 yrs
₹10L - ₹35L / yr
Python
Go Programming (Golang)
Java
AWS CloudFormation
Object Oriented Programming (OOPs)
+8 more

Job Description-

Backend Developer- Senior


Experience - 3-6 years

Location: Pune/Kota


Minimum Qualifications:


- BE/B.Tech or ME/M.Tech in Computer Science.

- Must have a "can do" attitude towards work

- Must have 3-6 years of work experience

- Must have 1-2 years of programming experience in any of Python/Golang/Java

- Must have worked in a product-based company

- Ready to work in a startup and adaptable to a dynamic environment

- Ready to accept ad-hoc requirements and track them till they get implemented

- Ready to learn new technologies like Android, Angular, etc.

- Good at HTTP basics, OOPs concepts, data structures, algorithms, networking and security aspects

- Ability to write clean code and maintain it

- Good at SQL/No-SQL databases


Preferred Qualifications:

- Experience in any good product based startup

- Experience in working with the team and managing a small team of 2-5 associates

- Experience in being a mentor for co-developers

- Experience in designing/developing scalable systems.

- Experience in public cloud platforms services/APIs of AWS, Google Cloud, etc.

- Experience in data engineering

- Experience in SOA/Microservice architecture development



- Design and develop scalable services and APIs in Python/Golang

- Keep the services secure at all times

- Should optimize APIs for mobile data and apps

- Use off-the-shelf and state-of-the-art services for faster development of product

- Guide team members with designs

- Take the end to end ownership of features and resolve customer issues on priority

- Mentor/guide/monitor junior developer

- Gain expertise in Android/Angular to the required extent and guide app developers while designing APIs


Opportunities in the role:

- Learn Angular, Python, Node.js, Golang, the ELK stack, MEAN/MERN

- Work on AWS, Azure, Google Cloud Platform

- Work on databases like RDS, MongoDB, Big Table & DynamoDB, Redis, Aerospike

- Experience with SQL/ NoSQL Databases (RDS, DynamoDB, Google Datastore, Redis)

- Experience with ELK stack.

- Fast prototyping of proof-of-concept features/applications based on a brief

- Work on data engineering

Read more
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.