11+ Water treatment Jobs in Pune | Water treatment Job openings in Pune

Apply to 11+ Water treatment Jobs in Pune on CutShort.io. Explore the latest Water treatment Job opportunities across top companies like Google, Amazon & Adobe.

Timble Technologies
Posted by Preeti Bisht
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Mumbai, Pune, Hyderabad, Nagpur, Chennai, Kolkata
0.6 - 6 yrs
₹1L - ₹5L / yr
Water treatment
FMCG
Sales
Field Sales
Territory management

Job Description: Field Sales Executive

Company: DetoXyFi Technologies PVT Ltd

Product: Jal Kavach (Portable Water Filters)


Role Objective

Drive the distribution and sales of Jal Kavach by building a robust network of dealers and distributors. You will be the face of the brand on the field, ensuring our life-saving technology reaches every household.


Key Responsibilities

  • Channel Expansion: Identify and appoint new dealers, distributors, and retail partners.
  • Sales Growth: Achieve primary and secondary sales targets within your assigned territory.
  • Demonstrations: Conduct product demos to showcase the efficiency of our low-cost filters.
  • Relationship Management: Maintain strong ties with partners and ensure consistent product stock.
  • Market Reporting: Track competitor trends and provide daily field activity reports.


Required Skills & Experience

  • Experience: 1–5 years in Field/Channel Sales (Water Purifier or FMCG background preferred).
  • Hustle: Proven track record of territory mapping and network building.
  • Travel: Must be comfortable with extensive daily field travel.
  • Communication: Strong negotiation skills in Hindi and local languages.


Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
7 - 10 yrs
₹21L - ₹30L / yr
Perforce
DevOps
Git
GitHub
Python

JOB DETAILS:

* Job Title: Specialist I - DevOps Engineering

* Industry: Global Digital Transformation Solutions Provider

* Salary: Best in Industry

* Experience: 7-10 years

* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram

 

Job Description

Job Summary:

As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.

The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.

 

Key Responsibilities:

  • Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
  • Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
  • Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
  • Define migration scope — determine how much history to migrate and plan the repository structure.
  • Manage branch renaming and repository organization for optimized post-migration workflows.
  • Collaborate with development teams to determine migration points and finalize migration strategies.
  • Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.
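As a concrete illustration of the large-file step above, here is a minimal stdlib-only sketch (assuming a locally checked-out workspace; the 100 MB threshold is GitHub's documented per-file limit) that flags files which would need Git LFS before migrated history is pushed:

```python
import os

# GitHub rejects any file over 100 MB unless it is tracked by Git LFS,
# so candidate files must be found before history is migrated.
GITHUB_LIMIT_BYTES = 100 * 1024 * 1024

def find_lfs_candidates(root: str, limit: int = GITHUB_LIMIT_BYTES):
    """Walk a checked-out workspace and list (path, size) pairs over the limit."""
    candidates = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Skip SCM metadata directories.
        dirnames[:] = [d for d in dirnames if d not in {".git", ".p4"}]
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > limit:
                candidates.append((path, size))
    # Largest files first, since they dominate migration effort.
    return sorted(candidates, key=lambda item: -item[1])
```

Each flagged path would then typically be registered with `git lfs track` before committing the migrated history.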

 

Required Qualifications:

  • Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
  • Hands-on experience with P4-Fusion.
  • Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
  • Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
  • Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
  • Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
  • Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
  • Familiarity with CI/CD pipeline integration to validate workflows post-migration.
  • Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
  • Excellent communication and collaboration skills for cross-team coordination and migration planning.
  • Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.

 

Skills: GitHub, Kubernetes, Perforce (Helix Core), DevOps tools

 

Must-Haves

Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)

Mumbai, Pune, Hyderabad, Bengaluru (Bangalore), Panchkula, Mohali
5 - 8 yrs
₹10L - ₹20L / yr
Python
FastAPI
Flask
Django
Git

Job Title: Python Developer (FastAPI)

Experience Required: 4+ years

Location: Pune, Bangalore, Hyderabad, Mumbai, Panchkula, Mohali 

Shift: Night shift, 6:30 PM to 3:30 AM IST

About the Role

We are seeking an experienced Python Developer with strong expertise in FastAPI to join our engineering team. The ideal candidate should have a solid background in backend development, RESTful API design, and scalable application development.


Required Skills & Qualifications

  • 4+ years of professional experience in backend development with Python.
  • Strong hands-on experience with FastAPI (or Flask/Django with migration experience).
  • Familiarity with asynchronous programming in Python.
  • Working knowledge of version control systems (Git).
  • Good problem-solving and debugging skills.
  • Strong communication and collaboration abilities.
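Since the role calls out asynchronous programming in Python, here is a minimal stdlib-only sketch of the pattern FastAPI endpoints rely on: `async def` handlers that await I/O concurrently. The lookup functions and latencies below are hypothetical stand-ins, not a real API.

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.05)          # stand-in for a database or HTTP call
    return {"id": user_id, "name": f"user-{user_id}"}

async def fetch_orders(user_id: int) -> list:
    await asyncio.sleep(0.05)          # another independent I/O call
    return [{"user_id": user_id, "order": n} for n in (1, 2)]

async def get_profile(user_id: int) -> dict:
    # Run both lookups concurrently; total latency ~= max of the two, not the sum.
    user, orders = await asyncio.gather(fetch_user(user_id), fetch_orders(user_id))
    return {"user": user, "orders": orders}

profile = asyncio.run(get_profile(7))
```

In FastAPI the same `async def` functions would be route handlers, with the framework's event loop doing the scheduling instead of `asyncio.run`.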

Wissen Technology
Posted by Shikha Srivastav
Pune
8 - 16 yrs
Best in industry
ZooKeeper
Kafka

Job Title: Kafka Architect

Experience range: 8+ years

Location:- Pune

Experience:

  • Kafka architecture with Kafka Connect, Kafka Streams, the broader Kafka ecosystem, and any scripting language.
  • Kafka brokers, Kafka Connect, Schema Registry, and ZooKeeper (or KRaft).

Ranka Jewellers
Posted by Ranka Jewellers
Pune
0 - 8 yrs
₹3L - ₹5L / yr
WordPress

As an eCommerce Website Developer at Ranka Jewellers, you will be responsible for the development, management, and optimization of our online jewelry store. You will collaborate with our design and marketing teams to ensure that our website reflects the luxurious and timeless nature of our products while providing a smooth, secure, and enjoyable shopping experience for customers.


Key Responsibilities:

  • Develop & Maintain: Build and manage the eCommerce website, ensuring it reflects the brand’s luxury image while being functional, intuitive, and easy to navigate.
  • Platform Expertise: Work with platforms such as [Shopify, WooCommerce, Magento, etc.], ensuring smooth integration with our existing systems and tools.
  • Responsive Design: Implement responsive designs that provide an optimal user experience across all devices—desktop, mobile, and tablet.
  • Payment & Shipping Integration: Ensure seamless integration of payment gateways (e.g., Razorpay, Paytm, etc.) and shipping providers, with secure transactions.
  • SEO & Performance Optimization: Implement best practices for eCommerce SEO to enhance online visibility and drive organic traffic. Optimize website performance, load speed, and security.
  • UI/UX Enhancements: Continuously improve the user interface (UI) and user experience (UX) of the website, aligning it with the luxury aesthetic of Ranka Jewellers while making the purchasing process as simple as possible.
  • Product Management: Regularly update and manage product listings, images, descriptions, pricing, and promotions, ensuring everything is accurate and up-to-date.
  • Analytics & Reporting: Monitor and analyze key website performance metrics (e.g., sales conversion, traffic) to inform decisions and identify areas for improvement.
  • Collaboration: Work closely with our marketing and design teams to create promotional campaigns, seasonal launches, and online events, ensuring technical execution aligns with the brand vision.


Requirements:

  • Proven experience in eCommerce website development, with a strong portfolio showcasing previous work (experience in jewelry or luxury goods is a plus).
  • Proficiency in web development languages like HTML, CSS, JavaScript, and PHP.
  • Familiarity with leading eCommerce platforms like Shopify, WooCommerce, Magento, or similar.
  • Strong knowledge of SEO principles and eCommerce-specific SEO strategies.
  • Experience integrating payment gateways, shipping solutions, and third-party plugins.
  • Familiarity with Google Analytics, Google Tag Manager, and other performance tracking tools.
  • Attention to detail with an eye for design, ensuring the website reflects the high-end nature of Ranka Jewellers.
  • Strong problem-solving skills with the ability to handle multiple tasks and deadlines effectively.
  • Ability to work independently while collaborating with cross-functional teams.


Preferred Qualifications:

  • Experience with advanced JavaScript frameworks (e.g., React, Vue).
  • Knowledge of cloud hosting platforms (AWS, Google Cloud, etc.) or website security best practices.
  • Understanding of the luxury jewelry market and the unique needs of online jewelry sales (e.g., high-resolution product photography, secure transactions, customization options).
  • Ability to suggest and implement innovative features to improve the shopping experience for customers.


Why Ranka Jewellers?

At Ranka Jewellers, you will have the opportunity to work with a passionate and talented team in a fast-growing company that values creativity, craftsmanship, and customer satisfaction. We offer a competitive salary, flexible work arrangements, and an opportunity to contribute to the digital transformation of a trusted brand in the jewelry industry.


Leading Product & Service Based

Agency job
via Tekfortune Inc by Ankit Uikey
Pune, Mohali
3 - 10 yrs
₹10L - ₹20L / yr
OTM
SQL
Oracle BI Publisher
Oracle ERP
Oracle Business Intelligence Suite Enterprise Edition (OBIEE)

Title: Technical Analyst - OTM

Experience: 3-9 Years

Work Location: Mohali

Shift Timing: Rotational Shift 24x5

 

Notice Period: Immediate to 30 days Max

 

Key Skills: OTM, OBIEE, BI Publisher, Oracle ERP

 

Job Description:

The Oracle Transportation Management Technical Analyst will share the responsibility for design, implementation, and support of business solutions based on Emerson’s instance of Oracle Transportation Management commonly referred to as SCO (Supply Chain Optimization). The Technical Analyst utilizes expertise in Oracle Transportation Management to provide assistance in the ongoing implementation, enhancement, and support of SCO functionality.

 

Roles and Responsibilities:

  • Provide support (e.g., break/fix, how-to expertise, enhancements, monitoring, testing, troubleshooting) for the SCO application.
  • Work collaboratively with Corporate Logistics and SCO IT Program/Project Managers to understand program requirements and assist with the evaluation of alternative solutions.
  • Assist with program rollout activities, including business unit and trading partner on-boarding, project coordination, status reporting, and communication to program management.
  • Proactively monitor processes to identify trends; analyze/predict trends and develop a long-range plan designed to resolve problems and prevent them from recurring, ensuring high service levels.
  • Ensure SCO system documentation is complete and maintained.
  • Work effectively in a global, highly matrixed team environment.

 

Skills & Experience Required:

  • 4 to 8 years of IT experience, including implementation of Oracle Transportation Management.
  • OTM Expert, both Functionally and technically (Setup configuration, Order Management, Shipment management, Financials, rates, master data, bulk planning parameters, VPDs, user configuration, screen set development, SQL queries, Tracking Events and working with CSV & XML files).
  • Hands on experience with triage of day-to-day OTM systems issues and providing resolution on complex issues.
  • Knowledge of Logistics management principles and processes.
  • Broad knowledge and experience with various ERP systems. Working knowledge of Oracle eBusiness Suite (Procurement, Shipping, XML Gateway) is highly preferred.
  • Working knowledge of BI Publisher, FTI/OAC, OBIEE and ETL.
  • Good knowledge of EDI and any other Middleware systems.
  • Strong customer service orientation with strong written and verbal communication skills, including comfort with presenting to diverse technical and non-technical audiences at all levels of the organization.
  • Ability to multi-task and work within diverse multi-disciplinary global project teams.
  • Detail-oriented with strong problem-solving skills.
  • Comfortable with performing detailed data analysis to identify opportunities and gain higher level insight.
  • Knowledge on GTM (Global Trade Management) will be a plus.
  • Some travel might be required.
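The skill list above mentions Tracking Events and working with XML files; as a stdlib-only sketch, here is how shipment statuses might be pulled out of a tracking-event payload. The element and field names are illustrative only, not the actual OTM GLogXML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical tracking-event transmission; real OTM payloads use the
# GLogXML schema with different element names and namespaces.
SAMPLE = """
<Transmission>
  <TrackingEvent><ShipmentGid>SHIP.001</ShipmentGid><StatusCode>DELIVERED</StatusCode></TrackingEvent>
  <TrackingEvent><ShipmentGid>SHIP.002</ShipmentGid><StatusCode>IN_TRANSIT</StatusCode></TrackingEvent>
</Transmission>
"""

def parse_tracking_events(xml_text: str) -> list:
    """Extract (shipment, status) records from a tracking-event XML payload."""
    root = ET.fromstring(xml_text)
    return [
        {"shipment": ev.findtext("ShipmentGid"), "status": ev.findtext("StatusCode")}
        for ev in root.iter("TrackingEvent")
    ]

events = parse_tracking_events(SAMPLE)
```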

 

Education

  • Bachelor’s degree in computer science, Information Systems, or another related field.


AxionConnect Infosolutions Pvt Ltd
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snowflake schema
SQL

Job Location: Hyderabad/Bangalore/ Chennai/Pune/Nagpur

Notice period: Immediate - 15 days

 

Python Developer with Snowflake

 

Job Description:

  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL; ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loads using Python.
  7. Experience in creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning is an added advantage.
  10. Good understanding of data warehouse (DWH) concepts.
  11. Ability to interpret/analyze business requirements and functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
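For the Snowflake-load-via-Python items, a small sketch of the statements a loader typically executes through snowflake-connector-python's cursor: stage the local file with PUT, then ingest it with COPY INTO. Table and path names are hypothetical, and no connection is made here.

```python
# PUT stages a local file into the table's internal stage (@%TABLE);
# COPY INTO then loads the staged file into the table. In a real job these
# strings are passed to snowflake.connector's cursor.execute().

def build_load_statements(local_path: str, table: str, file_format: str = "CSV") -> list:
    stage = f"@%{table}"  # the table's internal stage
    return [
        f"PUT file://{local_path} {stage} AUTO_COMPRESS=TRUE",
        f"COPY INTO {table} FROM {stage} FILE_FORMAT = (TYPE = {file_format})",
    ]

stmts = build_load_statements("/tmp/orders.csv", "ORDERS")
```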
YuktaMedia
Posted by Aditya Bhelande
Pune
3 - 5 yrs
₹4L - ₹6L / yr
ad serving
DFP
DART
DFA
Double Click Bid Manager
Solid hands-on experience (3-4 years) with DFP, AdX, and DDM. Should be comfortable setting up GPT tags, conversion pixels, audience pixels, etc., and setting up campaigns along with creatives. Technical debugging of ad calls. Should be familiar with inventory forecasting, ad unit and placement creation, optimization, etc. Be part of the growing technology team, with a tremendous opportunity to make an impact on the business and the ad-tech industry. Benefits: Build shit that matters! Experience the impact of your hard work. Work hard and party harder. Work with an extremely committed group of people. Explore and implement new technologies.
XpressBees
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest-growing companies of its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and our (RT, NRT, batch) reporting and dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do
  • Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
  • Meet data completeness, correctness and freshness requirements.
  • Evaluate and identify the data store and data streaming technology choices.
  • Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design and performance optimization technique/approach required for the problem.
  • Support your colleagues by reviewing code and designs.
  • Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
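The freshness requirement above can be made concrete with a small monitoring sketch: compare each dataset's last-load timestamp against its allowed staleness and flag violations. Dataset names and SLA values here are hypothetical.

```python
from datetime import datetime, timedelta, timezone

def stale_datasets(last_loaded: dict, sla: dict, now=None) -> list:
    """Return names of datasets whose last load exceeds their staleness SLA."""
    now = now or datetime.now(timezone.utc)
    return sorted(
        name for name, ts in last_loaded.items()
        if now - ts > sla.get(name, timedelta(hours=1))  # default 1h SLA
    )

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "shipments": now - timedelta(minutes=30),   # fresh vs its 1h SLA
    "hub_scans": now - timedelta(hours=3),      # stale vs the 1h default
}
slas = {"shipments": timedelta(hours=1)}
stale = stale_datasets(loads, slas, now=now)
```

In a real pipeline the timestamps would come from pipeline-run metadata, and a violation would page the on-call engineer or block downstream consumers.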

Qualifications & Experience Relevant for the Role

  • A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
  • Knowledge of relational and NoSQL data stores, stream processing and micro-batching to make technology and design choices.
  • Strong experience in system integration, application development, ETL and data-platform projects. Talented across technologies used in the enterprise space.
  • Expertise in relational and dimensional modelling.
  • Exposure across the full SDLC process.
  • Experience in cloud architecture (AWS).
  • Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
  • Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
  • Ability to work with cross-functional teams of consulting professionals across multiple projects.
  • Knack for helping an organization understand application architectures and integration approaches, architect advanced cloud-based solutions, and help launch the build-out of those systems.
  • Passion for educating, training, designing and building end-to-end systems.
Hexaware Technologies
Agency job
via telamonhr by Akanksha Saxena
Mumbai, Chennai, Pune, Bengaluru (Bangalore)
1 - 3 yrs
₹9L - ₹10L / yr
AngularJS (1.x)
React.js
TypeScript
Ember.js
Python

  1. Strong knowledge of front-end scripting such as EJS, JavaScript and jQuery.
  2. Proficiency with fundamental front-end languages such as HTML and CSS.
  3. Familiarity with JavaScript frameworks such as AngularJS, React and Ember.
  4. Proficiency with server-side languages such as Python, Ruby, Java, PHP or .Net.
  5. Good understanding of database technologies such as MySQL, Oracle and MongoDB.

DataMetica
Posted by Nikita Aher
Pune, Hyderabad
3 - 12 yrs
₹5L - ₹25L / yr
Apache Kafka
Big Data
Hadoop
Apache Hive
Java

Summary
Our Kafka developer combines technical skills, communication skills and business knowledge, and should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an Enterprise Data Warehouse (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.

 

Must Have Skills

  • Participate in the development, enhancement and maintenance of data applications, both as an individual contributor and as a lead.
  • Lead the identification, isolation, resolution and communication of problems within the production environment.
  • Act as a lead developer applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud Enterprise Data Warehouse such as Google BigQuery (preferred), AWS Redshift or Snowflake (optional).
  • Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
  • Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
  • Communicate and work with IT partners and the user community at various levels, from senior management to detailed developers to business SMEs, for project definition.
  • Work on multiple platforms and multiple projects concurrently.
  • Perform code and unit testing for complex-scope modules and projects.
  • Provide expertise and hands-on experience working on Kafka Connect using Schema Registry in a very high-volume environment (~900 million messages).
  • Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams and Kafka Control Center.
  • Provide expertise and hands-on experience working on AvroConverters, JsonConverters and StringConverters.
  • Provide expertise and hands-on experience working on Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, file stream connectors and JMS source connectors, along with tasks, workers, converters and transforms.
  • Provide expertise and hands-on experience with custom connectors built on Kafka core concepts and APIs.
  • Working knowledge of the Kafka REST proxy.
  • Ensure optimum performance, high availability and stability of solutions.
  • Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply knowledge of best practices.
  • Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms. Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver solutions using Spark, Scala, Python, Hive, Kafka and other components of the Hadoop ecosystem.
  • Use automation tools for provisioning, such as Jenkins, uDeploy or relevant technologies.
  • Ability to perform data-related benchmarking, performance analysis and tuning.
  • Strong skills in in-memory applications, database design and data integration.
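As a sketch of the connector work described above: Kafka Connect connectors are configured declaratively and registered via the Connect REST API as JSON. This assembles a config for Confluent's JDBC source connector class; the connection details and names are hypothetical, and nothing is actually submitted here.

```python
import json

def jdbc_source_config(name: str, jdbc_url: str, topic_prefix: str, tasks: int = 1) -> dict:
    """Build the JSON body POSTed to the Kafka Connect REST API (/connectors)."""
    return {
        "name": name,
        "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "tasks.max": str(tasks),            # Connect expects string config values
            "connection.url": jdbc_url,
            "mode": "incrementing",             # poll new rows by an incrementing column
            "incrementing.column.name": "id",
            "topic.prefix": topic_prefix,       # each table becomes <prefix><table>
        },
    }

cfg = jdbc_source_config("orders-source", "jdbc:postgresql://db:5432/shop", "pg.")
payload = json.dumps(cfg)
```

In practice `payload` would be POSTed to a Connect worker (e.g. `http://connect:8083/connectors`), with the same pattern extended to MQ, Elasticsearch or JMS connectors by swapping the connector class and its specific settings.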
Get to hear about interesting companies hiring right now
Follow Cutshort
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs