
11+ Fusion Jobs in Pune | Fusion Job openings in Pune

Apply to 11+ Fusion Jobs in Pune on CutShort.io. Explore the latest Fusion Job opportunities across top companies like Google, Amazon & Adobe.

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
7 - 10 yrs
₹21L - ₹30L / yr
Perforce
DevOps
Git
GitHub
Python
+7 more

JOB DETAILS:

* Job Title: Specialist I - DevOps Engineering

* Industry: Global Digital Transformation Solutions Provider

* Salary: Best in Industry

* Experience: 7-10 years

* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram

 

Job Description

Job Summary:

As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.

The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.

 

Key Responsibilities:

  • Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
  • Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
  • Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
  • Define migration scope — determine how much history to migrate and plan the repository structure.
  • Manage branch renaming and repository organization for optimized post-migration workflows.
  • Collaborate with development teams to determine migration points and finalize migration strategies.
  • Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.
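
A minimal Python sketch of the large-file handling described in the responsibilities above: it scans a working tree for files over GitHub's 100 MB per-file limit and prints suggested "git lfs track" patterns. The 100 MB threshold comes from the posting; the directory layout, output format, and pattern choice are assumptions for illustration, not part of the role.

```python
#!/usr/bin/env python3
"""Illustrative sketch: find files above GitHub's 100 MB limit and
suggest `git lfs track` patterns for them (paths and layout assumed)."""
import os
import sys

LIMIT_BYTES = 100 * 1024 * 1024  # GitHub's per-file hard limit


def find_large_files(repo_root: str):
    """Yield (relative_path, size) for files above LIMIT_BYTES, skipping .git."""
    for dirpath, dirnames, filenames in os.walk(repo_root):
        dirnames[:] = [d for d in dirnames if d != ".git"]
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > LIMIT_BYTES:
                yield os.path.relpath(path, repo_root), size


if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    extensions = set()
    for rel_path, size in find_large_files(root):
        print(f"{size / (1024 * 1024):8.1f} MB  {rel_path}")
        extensions.add(os.path.splitext(rel_path)[1] or rel_path)
    # Suggest LFS tracking patterns by extension (a common, but assumed, convention)
    for ext in sorted(extensions):
        pattern = f"*{ext}" if ext.startswith(".") else ext
        print(f"git lfs track '{pattern}'")
```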

 

Required Qualifications:

  • Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
  • Hands-on experience with P4-Fusion.
  • Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
  • Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
  • Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
  • Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
  • Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
  • Familiarity with CI/CD pipeline integration to validate workflows post-migration.
  • Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
  • Excellent communication and collaboration skills for cross-team coordination and migration planning.
  • Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.

 

Skills: GitHub, Kubernetes, Perforce (Helix Core), DevOps tools

 

Must-Haves

Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)

Service Co

Agency job
via Vikash Technologies by Rishika Teja
Pune
4 - 5 yrs
₹10L - ₹15L / yr
Amazon Web Services (AWS)
Terraform
IaC
Bash
Python
+3 more

  • Strong hands-on experience with AWS services.
  • Expertise in Terraform and IaC principles.
  • Experience building CI/CD pipelines and working with Git.
  • Proficiency with Docker and Kubernetes.
  • Solid understanding of Linux administration, networking fundamentals, and IAM.
  • Familiarity with monitoring and observability tools (CloudWatch, Prometheus, Grafana, ELK, Datadog).
  • Knowledge of security and compliance tools (Trivy, SonarQube, Checkov, Snyk).
  • Scripting experience in Bash, Python, or PowerShell.
  • Exposure to GCP, Azure, or multi-cloud architectures is a plus.
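
To ground the AWS and Python scripting bullets above, a minimal boto3 sketch that lists running EC2 instances. This is illustrative only; the region and the use of a "Name" tag are assumptions, not requirements stated in the listing.

```python
"""Illustrative only: list running EC2 instances with boto3 (region assumed)."""
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # region is an assumption

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            # Pull the Name tag if present; instances may be untagged
            name = next(
                (t["Value"] for t in instance.get("Tags", []) if t["Key"] == "Name"),
                "(unnamed)",
            )
            print(instance["InstanceId"], instance["InstanceType"], name)
```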

Deqode

Posted by Apoorva Jain
Pune
4 - 7 yrs
₹4L - ₹16L / yr
Amazon Web Services (AWS)
DevOps
Docker
Kubernetes
Jenkins
+1 more

Job Summary:

We are seeking a highly skilled and proactive DevOps Engineer with 4+ years of experience to join our dynamic team. This role requires strong technical expertise across cloud infrastructure, CI/CD pipelines, container orchestration, and infrastructure as code (IaC). The ideal candidate should also have direct client-facing experience and a proactive approach to managing both internal and external stakeholders.


Key Responsibilities:

  • Collaborate with cross-functional teams and external clients to understand infrastructure requirements and implement DevOps best practices.
  • Design, build, and maintain scalable cloud infrastructure on AWS (EC2, S3, RDS, ECS, etc.).
  • Develop and manage infrastructure using Terraform or CloudFormation.
  • Manage and orchestrate containers using Docker and Kubernetes (EKS).
  • Implement and maintain CI/CD pipelines using Jenkins or GitHub Actions.
  • Write robust automation scripts using Python and Shell scripting.
  • Monitor system performance and availability, and ensure high uptime and reliability.
  • Execute and optimize SQL queries for MSSQL and PostgreSQL databases.
  • Maintain clear documentation and provide technical support to stakeholders and clients.
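
A minimal Python sketch of the kind of availability monitoring and automation scripting the responsibilities above call for: it probes a list of HTTP endpoints and reports status and latency. The endpoints are placeholders and the output format is an assumption for illustration.

```python
"""Illustrative availability check using only the standard library (endpoints assumed)."""
import time
import urllib.error
import urllib.request

ENDPOINTS = [
    "https://example.com/health",   # placeholder URLs, not real services
    "https://example.org/status",
]


def check(url: str, timeout: float = 5.0):
    """Return (is_up, detail) for a single endpoint."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            elapsed = time.monotonic() - start
            return resp.status < 400, f"{resp.status} in {elapsed:.2f}s"
    except urllib.error.URLError as exc:
        # Covers HTTP errors (4xx/5xx), DNS failures, and timeouts
        return False, str(exc.reason)


if __name__ == "__main__":
    for endpoint in ENDPOINTS:
        up, detail = check(endpoint)
        print(f"{'UP  ' if up else 'DOWN'} {endpoint} ({detail})")
```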

Required Skills:

  • Minimum 4+ years of experience in a DevOps or related role.
  • Proven experience in client-facing engagements and communication.
  • Strong knowledge of AWS services – EC2, S3, RDS, ECS, etc.
  • Proficiency in Infrastructure as Code using Terraform or CloudFormation.
  • Hands-on experience with Docker and Kubernetes (EKS).
  • Strong experience in setting up and maintaining CI/CD pipelines with Jenkins or GitHub Actions.
  • Solid understanding of SQL and working experience with MSSQL and PostgreSQL.
  • Proficient in Python and Shell scripting.

Preferred Qualifications:

  • AWS Certifications (e.g., AWS Certified DevOps Engineer) are a plus.
  • Experience working in Agile/Scrum environments.
  • Strong problem-solving and analytical skills.


Hiret Consulting

Posted by Sanikha M
Pune
6 - 10 yrs
₹10L - ₹15L / yr
Windows Azure
Data Structures
Finance
Insurance

Roles and Responsibilities: 

▪ Data Pipeline Development: Build, deploy, and maintain efficient ETL/ELT pipelines using Azure Data Factory and Azure Synapse Analytics.
▪ We are only looking for senior candidates with over 5 years of relevant experience and ample client-facing experience.
▪ Finance/Insurance experience is also a must.
▪ Data Modelling & Warehousing: Design and optimize data models, warehouses, and lakes for structured/unstructured data.
▪ SQL & Query Optimization: Write complex SQL queries, optimize performance, and manage databases.
▪ Python Automation: Develop scripts for data processing, automation, and integration using Python (Pandas, NumPy).
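
A minimal Pandas sketch of the kind of data-processing script the Python Automation point above describes: clean a raw extract and write a monthly summary. The file names, column names, and claims-style dataset are assumptions chosen only to match the insurance context of the role.

```python
"""Illustrative sketch: clean a CSV extract and write a summarised output.
File and column names are assumptions, not taken from the posting."""
import pandas as pd

# Load a raw extract (path assumed for the example)
claims = pd.read_csv("claims_extract.csv", parse_dates=["claim_date"])

# Basic cleaning: drop exact duplicates and rows missing the claim amount
claims = claims.drop_duplicates().dropna(subset=["claim_amount"])

# Simple KPI-style aggregation: total, average, and count of claims per month
summary = (
    claims
    .assign(month=claims["claim_date"].dt.to_period("M"))
    .groupby("month")["claim_amount"]
    .agg(total="sum", average="mean", count="count")
    .reset_index()
)

summary.to_csv("claims_monthly_summary.csv", index=False)
print(summary.head())
```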

  

Technical Skills: 

▪ Cloud Technologies: Azure Synapse Analytics, Azure Fabric, Azure Databricks, and AWS (good to have)
▪ Knowledge of Python, PySpark, SQL, ETL concepts
▪ Good understanding of Insurance Operations and KPI reporting is an advantage.

Deqode

Posted by Mokshada Solanki
Posted by Mokshada Solanki
Bengaluru (Bangalore), Mumbai, Pune, Gurugram
4 - 5 yrs
₹4L - ₹20L / yr
SQL
Amazon Web Services (AWS)
Migration
PySpark
ETL

Job Summary:

Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.


Key Responsibilities:

  • Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
  • Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
  • Work on data migration tasks in AWS environments.
  • Monitor and improve database performance; automate key performance indicators and reports.
  • Collaborate with cross-functional teams to support data integration and delivery requirements.
  • Write shell scripts for automation and manage ETL jobs efficiently.
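
A minimal PySpark sketch of the kind of ETL step described above: read a raw dataset, apply a simple cleaning transformation, and write partitioned output. The paths, column names, and local SparkSession are assumptions for illustration; an AWS Glue job would typically obtain its Spark context from the Glue runtime and read/write S3 locations instead.

```python
"""Illustrative PySpark ETL sketch; paths and column names are assumptions."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Read a raw extract (local path assumed for the example)
orders = spark.read.option("header", "true").csv("data/raw/orders/")

# Basic cleaning and derivation
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .filter(F.col("order_amount").isNotNull())
    .withColumn("order_month", F.date_format(F.to_date(F.col("order_date")), "yyyy-MM"))
)

# Write partitioned Parquet for downstream queries
cleaned.write.mode("overwrite").partitionBy("order_month").parquet("data/curated/orders/")

spark.stop()
```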


Required Skills:

  • Strong experience with MySQL, complex SQL queries, and stored procedures.
  • Hands-on experience with AWS Glue, PySpark, and ETL processes.
  • Good understanding of AWS ecosystem and migration strategies.
  • Proficiency in shell scripting.
  • Strong communication and collaboration skills.


Nice to Have:

  • Working knowledge of Python.
  • Experience with AWS RDS.



Opcito Technologies

Posted by Aniket Bangale
Posted by Aniket Bangale
Pune
5 - 8 yrs
₹10L - ₹15L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Ansible
+2 more
We are looking for a DevOps Engineer with 4+ years of experience. He/she should be self-motivated, a go-getter, an out-of-the-box thinker, and ready to work in a high-energy environment. He/she must demonstrate a high level of ownership, integrity, and leadership, and be flexible and adaptive with a strong desire to learn and excel.

Required Skills:

  • Strong experience working with tools and platforms like Helm charts, Circle CI, Jenkins, and/or Codefresh
  • Excellent knowledge of AWS offerings around Cloud and DevOps
  • Strong expertise in containerization platforms like Docker and container orchestration platforms like Kubernetes & Rancher
  • Should be familiar with leading Infrastructure as Code tools such as Terraform, CloudFormation, etc.
  • Strong experience in Python, Shell Scripting, Ansible, and Terraform
  • Good command over monitoring tools like Datadog, Zabbix, ELK, Grafana, CloudWatch, Stackdriver, Prometheus, JFrog, Nagios, etc.
  • Experience with Linux/Unix systems administration.
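
As a small illustration of the Kubernetes automation these skills imply, a minimal sketch using the official Kubernetes Python client to list pods across namespaces. It assumes the "kubernetes" package is installed and a valid local kubeconfig exists; neither is specified by the listing.

```python
"""Illustrative sketch: list pods per namespace with the official
Kubernetes Python client (assumes a valid local kubeconfig)."""
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config by default
v1 = client.CoreV1Api()

pods = v1.list_pod_for_all_namespaces(watch=False)
for pod in pods.items:
    print(f"{pod.metadata.namespace:20s} {pod.metadata.name:50s} {pod.status.phase}")
```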
SL

Agency job
via Bohiyaanam Talent Solutions by Snehal Agarwal
Pune, Bengaluru (Bangalore), Nagpur, Indore, Goa
6 - 15 yrs
₹1L - ₹15L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
CI/CD
Hi All,

We are hiring a #DevOps Engineer for a reputed #MNC.

Job Description:
Total experience: 6+ years

Must have:
Minimum 3-4 years of hands-on experience in #Kubernetes and #Docker
Proficiency in #AWS Cloud
Good to have: Kubernetes admin certification

Job Responsibilities:
Responsible for managing Kubernetes clusters
Deploying infrastructure for the project
Building #CICD pipelines

Looking for #Immediate joiners only
Location: Pune
Salary: As per market standards
Mode: Work from office
Sela Technology Solutions Pvt Ltd
Posted by Himali Kirve
Remote, Pune
5 - 15 yrs
₹15L - ₹25L / yr
DevOps
Kubernetes
Terraform
Ansible
CI/CD
+2 more

DevOps Engineer position: 3+ years
Kubernetes, Helm: 3+ years (dev & administration)
Monitoring platform setup experience: Prometheus, Grafana
Azure/AWS/GCP cloud experience: 1+ years
Ansible/Terraform/Puppet: 1+ years
CI/CD: 3+ years
Hiring for one of the product-based orgs for PAN India

Agency job
via Natalie Consultants by Swati Bansal
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Ranchi, Patna, Kolkata, Gandhinagar, Ahmedabad, Indore, Lucknow, Bhopal, Jaipur, Jodhpur
2 - 6 yrs
₹3L - ₹10L / yr
DevOps
CI/CD
Docker
Kubernetes
Amazon Web Services (AWS)
+2 more
We are looking for a proficient, passionate, and skilled DevOps Specialist. You will have the opportunity to build an in-depth understanding of our client's business problems and then implement organizational strategies to resolve the same.

Skills required:
Strong knowledge and experience of cloud infrastructure (AWS, Azure or GCP), systems, network design, and cloud migration projects.
Strong knowledge and understanding of CI/CD processes and tools (Jenkins/Azure DevOps) is a must.
Strong knowledge and understanding of Docker & Kubernetes is a must.
Strong knowledge of Python, along with one more language (Shell, Groovy, or Java).
Strong prior experience using automation tools like Ansible, Terraform.
Architect systems, infrastructure & platforms using Cloud Services.
Strong communication skills. Should have demonstrated the ability to collaborate across teams and organizations.

Benefits of working with OpsTree Solutions:

Opportunity to work on the latest cutting edge tools/technologies in DevOps
Knowledge focused work culture
Collaboration with very enthusiastic DevOps experts
High growth trajectory
Opportunity to work with big shots in the IT industry
Saviant Consulting

Posted by Jyoti Raikar
Pune
7 - 15 yrs
₹10L - ₹20L / yr
DevOps
Kubernetes
Azure DevOps
Position: Technology Lead - DevOps

Position Summary:

The Technology Lead provides technical leadership with in-depth DevOps experience and is responsible for enabling delivery of high-quality projects to Saviant clients through a highly effective DevOps process. This is a highly technical role, with a focus on analyzing, designing, documenting, and implementing a complete DevOps process for enterprise applications using the most advanced technology stacks, methodologies, and best practices within the agreed timelines.
Individuals in this role will need to have good technical and communication skills, and strive to be on the cutting edge, innovate, and explore in order to deliver quality solutions to Saviant clients.

Your Role & Responsibilities at Saviant:

• Design, analyze, document, and develop the technical architecture for on-premise as well as cloud-based DevOps solutions around customers’ business problems.
• Lead end to end process and setup implementation of configuration management, CI, CD, and monitoring platforms.
• Conduct reviews of the design and implementation of DevOps processes while establishing and maintaining best practices
• Set up new processes to improve the quality of development, delivery, and deployment
• Provide technical support and guidance to project team members.
• Upskill by learning technologies beyond your traditional area of expertise
• Contribute to pre-sales, proposal creation, POCs, technology incubation from technical and architecture perspective
• Participate in recruitment and people development initiatives.
Job Requirements/Qualifications:
• Educational Qualification: BE, BTech, MTech, MCA from a reputed institute
• 6 to 8 years of hands-on experience of the DevOps process using technologies like .NET Core, Python, C#, MVC, ReactJS, Android, iOS, Linux, Windows
• Strong hands-on experience of the full life cycle of DevOps: orchestration, configuration, security, CI/CD, release management, and environment management
• Solid hands-on knowledge of DevOps technologies and tools such as Jenkins, Spinnaker, Azure DevOps, Chef, Puppet, JIRA, TFS, Git, SVN, various scripting tools, etc.
• Solid hands-on knowledge of containerization technologies and tools such as Docker, Kubernetes, Cloud Foundry
• In-depth understanding of various development and deployment architectures from a DevOps perspective
• Expertise in grounds-up DevOps projects involving multiple agile teams spread across geographies
• Experience with various Agile project management software, techniques, and tools
• Strong analytical and problem-solving skills
• Excellent written and oral communication skills
• Enjoys working as part of agile software teams in a startup environment.

Who Should Apply?

• You have independently managed end-to-end DevOps projects over the last 2 years, including understanding requirements, designing solutions, implementing them, and setting up best practices across different business domains.
• You are well versed with Agile development methodologies and have successfully implemented them across at least 2-3 projects.
• You have led a development team of 5 to 8 developers with technology responsibility.
• You have served as the "Single Point of Contact" for managing technical escalations and decisions.
Druva Software

Posted by Manjiri Milind Agashe
Pune
5 - 10 yrs
₹10L - ₹25L / yr
DevOps
Python
AWS CloudFormation
Kubernetes
Terraform
+1 more
Profile: Sr. DevOps Engineer
Experience: 5-8 years

The DevOps team at Druva is chartered with developing infrastructure code that is foundational to the deployment and operations of Druva's SaaS service. The DevOps team additionally enables Druva engineers to rapidly innovate by building tools that provide a simple, fast, and robust developer experience by simulating a cloud in a box. Our focus centers on creating tooling that streamlines development, testing, building, integration, packaging, and deployment of mutable and immutable artifacts.

DevOps engineers are involved in the full life cycle of the application. You will be responsible for the design and implementation of the application's build, release, deployment, and configuration activities, as well as contribute to defining the deployment architecture of Druva's SaaS service. You will automate and streamline our operations and the processes involved in those activities. You will leverage existing tools and technologies, preferably open source ones, to build the infrastructure applications needed to support deployment, operation, and monitoring of Druva's SaaS service. At the same time, you won't limit yourself from building such tools whenever off-the-shelf tools aren't adequate. You will continuously focus on improving the deployment design, and troubleshoot and resolve issues in our dev, test, and production environments.

Qualifications
• 5-8 years of experience in designing and developing large-scale infrastructure applications that help deploy and smoothly operate a SaaS service.
• Experience with a wide variety of open source tools and technologies relevant to deployment on a cloud, including deployment frameworks like Docker Swarm and containers, Kubernetes or equivalent, is a must.
• Experience with configuration management using Salt, Puppet, Chef, or equivalent.
• Experience working with AWS is an added advantage.
• Strong expertise in Bash scripting, Python, or equivalent.
• Strong grasp of automation tools and the ability to develop them as needed.
• Experience with continuous integration and continuous deployment (CI/CD) and associated automation.