
DevOps Engineer
- Job Title:- Backend/DevOps Engineer
- Job Location:- Opp. Sola over bridge, Ahmedabad
- Education:- B.E./ B. Tech./ M.E./ M. Tech/ MCA
- Number of Vacancy:- 03
- 5 Days working
- Notice Period:- Can join within a month
- Job Timing:- 10am to 7:30pm.
About the Role
Are you a server-side developer with a keen interest in reliable solutions?
Is Python your language?
Do you want a challenging role that goes beyond backend development and includes infrastructure and operations problems?
If you answered yes to all of the above, you should join our fast growing team!
We are looking for 3 experienced Backend/DevOps Engineers who will focus on backend development in Python and will be working on reliability, efficiency and scalability of our systems. As a member of our small team you will have a lot of independence and responsibilities.
As Backend/DevOps Engineer you will...:-
- Design and maintain systems that are robust, flexible and performant
- Be responsible for building complex, high-scale systems
- Prototype new gameplay ideas and concepts
- Develop server tools for game features and live operations
- Be one of three backend engineers on our small and fast moving team
- Work alongside our C++, Android, and iOS developers
- Contribute to ideas and design for new features
To be successful in this role, we'd expect you to…:-
- Have 3+ years of experience in Python development
- Be familiar with common database access patterns
- Have experience designing systems, monitoring metrics, and interpreting graphs.
- Have knowledge of AWS, Kubernetes and Docker.
- Be able to work well in a remote development environment.
- Be able to communicate in English at a native level, both in speaking and writing.
- Be responsible to your fellow remote team members.
- Be highly communicative and go out of your way to contribute to the team and help others
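As a minimal sketch of the "common database access patterns" requirement above, here is a parameterized-query example using Python's built-in sqlite3 module. The table and function names are illustrative only, not from the posting:

```python
import sqlite3

def get_users_by_status(conn: sqlite3.Connection, status: str) -> list:
    # Parameterized query: the "?" placeholder lets the driver bind the
    # value safely, preventing SQL injection from string formatting.
    cur = conn.execute(
        "SELECT id, name FROM users WHERE status = ? ORDER BY id", (status,)
    )
    return cur.fetchall()

# In-memory database seeded with sample rows for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO users (name, status) VALUES (?, ?)",
    [("asha", "active"), ("ravi", "inactive"), ("meena", "active")],
)
print(get_users_by_status(conn, "active"))  # [(1, 'asha'), (3, 'meena')]
```

The same placeholder pattern carries over to Postgres or MySQL drivers, which is the kind of transferable habit the bullet is asking for.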

Similar jobs
Role: Senior Platform Engineer (GCP Cloud)
Experience Level: 3 to 6 Years
Work location: Mumbai
Mode : Hybrid
Role & Responsibilities:
- Build automation software for cloud platforms and applications
- Drive Infrastructure as Code (IaC) adoption
- Design self-service, self-healing monitoring and alerting tools
- Automate CI/CD pipelines (Git, Jenkins, SonarQube, Docker)
- Build Kubernetes container platforms
- Introduce new cloud technologies for business innovation
Requirements:
- Hands-on experience with GCP Cloud
- Knowledge of cloud services (compute, storage, network, messaging)
- IaC tools experience (Terraform/CloudFormation)
- SQL & NoSQL databases (Postgres, Cassandra)
- Automation tools (Puppet/Chef/Ansible)
- Strong Linux administration skills
- Programming: Bash/Python/Java/Scala
- CI/CD pipeline expertise (Jenkins, Git, Maven)
- Multi-region deployment experience
- Agile/Scrum/DevOps methodology

JOB DETAILS:
* Job Title: Specialist I - DevOps Engineering
* Industry: Global Digital Transformation Solutions Provider
* Salary: Best in Industry
* Experience: 7-10 years
* Location: Bengaluru (Bangalore), Chennai, Hyderabad, Kochi (Cochin), Noida, Pune, Thiruvananthapuram
Job Description
Job Summary:
As a DevOps Engineer focused on Perforce to GitHub migration, you will be responsible for executing seamless and large-scale source control migrations. You must be proficient with GitHub Enterprise and Perforce, possess strong scripting skills (Python/Shell), and have a deep understanding of version control concepts.
The ideal candidate is a self-starter, a problem-solver, and thrives on challenges while ensuring smooth transitions with minimal disruption to development workflows.
Key Responsibilities:
- Analyze and prepare Perforce repositories — clean workspaces, merge streams, and remove unnecessary files.
- Handle large files efficiently using Git Large File Storage (LFS) for files exceeding GitHub’s 100MB size limit.
- Use git-p4 fusion (Python-based tool) to clone and migrate Perforce repositories incrementally, ensuring data integrity.
- Define migration scope — determine how much history to migrate and plan the repository structure.
- Manage branch renaming and repository organization for optimized post-migration workflows.
- Collaborate with development teams to determine migration points and finalize migration strategies.
- Troubleshoot issues related to file sizes, Python compatibility, network connectivity, or permissions during migration.
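The steps above could be sketched as a dry-run command planner in Python. The depot path, repository directory, and LFS patterns are placeholders, and the exact flags would depend on the tooling actually in use; building the command list without executing it lets the plan be reviewed first:

```python
def plan_migration(depot_path: str, repo_dir: str, lfs_patterns: list) -> list:
    """Build a hypothetical command sequence for a Perforce -> GitHub
    migration: clone history via git-p4, then rewrite oversized files
    into Git LFS to stay under GitHub's 100 MB per-file limit."""
    cmds = [
        # Clone the full revision history from the Perforce depot.
        ["git", "p4", "clone", depot_path + "@all", repo_dir],
        ["git", "lfs", "install"],
    ]
    for pattern in lfs_patterns:
        # Rewrite history so files matching the pattern move into LFS.
        cmds.append(["git", "lfs", "migrate", "import", "--include=" + pattern])
    # Push all branches and tags to the new GitHub remote.
    cmds.append(["git", "push", "--mirror", "origin"])
    return cmds

plan = plan_migration("//depot/game", "game-repo", ["*.bin", "*.psd"])
for cmd in plan:
    print(" ".join(cmd))
```

In a real migration each step would be run (e.g. via subprocess) with error handling, and the scope decision above would determine whether `@all` or a narrower revision range is cloned.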
Required Qualifications:
- Strong knowledge of Git/GitHub and preferably Perforce (Helix Core) — understanding of differences, workflows, and integrations.
- Hands-on experience with P4-Fusion.
- Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
- Proficiency in migration tools such as git-p4 fusion — installation, configuration, and troubleshooting.
- Ability to identify and manage large files using Git LFS to meet GitHub repository size limits.
- Strong scripting skills in Python and Shell for automating migration and restructuring tasks.
- Experience in planning and executing source control migrations — defining scope, branch mapping, history retention, and permission translation.
- Familiarity with CI/CD pipeline integration to validate workflows post-migration.
- Understanding of source code management (SCM) best practices, including version history and repository organization in GitHub.
- Excellent communication and collaboration skills for cross-team coordination and migration planning.
- Proven practical experience in repository migration, large file management, and history preservation during Perforce to GitHub transitions.
Skills: Github, Kubernetes, Perforce, Perforce (Helix Core), Devops Tools
Must-Haves
Git/GitHub (advanced), Perforce (Helix Core) (advanced), Python/Shell scripting (strong), P4-Fusion (hands-on experience), Git LFS (proficient)
Implementing various development, testing, automation tools, and IT infrastructure
Selecting and deploying appropriate CI/CD tools
Required Candidate profile
Linux; working knowledge of any web server, e.g. NGINX or Apache
We are looking for a tech enthusiast to work in a challenging environment: someone who is self-driven, proactive, and has good experience in Azure, DevOps, ASP.NET, etc. Share your resume today if this interests you.
Job Location: Pune
About us:
JetSynthesys is a leading gaming and entertainment company with a wide portfolio of world-class products, platforms, and services. The company has a robust foothold in the global cricket community through its exclusive JV with Sachin Tendulkar for the popular Sachin Saga game and its 100% ownership of Nautilus Mobile, the developer of India’s largest cricket simulation game, Real Cricket, the #1 cricket gaming franchise in the world. Among the organizations fueling the Indian esports industry, JetSynthesys was one of the earliest entrants, with a founding 50% stake in India’s largest esports company, Nodwin Gaming, which was recently funded by the popular South Korean gaming firm Krafton.
Recently, the company developed WWE Racing Showdown, a high-octane vehicular combat game born of a strategic partnership with WWE. Adding to the list is the newly launched board game Ludo Zenith, a completely reimagined ludo experience for gamers, built in partnership with the Japanese gaming giant Square Enix.
JetSynthesys Pvt. Ltd. is proud to be backed by Mr. Adar Poonawalla - Indian business tycoon and CEO of Serum Institute of India, Mr. Kris Gopalakrishnan – Co-founder of Infosys and the family offices of Jetline Group of Companies. JetSynthesys’ partnerships with large gaming companies in the US, Europe and Japan give it an opportunity to build great products not only for India but also for the world.
Responsibilities
- As a Security & Azure DevOps engineering technical specialist, you will be responsible for advising on and assisting in the architecture, design, and implementation of secure infrastructure solutions
- Should be capable of technical deep dives into infrastructure, databases, and applications, specifically in operating and supporting high-performance, highly available services and infrastructure
· Deep understanding of cloud computing technologies across Windows, with demonstrated hands-on experience in the following domains:
· Experience in building, deploying and monitoring Azure services with strong IaaS and PaaS services (Redis Cache, Service Bus, Event Hub, Cloud Service etc.)
· Understanding of API endpoint management
· Able to monitor and maintain serverless architectures using Function/Web Apps
· Log analysis capabilities using Elasticsearch and Kibana dashboards
· Azure Core Platform: Compute, Storage, Networking
· Data Platform: SQL, Cosmos DB, MongoDB, and JQL queries
· Identity and Authentication: SSO federation, AD/Azure AD, etc.
· Experience with Azure Storage, Backup and Express Route
· Hands-on experience in ARM templates
· Ability to write PowerShell & Python scripts to automate IT Operations
· Working Knowledge of Azure OMS and Configuration of OMS Dashboards is desired
· VSTS Deployments
· You will help stabilize developed solutions by understanding the relevant application development, infrastructure, and operations implications.
· Use of DevOps tools to deliver and operate end-user services a plus (e.g., Chef, New Relic, Puppet, etc.)
· Able to deploy and re-create resources
· Building Terraform configurations for cloud infrastructure
· Following ITIL/ITSM standards as best practice
· PEN testing and OWASP security testing will be an added bonus
· Load and performance testing using JMeter
· Daily capacity reviews
· Handling recurring issues and resolving them with Ansible automation
· Jenkins pipelines for CI/CD
· Code review using SonarQube
· Regular cost analysis to keep systems and resources optimal
· Experience in ASP.NET, C#, and .NET programming is a must.
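As a small illustration of the log-analysis side of this role (the Elasticsearch/Kibana bullet above), a script might pre-aggregate error counts from application logs before indexing them. The `"LEVEL message"` log format here is an assumption for the sketch:

```python
from collections import Counter

def count_by_level(log_lines: list) -> Counter:
    """Tally log lines by severity level, assuming each line starts
    with its level, e.g. 'ERROR redis connection refused'."""
    levels = Counter()
    for line in log_lines:
        parts = line.split(maxsplit=1)
        if parts:  # skip blank lines
            levels[parts[0]] += 1
    return levels

logs = [
    "INFO cache warmed",
    "ERROR redis connection refused",
    "WARN queue depth high",
    "ERROR redis connection refused",
]
print(count_by_level(logs))  # ERROR appears twice here
```

The same aggregation could feed an alerting threshold (e.g. page when ERROR count per minute exceeds a limit), which is where the monitoring bullets above come together.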
Qualifications:
Minimum graduate (Preferred Stream: IT/Technical)
Job Description
Please connect on LinkedIn or share your resume with Shrashti Jain.
• 8+ years of overall experience, with at least 4+ years of relevant experience (DevOps experience should account for the greater share of the overall experience)
• Experience with Kubernetes and other container management solutions
• Should have hands on and good understanding on DevOps tools and automation framework
• Demonstrated hands-on experience with DevOps techniques building continuous integration solutions using Jenkins, Docker, Git, Maven
• Experience with n-tier web application development and experience in J2EE / .Net based frameworks
• Look for ways to improve: Security, Reliability, Diagnostics, and costs
• Knowledge of security, networking, DNS, firewalls, WAF etc
• Familiarity with Helm and Terraform for provisioning GKE; Bash/shell scripting
• Must be proficient in one or more scripting languages: Unix Shell, Perl, Python
• Knowledge and experience with Linux OS
• Should have working experience with monitoring tools like Datadog, ELK, and/or Splunk, or other monitoring tools/processes
• Experience working in Agile environments
• Ability to handle multiple competing priorities in a fast-paced environment
• Strong Automation and Problem-solving skills and ability
• Experience implementing and supporting AWS-based instances and services (e.g. EC2, S3, EBS, ELB, RDS, IAM, Route53, CloudFront, ElastiCache).
• Very strong hands-on experience with automation tools such as Terraform
• Good experience with provisioning tools such as Ansible, Chef
• Experience with CI/CD tools such as Jenkins.
• Experience managing production environments.
• Good understanding of security in IT and the cloud
• Good knowledge of TCP/IP
• Good Experience with Linux, networking and generic system operations tools
• Experience with Clojure and/or the JVM
• Understanding of security concepts
• Familiarity with blockchain technology, in particular Tendermint
DevOps Engineer
Notice Period: 45 days / Immediate Joining
Banyan Data Services (BDS) is a US-based infrastructure services company, headquartered in San Jose, California, USA. It provides full-stack managed services to support business applications and data infrastructure. We provide data solutions and services on bare metal, on-prem, and all cloud platforms. Our engagement services are built on standard DevOps practices and the SRE model.
We are looking for a DevOps Engineer to help us build functional systems that improve the customer experience. We offer you an opportunity to join our rocket-ship startup, run by a world-class executive team. We are looking for candidates who aspire to be part of the cutting-edge solutions and services we offer to address next-gen data evolution challenges, and who are willing to apply their experience in areas directly related to Infrastructure Services, Software as a Service, and Cloud Services to create a niche in the market.
Key Qualifications
· 4+ years of experience as a DevOps Engineer with monitoring, troubleshooting, and diagnosing infrastructure systems.
· Experience in implementation of continuous integration and deployment pipelines using Jenkins, JIRA, JFrog, etc
· Strong experience in Linux/Unix administration.
· Experience with automation/configuration management using Puppet, Chef, Ansible, Terraform, or other similar tools.
· Expertise in multiple coding and scripting languages including Shell, Python, and Perl
· Hands-on exposure to modern IT infrastructure (e.g. Docker Swarm/Mesos/Kubernetes/OpenStack)
· Exposure to any relational database technology (MySQL/Postgres/Oracle) or any NoSQL database
· Worked on open-source tools for logging, monitoring, search engine, caching, etc.
· Professional Certificates in AWS or any other cloud is preferable
· Excellent problem solving and troubleshooting skills
· Must have good written and verbal communication skills
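A tiny example of the monitoring-and-troubleshooting mindset in the qualifications above: retrying a flaky health check with exponential backoff. The probe callable and the retry/cap values are illustrative assumptions, not anything from the posting:

```python
import time

def backoff_delays(retries: int, base: float = 1.0, cap: float = 30.0) -> list:
    # Exponential backoff schedule: base, 2*base, 4*base, ...
    # capped at `cap` seconds so waits never grow unbounded.
    return [min(base * (2 ** i), cap) for i in range(retries)]

def check_with_retries(probe, retries: int = 5) -> bool:
    """Run `probe` (a zero-argument callable returning bool) until it
    succeeds, sleeping per the backoff schedule between attempts."""
    for delay in backoff_delays(retries):
        if probe():
            return True
        time.sleep(delay)
    return False

print(backoff_delays(6))  # [1.0, 2.0, 4.0, 8.0, 16.0, 30.0]
```

In practice `probe` would wrap an HTTP or TCP check against the service, and the capped backoff prevents a recovering service from being hammered by retries.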
Key Responsibilities
Ambitious individuals who can work under their own direction towards agreed targets/goals.
Must be flexible with office timings to accommodate multi-national client time zones.
Will be involved in solution design from the conceptual stages through the development cycle and deployments.
Be involved in development operations and support internal teams.
Improve infrastructure uptime, performance, resilience, and reliability through automation.
Willing to learn new technologies and work on research-orientated projects
Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
Scope and deliver solutions with the ability to design solutions independently based on high-level architecture.
Independent thinking, ability to work in a fast-paced environment with creativity and brainstorming
www.banyandata.com

- AWS Cloud, CI/CD, serverless setups, monitoring setup
- Performance setup, hands-on scalability experience, Linux expertise, DevOps operations