DevOps Engineer

at Wigzo Technologies by Shiprocket

Posted by Abhishek Saxena
Delhi, Gurugram
2 - 7 yrs
₹6L - ₹15L / yr
Full time
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)

We are looking for a DevOps Engineer who can deliver high-value features quickly through cross-team collaboration, bring a collaborative approach to software development, testing, and deployment, and bring teams with varying objectives together to work toward more efficient, higher-quality code releases.

Job Location: Sultanpur, New Delhi, 110030.


  • Setup and maintain DevOps tools, Cloud monitoring tools, Cloud security.
  • The DevOps engineer needs to be agile enough to wear a technical hat and manage operations simultaneously.
  • Monitor and maintain highly available systems on Kubernetes (multiple production applications).
  • Implement and manage CI/CD pipelines.
  • Implement an auto-scaling system for our Kubernetes nodes. 
  • Monitor and maintain highly available databases (Redis, MongoDB, Postgres, and Cassandra).
  • Monitor cost fluctuations and optimize.
  • Provide support to developers by assigning bugs and alerting them about failures in time.
  • Analyse architecture problems and caveats and provide precise solutions/tools for them.
  • Respond to incidents and system alerts during local business hours.
  • System security and admin credential administration. 
  • Deploy and manage AWS services (EC2, S3, VPC, Route53, Auto Scaling, etc.); handle configuration management of the full services stack.
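The Kubernetes auto-scaling mentioned above typically builds on the Horizontal Pod Autoscaler, whose documented formula is desiredReplicas = ceil(currentReplicas × currentMetric / targetMetric). A minimal sketch of that calculation in Python (the function name and bounds are illustrative, not from any particular stack):

```python
import math

def desired_replicas(current_replicas: int, current_metric: float,
                     target_metric: float, min_replicas: int = 1,
                     max_replicas: int = 10) -> int:
    """Replica count per the Kubernetes HPA scaling formula:
    desired = ceil(current * currentMetric / targetMetric),
    clamped to the configured min/max bounds."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))

# e.g. 4 pods at 90% average CPU against a 60% target -> scale out to 6
print(desired_replicas(4, 90, 60))
```

In practice the calculation is done by the HPA controller itself; the sketch only shows why a sustained metric above target drives the replica count up.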

The DevOps Engineer must have the following skills:

  • Core skill set: DevSecOps, cloud-native deployments, deployments using Docker/Kubernetes with supporting non-functional components (e.g. API gateways, SSO/IAM, logging/monitoring, load balancers, firewalls), and deployments in on-premise environments using modern, cloud-portable approaches.
  • Minimum 2+ years of relevant experience primarily in DevOps and cloud computing.
  • Prior experience working with AWS (EKS, Lambda) or other cloud platforms like GCP and Azure.
  • Intermediate knowledge of containers, Docker, and orchestration.
  • Hands-on experience with Kubernetes.
  • Experience with CI/CD platforms like GitHub Actions, Jenkins, Travis CI, etc.
  • Experienced in logging and monitoring of cloud resources with EFK, Prometheus, and Grafana.
  • Good command of OS fundamentals (Linux) and networking skills.

Learn about our Culture:

Wigzo is a culture-driven company powered by its employees, their vision, and their inspiration. All the employees live by the culture and values that define us. We value people for their talent, personality, competency, and ability to learn and grow.

We create a work environment that allows people to thrive and show their best performance. We believe in meritocracy. We take pride in our diversity and strive to embrace diverse voices and create an inclusive workplace.

To know more, please visit: 

About Wigzo Technologies by Shiprocket:

Wigzo is an e-commerce marketing automation platform in which Shiprocket has acquired a majority stake. Together, we help businesses of all sizes delve deeper into data to unleash possibilities to enhance sales and income. Wigzo enables e-commerce firms to personalize each customer interaction, resulting in increased engagement, retention, loyalty, and lifetime value.

An Omnichannel marketing automation suite, Wigzo enables you to understand your customers/visitors more intelligently so you market to them what they want and not what you have. It works on real-time customer insights with real-time communication and a personalization engine that helps marketers manage basic communication and also provides dynamic email, personalized notifications, user retention, real-time engagement, real-time content, and much more.

Wigzo serves 1,000+ customers globally and has been in the industry for 6+ years, with 100+ e-commerce brands working with us, resulting in 15x business growth. For more information, please visit our website:

About Wigzo Technologies by Shiprocket

Wigzo is an AI-driven multi-channel marketing automation platform for growing e-commerce brands that helps you acquire, engage, and retain customers and improve customer lifetime value at scale.
100-500 employees
Raised funding
Similar jobs

DevOps Engineer (Azure DevOps)

at SkyPoint Cloud

Founded 2019  •  Product  •  20-100 employees  •  Profitable
Windows Azure
Microsoft Windows Azure
Windows PowerShell
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹23L / yr

Who we are:
Our mission is to build and drive, with excellence, a Customer Data Platform (CDP) that champions privacy and compliance through connected, privacy-first customer experiences. At SkyPoint Cloud, we bring people and data together with excellence in data protection, syncing your valuable data onto a single platform. As we watch more and more companies leverage our technology to become truly data-driven, we’re proud not only of our growth but also of the fact that we’ve grown without compromising our core values.
We follow a flexible culture founded on awareness, trust, collaboration, ethics, a strong outlook towards commitment, & customer fascination which are the building pillars of SkyPoint Cloud.
We believe in practicing the Ideal Behaviour at SkyPoint Cloud: treating human assets fairly, a fun work environment, the 4 E's (Embrace, Engage, Encourage & Empower), open communication, curiosity & passion.
SkyPoint is an industry-leading customer data platform, zero trust data vault, data privacy and insights solution for consumer and healthcare brands.
Our platform enables organizations to take control of their customer data, deliver unmatched customer experiences and build brand loyalty.
Industry leaders and over 6 million end-users currently use SkyPoint.
SkyPoint is an Artificial Intelligence (AI) driven Customer Data Platform (CDP). We build SaaS products to deliver personal relationships with your customers in the healthcare and e-commerce industries.
Employees are driven to make an impact, offer a unique value, and most importantly, be part of a winning team. You have a great passion for customer service, and you are driven by your energy to identify and resolve complex problems and enjoy helping others. If you are eager to prove your strong problem-solving abilities in a fast-paced environment, and you are willing to learn, then this position offers numerous growth opportunities and a long-lasting career with SkyPoint. We are looking for top-notch, motivated engineers to come to join our growing team at SkyPoint.
Who we want:
SkyPoint is looking for ambitious, independent engineers who want to have a big impact at a fast-growing company. You will work on our core data pipeline and the integrations that bring in data from many sources we support. We are looking for people who can understand the key values that make our product great and implement those values in the many small decisions you make every day as a developer.
 Primary Duties & Responsibilities:
  • Implement tooling and process to manage the migration of all systems changes (code & configuration) through various environments (Dev, Test, UAT, PROD, etc.) of the Skypoint platform.
  • Infrastructure as Code experience: ARM (Azure Resource Manager), Terraform, or Pulumi.
  • Having Azure DevOps experience: practical experience building Build/Release pipelines, CI/CD, and integration automation test suites in CI pipelines.
  • Git experience: practical experience with Git workflows (commits, branches, pull requests).
  • Networking knowledge: appreciation of Azure VNets/Private Links/Front Door.
  • Strong affinity with software development (we are not looking for a software developer, but someone who understands the domain of software development).
  • Troubleshoot, reproduce and solve challenging operational issues in a complex cloud environment, involving our load balancing platforms, and interacting with multiple microservices across our infrastructure.
  • Implement tooling and process to manage regular data backups, logs processing, and access control.
  • Manage ongoing maintenance activities such as certificate renewals, outage communications, and sandbox environment refreshes.
  • Develop tools and procedures to support security and access control automation (provisioning & controls) in Microsoft Azure environments.
  • Implement tooling and process to automate infrastructure setup and management across all our platforms.
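For the regular data backups mentioned above, a small retention-pruning helper is typical of this kind of tooling. A hedged sketch in Python (the `*.tar.gz` layout and retention count are assumptions, not SkyPoint specifics):

```python
from pathlib import Path

def prune_backups(backup_dir: str, keep: int = 7) -> list[str]:
    """Delete all but the `keep` newest backup archives, newest judged
    by modification time; returns the file names that were removed."""
    backups = sorted(Path(backup_dir).glob("*.tar.gz"),
                     key=lambda p: p.stat().st_mtime, reverse=True)
    removed = []
    for old in backups[keep:]:
        old.unlink()
        removed.append(old.name)
    return removed
```

A real version would likely run from a scheduled pipeline and verify each archive before deleting older ones; the sketch only shows the retention logic.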
Skills & Experience Required:
  • A bachelor’s or master’s degree in computer science or software systems with 5+ years of relevant experience.
  • Overall 5+ years of experience with 3+ relevant exp as a DevOps Engineer.
  • 2+ years of experience in Azure DevOps, Azure Pipelines, APIM, and ADF/Azure Databricks.
  • Industry certifications such as CISSP or Microsoft AZ-400 (Microsoft Azure DevOps).
  • A passion for automation of all aspects of software development, DevOps tools, and maintenance.
  • Extensive experience in a DevOps team, supporting CI/CD workloads, configuration management software with tools like Ansible, Puppet, Chef, Jenkins, Docker, Azure Kubernetes, etc.
  • At least 3-5 years of working experience in designing and implementing automated solutions to enable the management, and administration of Microsoft cloud infrastructure (Azure) with expert knowledge of Microsoft Azure technologies.
  • Experience supporting the following technology stack and services (Virtual Machines, Kubernetes Services (AKS), Container Instances, Terraform, Ansible, Docker, HAProxy, Nginx, ELB/ALB, ELK, Grafana, ECS/EKS/Kubernetes, Fluentd, Elasticsearch) is a plus.
  • Programming/scripting experience with Python, C#, shell scripting or Bash is a must.
  • Experience with some aspect(s) of computer security: network security, application security, security protocols, cryptography, etc is a big plus.
  • Experience with automation of log pooling, rotation, scrubbing & analysis is a plus.
  • Experience with service monitoring tools such as statsd, collectd, the ELK stack, or similar 3rd-party monitoring services is a plus.
  • Strong verbal and written communication skills with the ability to work effectively on shared projects with program managers, developers, and testers.
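As an illustration of the log scrubbing listed above, a minimal Python sketch (the patterns are assumptions for illustration; production scrubbers use site-specific rules and far more exhaustive patterns):

```python
import re

# Illustrative patterns only: mask email addresses and anything that
# looks like a bearer token before log lines leave the host.
SCRUB_RULES = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email>"),
    (re.compile(r"(?i)bearer\s+[a-z0-9._-]+"), "Bearer <redacted>"),
]

def scrub(line: str) -> str:
    for pattern, replacement in SCRUB_RULES:
        line = pattern.sub(replacement, line)
    return line

print(scrub("user=jane@example.com auth=Bearer abc.123"))
```

Running a scrubber like this in the log pipeline, before shipping to ELK, keeps secrets out of the searchable index in the first place.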
 Perks of working with us:
  • Professional development and training opportunities.
  • Company happy hours and fun team building activities.
  • Flexible work hours, plus the benefit of working from home.
  • Add-On Internet reimbursement within the company's permissible limits.
  • Opportunity to work with a US-based SaaS start-up on new tech stacks.
  • Meal cards and gift hampers
  • Competitive total compensation package (Salary + Bonus + Equity)
Job posted by
Pooja Singh

Devops Developer

at IT Service/Product Company

Agency job
via OfficeDay Innovation
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
2 - 5 yrs
₹3L - ₹10L / yr


Good hands-on experience with DevOps, AWS administration, Terraform, and Infrastructure as Code
Knowledge of EC2, Lambda, S3, ELB, VPC, IAM, CloudWatch, CentOS, and server hardening
Ability to understand business requirements and translate them into technical requirements
A knack for benchmarking and optimization

Job posted by

DevOps Engineer

at Dhwani Rural Information Systems

Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
AWS CloudFormation
Amazon EC2
Amazon S3
2 - 6 yrs
₹4L - ₹10L / yr
Job Overview
We are looking for an experienced DevOps practitioner. Be a part of a vibrant, rapidly growing tech enterprise with a great working environment. As a DevOps Engineer, you will be responsible for managing and building upon the infrastructure that supports our data intelligence platform. You'll also be involved in building tools and establishing processes that empower developers to deploy and release their code seamlessly.

The ideal DevOps Engineer possesses a solid understanding of system internals and distributed systems.
  • Understanding of accessibility and security compliance (depending on the specific project).
  • User authentication and authorization between multiple systems, servers, and environments.
  • Integration of multiple data sources and databases into one system.
  • Understanding of the fundamental design principles behind a scalable application.
  • Configuration management tools (Ansible/Chef/Puppet), cloud service providers (AWS/DigitalOcean), and the Docker + Kubernetes ecosystem are a plus.
  • Should be able to make key decisions for our infrastructure, networking, and security.
  • Manipulation of shell scripts during migration and DB connection.
  • Monitor production server health across parameters (CPU load, physical memory, swap memory) and set up monitoring tools such as Nagios.
  • Create alerts and configure monitoring of specified metrics to manage the cloud infrastructure efficiently.
  • Set up/manage VPCs and subnets, make connections between different zones, and block suspicious IPs/subnets via ACLs.
  • Create/manage AMIs, snapshots, and volumes; upgrade/downgrade AWS resources (CPU, memory, EBS).
  • Manage microservices at scale and maintain the compute and storage infrastructure for various product teams.

  • Strong knowledge of configuration management tools like Ansible, Chef, and Puppet.
  • Extensive experience with change-tracking tools like JIRA and with log analysis; maintain documentation of production server error logs.
  • Experienced in troubleshooting, backup, and recovery.
  • Excellent knowledge of cloud service providers like AWS and DigitalOcean.
  • Good knowledge of the Docker and Kubernetes ecosystem.
  • Proficient understanding of code versioning tools, such as Git.
  • Must have experience working in an automated environment.
  • Good knowledge of AWS services such as Amazon EC2, Amazon S3 (Amazon Glacier), Amazon VPC, and Amazon CloudWatch.
  • Scheduling jobs using crontab; creating swap memory.
  • Proficient knowledge of access management (IAM).
  • Must have expertise in Maven, Jenkins, Chef, SVN, GitHub, Tomcat, Linux, etc.
  • Candidates should have good knowledge of GCP.

B.Tech (IT)/M.Tech/MBA (IT)/BCA/MCA or any degree in a relevant field
Job posted by
Sunandan Madan

Technical Lead - DevOps

at Srijan Technologies

Founded 2002  •  Products & Services  •  100-1000 employees  •  Profitable
Amazon Web Services (AWS)
Windows Azure
Remote only
4 - 12 yrs
₹18L - ₹25L / yr

Srijan Technologies is hiring for the DevOps Lead position- Cloud Team with a permanent WFH option.

Immediate Joiners or candidates with 30 days notice period are preferred.


  • Minimum 4-6 years of experience in DevOps release engineering. 
  • Expert-level knowledge of Git. 
  • Must have great command over Kubernetes
  • Certified Kubernetes Administrator
  • Expert-level knowledge of Shell Scripting & Jenkins so as to maintain continuous integration/deployment infrastructure. 
  • Expert level of knowledge in Docker. 
  • Expert level of Knowledge in configuration management and provisioning toolchain; At least one of Ansible / Chef / Puppet. 
  • Basic level of web development experience and setup: Apache, Nginx, MySQL 
  • Basic level of familiarity with Agile/Scrum process and JIRA. 
  • Expert level of Knowledge in AWS Cloud Services.
Job posted by
Adyasha Satpathy

DevOps Engineer

at Innovatily

Founded 2014  •  Services  •  20-100 employees  •  Profitable
Amazon Web Services (AWS)
Bengaluru (Bangalore)
4 - 10 yrs
₹10L - ₹20L / yr

Experience and Education
• Bachelor’s degree in engineering or equivalent.

Work experience
• 4+ years of infrastructure and operations management experience at a global scale.

• 4+ years of experience in operations management, including monitoring, configuration management, automation, backup, and recovery.

• Broad experience in the data center, networking, storage, server, Linux, and cloud technologies.

• Broad knowledge of release engineering: build, integration, deployment, and provisioning, including familiarity with different upgrade models.

• Demonstrable experience with executing, or being involved in, a complete end-to-end project lifecycle.

• Excellent communication and teamwork skills – both oral and written.

• Skilled at collaborating effectively with both Operations and Engineering teams.

• Process and documentation oriented.

• Attention to detail. Excellent problem-solving skills.

• Ability to simplify complex situations and lead calmly through periods of crisis.

• Experience implementing and optimizing operational processes.

• Ability to lead small teams: provide technical direction, prioritize tasks to achieve goals, identify dependencies, report on progress.

Technical Skills
• Strong fluency in Linux environments is a must.

• Good SQL skills.
• Demonstrable scripting/programming skills (Bash, Python, Ruby, or Go) and the ability to develop custom tool integrations between multiple systems using their published APIs/CLIs.

• L3, load balancer, routing, and VPN configuration.

• Kubernetes configuration and management.

• Expertise using version control systems such as Git.

• Configuration and maintenance of database technologies such as Cassandra, MariaDB, Elastic.

• Designing and configuration of open-source monitoring systems such as Nagios, Grafana, or Prometheus.

• Designing and configuration of log pipeline technologies such as ELK (Elastic Search Logstash Kibana), FluentD, GROK, rsyslog, Google Stackdriver.

• Using and writing modules for Infrastructure as Code tools such as Ansible, Terraform, Helm, and Kustomize.

• Strong understanding of virtualization and containerization technologies such as VMware, Docker, and Kubernetes.

• Specific experience with Google Cloud Platform or Amazon EC2 deployments and virtual machines.
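The custom tool integrations against published APIs mentioned above often come down to walking a paginated endpoint and merging the results. A hedged Python sketch with the HTTP call stubbed out (the page-number scheme is an assumption; real APIs may use cursors or Link headers instead):

```python
import json
from typing import Callable

def fetch_all(fetch_page: Callable[[int], str]) -> list[dict]:
    """Collect every item from a page-numbered JSON endpoint by
    calling `fetch_page(page)` until a page comes back empty."""
    items, page = [], 1
    while True:
        batch = json.loads(fetch_page(page))
        if not batch:
            return items
        items.extend(batch)
        page += 1

# Stand-in for the real HTTP call to a published API:
pages = {1: '[{"id": 1}, {"id": 2}]', 2: '[{"id": 3}]'}
print(fetch_all(lambda p: pages.get(p, "[]")))
```

Injecting the fetch function keeps the pagination logic testable without network access; the real integration would swap in an authenticated HTTP client.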

Job posted by
Jayasri S

Devops Engineer

at Certa

Founded 2018  •  Products & Services  •  100-1000 employees  •  Raised funding
Amazon Web Services (AWS)
Go Programming (Golang)
Shell Scripting
Design thinking
Infrastructure architecture
Remote only
1 - 5 yrs
₹8L - ₹12L / yr

Certa is a Silicon Valley-based tech product start-up that is automating the vendor, supplier, and other stakeholder onboarding processes (think background checks, agreements, and the works) for companies across industries and geographies. With multiple Fortune-500 and Fortune-1000 clients, at Certa's tech team, you will be working on stuff that is changing the way huge companies do business.


The DevOps engineers will work within an agile team of Engineers and Operations personnel building highly resilient, scalable and performant AWS infrastructure in an automated and efficient manner. The DevOps engineers will work alongside the Application DevOps teams and cross-functional IT teams. The engineers will be required to use their initiative to innovate to achieve maximum performance and be prepared to investigate and use new products/services offered by AWS.

Key Accountabilities

  1. Build and manage the AWS foundation platform to enable application deployments
  2. Monitor Infra Performance, build monitors and alerts.
  3. Engineer solutions on AWS foundation platform using Infrastructure As Code methods (e.g. Terraform)
  4. Integrate, configure, deploy and manage centrally provided common cloud services (e.g. IAM, networking, logging, Operating systems, Containers)
  5. Ensure compliance with centrally defined Security Standards
  6. Ensure compliance with operational risk standards (e.g. network, firewall, OS, logging, monitoring, availability, resiliency)
  7. Build and support continuous integration (CI), continuous delivery (CD) and continuous testing activities
  8. Engineering activities to implement patches provided centrally
  9. Update support and operational documentation as required
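The monitors and alerts in point 2 above often reduce to a rolling-average threshold check. A minimal Python sketch (the window and threshold values are illustrative; real setups would express this as CloudWatch or Prometheus alert rules rather than hand-rolled code):

```python
from collections import deque

class CpuAlert:
    """Fire when the rolling average of the last `window` CPU samples
    crosses `threshold` percent."""
    def __init__(self, window: int = 5, threshold: float = 80.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, cpu_percent: float) -> bool:
        self.samples.append(cpu_percent)
        avg = sum(self.samples) / len(self.samples)
        return avg > self.threshold

alert = CpuAlert(window=3, threshold=80)
for cpu in (70, 85, 95):
    print(alert.observe(cpu))   # fires only once the average exceeds 80
```

Averaging over a window rather than alerting on single samples is the usual way to avoid paging on momentary spikes.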

Qualifications And Experience

• 1+ years of experience working in DevOps.

• Experience building a range of services in AWS, including EC2, S3, VPC, Lambda, RDS, Fargate/ECS, and Aurora Serverless

• Expert understanding of DevOps principles and Infrastructure as Code concepts and techniques

• Strong understanding of CI/CD and available tools

• Security and Compliance, e.g. IAM and cloud compliance/auditing/monitoring tools

• Customer/stakeholder focus. Ability to build strong relationships with Application teams, cross functional IT and global/local IT teams

• Good leadership and teamwork skills - Works collaboratively in an agile environment with DevOps application ‘pods’ to provide AWS specific capability/skills required to deliver the service.

• Operational effectiveness - delivers solutions that align to approved design patterns and security standards

• Risk management effectiveness

• Excellent skills in at least one of the following: Python, Go, etc.

• Experienced in full automation and configuration management

• A successful track record of delivering complex projects and/or programs, utilizing appropriate techniques and tools to ensure and measure success

• A comprehensive understanding of risk management and proven experience of ensuring own/others’ compliance with relevant regulatory processes

Job posted by
Sanyam Bansal


at W3 Tech Solutions

Founded 2017  •  Products & Services  •  0-20 employees  •  Profitable
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
Remote, Mumbai, Nala Sopara
1 - 3 yrs
₹2L - ₹6L / yr
Flutter Developer with a minimum of 1 year of experience
Knowledge of PHP will be preferred
May get a chance to handle a team
Job posted by
Karunesh Mishra

Senior DevOps Engineer

at Hypersonix Inc

Founded 2018  •  Product  •  100-500 employees  •  Profitable
Amazon Web Services (AWS)
Amazon EC2
Amazon Redshift
Elastic Search
Google Cloud Platform (GCP)
Remote, Bengaluru (Bangalore)
6 - 10 yrs
₹15L - ₹45L / yr
Roles & Responsibilities
  • Manage systems on AWS infrastructure including application servers, database servers
  • Proficiency with EC2, Redshift, RDS, Elasticsearch, MongoDB and other AWS services.
  • Proficiency with managing a distributed service architecture with multiple microservices - including maintaining dev, QA, staging and production environments, managing zero-downtime releases, ensuring failure rollbacks with zero-downtime and scaling on-demand
  • Containerization of workloads and rapid deployment
  • Driving cost optimization while balancing performance
  • Manage high availability of existing systems and proactively address system maintenance issues
  • Manage AWS infrastructure configurations along with Application Load Balancers, HTTPS configurations, and network configurations (VPCs)
  • Work with the software engineering team to automate code deployment
  • Build and maintain tools for deployment, monitoring, and operations, and troubleshoot and resolve issues in our dev, test, and production environments.
  • Familiarity with managing Spark based ETL pipelines a plus
  • Experience in managing a team of DevOps Engineers
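The zero-downtime release-and-rollback flow described above can be sketched as pure control flow, with the deploy tooling stubbed out (everything here is a simulation for illustration, not a real deployment tool):

```python
def release(deploy, health_check, rollback):
    """Deploy a new version, verify it, and roll back on failure.
    The three callables stand in for real deploy/monitoring tooling."""
    deploy()
    if health_check():
        return "released"
    rollback()
    return "rolled-back"

# Simulated run where the new version fails its health check:
log = []
result = release(deploy=lambda: log.append("deploy"),
                 health_check=lambda: False,
                 rollback=lambda: log.append("rollback"))
print(result, log)
```

In a real blue-green or rolling deployment, `health_check` would probe the new version behind the load balancer before traffic is shifted, so a failed check rolls back without user-visible downtime.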
Required Qualifications
  • Bachelor's or Master's degree in a quantitative field
  • Cloud computing experience, Amazon Web Services (AWS). Bonus if you've worked on Azure, GCP and on cost optimization.
  • Prior experience in working on a distributed microservices architecture and containerization.
  • Strong background in Linux/Windows administration and scripting
  • Experience with CI/CD pipelines, Git, deployment configuration and monitoring tools
  • Working understanding of various components for Web Architecture
  • A working understanding of code and script (Javascript, Angular, Python)
  • Excellent communication, problem-solving, and troubleshooting skills.
Job posted by
Manu Panwar

DevOps Engineer

at IVentors Initiatives

Founded 2019  •  Product  •  20-100 employees  •  Raised funding
Amazon Web Services (AWS)
MERN Stack
NodeJS (Node.js)
Remote only
1 - 2 yrs
₹15,000 - ₹25,000 / mo
Selected intern's day-to-day responsibilities include:

1. Developing a video player website where students can learn various courses, view e-books, solve tests, etc.
2. Building the product to reach higher scalability
3. Developing software to integrate with internal back-end systems
4. Working on AWS cloud platform
5. Working on Amazon Ec2, Amazon S3 bucket, and Git
6. Working on the implementation of continuous integration and deployment pipelines using Jenkins (mandatory)
7. Monitoring, troubleshooting, and diagnosing infrastructure systems (excellent knowledge required for the same)
8. Building tools to reduce the occurrences of errors and improve customer experience
9. Should have experience in MERN Stack too.
Job posted by
Tarun Bharadwaj

DevOps Solution Architect

at Tetrasoft inc

Founded 1997  •  Products & Services  •  100-1000 employees  •  Profitable
Object Oriented Programming (OOPs)
Amazon Web Services (AWS)
Microsoft Windows Azure
8 - 12 yrs
Best in industry
Tetrasoft is hiring a DevOps Architect!
Below are the job details:
Role: DevOps Architect
Experience Level: 8-12 years
Job Location: Hyderabad

Key Responsibilities:

Look through the various DevOps tools/technologies, identify their strengths, and provide direction to the DevOps automation team

Bring an out-of-the-box thought process to the DevOps automation platform implementation

Explore various tools and technologies and run POCs on integrating these tools

Evaluate backend APIs for various DevOps tools

Perform code reviews, keeping RASUI in context

Mentor the team on various E2E integrations

Act as a liaison in evangelizing the automation solution currently implemented

Bring in various DevOps best practices/principles and participate in their adoption with various app teams

Must have:

Should possess a Bachelor's/Master's in computer science with a minimum of 8+ years of experience

Should possess a minimum of 3 years of strong experience in DevOps

Should possess expertise in using various DevOps tools, libraries, and APIs (Jenkins/JIRA/AWX/Nexus/GitHub/BitBucket/SonarQube)

Should possess expertise in optimizing the DevOps stack (Containers/Kubernetes/Monitoring)

2+ years of experience in creating solutions and translating them for the development team

Should have a strong understanding of OOP and SDLC (Agile/SAFe standards)

Proficient in Python, with good knowledge of its ecosystem (IDEs and frameworks)

Proficient in various cloud platforms (Azure/AWS/Google Cloud Platform)

Proficient in various DevOps offerings (Pivotal/OpenStack/Azure DevOps)

Talent acquisition team
Tetrasoft India
Stay home and Stay safe
Job posted by
Chandrika Yaminedu