Senior DevOps Engineer

at a platform powered by machine learning (TE1)

Agency job
Bengaluru (Bangalore)
4 - 6 yrs
₹12L - ₹20L / yr
Full time
Skills
Kubernetes
Docker
Amazon Web Services (AWS)
Azure
  • Deploy the company application on customers' public clouds and on-premise data centers
  • Build Kubernetes-based workflows for a wide variety of use cases (see the sketch after this list)
  • Document and automate the deployment process for internal and external deployments
  • Interact with customers over calls for deployment and debugging
  • Deployment and product support
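As an illustration of the kind of Kubernetes-based automation this role involves (not part of the original posting), here is a minimal Python sketch that checks whether a deployment has fully rolled out, using the official Kubernetes Python client. The deployment and namespace names are placeholders.

```python
# Illustrative sketch only: verify that a Kubernetes deployment has rolled out,
# using the official Kubernetes Python client (pip install kubernetes).
# "example-app" and "customer-ns" are placeholder names.
from kubernetes import client, config

def deployment_ready(name: str, namespace: str) -> bool:
    config.load_kube_config()          # use load_incluster_config() when running inside a pod
    apps = client.AppsV1Api()
    dep = apps.read_namespaced_deployment(name, namespace)
    desired = dep.spec.replicas or 0
    ready = dep.status.ready_replicas or 0
    return desired > 0 and ready == desired

if __name__ == "__main__":
    print(deployment_ready("example-app", "customer-ns"))
```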

 

Desired Skills and Experience

 

  • 4-6 years of experience in infrastructure development, or in development and operations.
  • Minimum 2+ years of experience with Docker and Kubernetes.
  • Experience working with Docker and Kubernetes; awareness of Kubernetes internals, networking, etc. Experience with Linux infrastructure tools.
  • Good interpersonal skills and communication with all levels of management.
  • Extensive experience in setting up Kubernetes on AWS, Azure, etc.

 

Good to Have

  • Familiarity with big data tools like Hadoop and Spark.
  • Experience with Java application debugging.
  • Experience with monitoring tools like Prometheus, Grafana, etc. (see the sketch below)
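For illustration only (not from the posting), a minimal Python sketch of the Prometheus/Grafana skill: exposing a custom metric with the prometheus_client library so Prometheus can scrape it and Grafana can chart it. The metric name and port are arbitrary examples.

```python
# Illustrative sketch: expose a custom gauge for Prometheus to scrape
# (pip install prometheus_client). Metric name and port are placeholders.
import random
import time

from prometheus_client import Gauge, start_http_server

DEPLOY_DURATION = Gauge("deployment_duration_seconds",
                        "Time taken by the most recent deployment run")

if __name__ == "__main__":
    start_http_server(8000)                           # metrics served at http://localhost:8000/metrics
    while True:
        DEPLOY_DURATION.set(random.uniform(30, 120))  # stand-in for a real measurement
        time.sleep(15)
```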

 


Similar jobs

Azure/AWS Cloud Architect

at a leading digital transformation services provider

Agency job
via HyrHub
Microsoft Windows Azure
Amazon Web Services (AWS)
Windows Azure
Kubernetes
IT infrastructure
Java
.NET
DynamoDB
Cloud Computing
Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Delhi
13 - 20 yrs
₹18L - ₹25L / yr
  • Overall 15+ years of experience in IT, with at least 3-5 years of work experience on AWS or Azure cloud in the capacity of an architect
  • Technically hands-on experience on AWS and Azure. Experience in PaaS services like Lambda, ECS, EKS, Fargate, Elastic Beanstalk, Step Functions, Workflow Service, SQS, SES, DynamoDB, RDS, ACS, AKS, Web Apps, Logic Apps, SQL Azure, Service Bus, Storage Queues, MySQL, and Azure Functions (see the sketch after this list)
  • Technically hands-on experience in migrating or building enterprise .NET or Java stack applications on AWS or Azure
  • Technically hands-on experience in containerizing customer .NET and/or Java applications and deploying them on managed container services in AWS or Azure
  • Technical hands-on experience in designing and building NoSQL databases like MongoDB and Cassandra
  • Experience and knowledge of the Kubernetes platform
  • Strong understanding of container technology (Docker), container architecture, and deployment patterns (single/multi-container)
  • Experience in designing and building microservices-based customer applications on the Java and/or .NET stack
  • Official Architect certification in AWS or Azure
  • Architecture development experience across business, application, data, and technology domains
  • Excellent communication and influencing skills, and experience in leading customer conversations and front-ending with customers
  • Experience of leading major strategic business and IT transformation programs
  • Ability to explain complex technical issues in a way that non-technical people can understand
  • Proven track record of mentoring the team in adopting new technologies, and ability to manage a team of architects and developers
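As a hedged illustration of the hands-on AWS familiarity listed above (not part of the posting), the sketch below enumerates EKS clusters and Lambda functions with boto3. The region is a placeholder and credentials are assumed to come from the standard AWS configuration.

```python
# Illustrative sketch: enumerate EKS clusters and Lambda functions with boto3.
# Region is a placeholder; credentials come from the usual AWS config/env chain.
import boto3

def summarize_account(region: str = "ap-south-1") -> None:
    eks = boto3.client("eks", region_name=region)
    lam = boto3.client("lambda", region_name=region)
    print("EKS clusters:", eks.list_clusters()["clusters"])
    for fn in lam.list_functions()["Functions"]:
        print("Lambda:", fn["FunctionName"], "-", fn["Runtime"])

if __name__ == "__main__":
    summarize_account()
```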

Optional skills
  • Project and program management planning and organizational skills
  • Experience in other cloud technologies like GCP, VMware Tanzu, OpenShift, and IBM Cloud
  • Experience and knowledge of hybrid cloud solutions and hybrid cloud migration and management
Job posted by
Shwetha Naik
Angular (2+)
TypeScript
Javascript
User Interface (UI) Development
RESTful APIs
Webservices
Linux/Unix
Problem solving
Analytical Skills
Git
Docker
Jenkins
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹15L / yr
❖ Manage development of one or more Bodhee Services.
❖ Perform requirement analysis by interacting with the BA, PM, and Architect.
❖ Develop high-level and detailed designs.
❖ Design, develop, test, implement, and maintain high-volume, low-latency applications for critical systems, delivering high availability and performance.
❖ Write well-designed, testable, reusable, efficient code.
❖ Conduct configuration of your own work.
❖ Review the work of other developers and provide feedback.
❖ Mentor and manage the dev team.
❖ Collaborate with the testing team for integration testing.
Compensation of up to ₹30 LPA will be provided.
Job posted by
VINOTH KUMAR
DevOps
Bash
Linux/Unix
Python
Java
Amazon Web Services (AWS)
Docker
Git
Chennai
3 - 4 yrs
₹3L - ₹7L / yr
  • 3 to 4 years of professional experience as a DevOps / Systems Engineer
  • Command-line experience with Linux, including writing Bash scripts
  • Programming in Python, Java, or similar
  • Fluent in Python and Python testing best practices
  • Extensive experience working within AWS and with its managed products (EC2, ECS, ECR, Route 53, SES, ElastiCache, RDS, VPCs, etc.)
  • Strong experience with containers (Docker, Compose, ECS) (see the sketch after this list)
  • Version control system experience (e.g., Git)
  • Networking fundamentals
  • Ability to learn and apply new technologies through self-learning
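As an illustration of the container experience above (not from the posting), here is a small Python sketch using the Docker SDK for Python to report container status, the kind of utility script this role tends to write.

```python
# Illustrative sketch (assumes the Docker SDK for Python: pip install docker):
# list all containers on the local daemon and print name, status, and image tag.
import docker

def container_report() -> None:
    client = docker.from_env()
    for c in client.containers.list(all=True):
        tag = (c.image.tags or ["<untagged>"])[0]
        print(f"{c.name:30s} {c.status:10s} {tag}")

if __name__ == "__main__":
    container_report()
```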

 

 

Responsibilities

 

 

 

  • As part of a team, implement DevOps infrastructure projects
  • Design and implement secure automation solutions for development, testing, and production environments
  • Build and deploy automation, monitoring, and analysis solutions
  • Manage our continuous integration and delivery pipeline to maximize efficiency
  • Implement industry best practices for system hardening and configuration management
  • Secure, scale, and manage Linux virtual environments
  • Develop and maintain solutions for operational administration, system/data backup, disaster recovery, and security/performance monitoring (see the sketch after this list)
  • Continuously evaluate existing systems against industry standards and make recommendations for improvement
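For illustration only, a minimal sketch of an operational backup task of the kind described above: snapshotting an EBS volume with boto3. The volume ID and region are placeholders, and retention/tagging policy would follow the team's standards.

```python
# Illustrative sketch: create an EBS snapshot as a nightly backup step.
# Volume ID and region are placeholders.
from datetime import datetime, timezone

import boto3

def snapshot_volume(volume_id: str, region: str = "ap-south-1") -> str:
    ec2 = boto3.client("ec2", region_name=region)
    description = f"nightly-backup-{datetime.now(timezone.utc):%Y-%m-%d}"
    resp = ec2.create_snapshot(VolumeId=volume_id, Description=description)
    return resp["SnapshotId"]

if __name__ == "__main__":
    print(snapshot_volume("vol-0123456789abcdef0"))
```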
Job posted by
VINOTH KUMAR

NOC Lead

at a SaaS-based product company

NOC
Computer Networking
Network operations
Amazon Web Services (AWS)
Windows Azure
Linux/Unix
Kubernetes
Apache Kafka
Bengaluru (Bangalore), Gurugram
4 - 8 yrs
₹6L - ₹15L / yr
Hi

We have an opportunity for a Lead Operations Engineer role with our client in Bangalore/Gurgaon. Sharing the JD below for your reference. Please respond if you are interested in this opportunity and we can connect accordingly.

JOB DETAILS

Shift timing: 

9.00 AM - 6.00 PM / 11.00 AM - 8.00 PM / 2.00 PM - 11.00 PM / 7.00 PM - 3.00 AM IST (night shift allowance will be provided)

Position

Lead Operations Engineer

Location

Bangalore/ Gurgaon

About Our client

Who we are :

At a time when consumers are connected and empowered like never before, our client is helping the world's largest brands provide amazing experiences at every turn. It offers a set of powerful social capabilities that allow our clients to reach, engage, and listen to customers across 24 social channels. We empower entire organizations to work together across social, marketing, advertising, research, and customer care to manage customer experience at scale. Most exciting, our client works with 50% of the Fortune 500 and nine of the world's 10 most valued brands, including McDonald's, Nestle, Nike, P&G, Shell, Samsung, and Visa.

What You'll Do

As a Lead Operations Engineer at our client, you should be passionate about working on new technologies and high-profile projects, and motivated to deliver solutions on an aggressive schedule.

Candidates from product-based companies only.

1. 5-7 years of exposure to and working knowledge of data centers, on-premise or on AWS/Azure/GCP.

2. Working experience with Jenkins, Ansible, Git, and release & deployment processes.

3. Working experience with ELK, Mongo, Kafka, and Kubernetes.

4. Implement and operate SaaS environments hosting multiple applications and provide production support.

5. Contribute to automation and provisioning of environments.

6. Strong Linux systems administration skills with RHCE/CentOS.

7. Scripting knowledge in one of the following: Python/Bash/Perl.

8. Good knowledge of Gradle, Maven, etc.

9. Should have knowledge of service monitoring via Nagios, Sensu, etc. (see the sketch after this list)

10. Good to have: knowledge of setting up and deploying application servers.

11. Mentoring team members.
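As a hedged illustration of item 9 (service monitoring), the sketch below shows a custom check in the style of a Nagios/Sensu plugin: probe an HTTP health endpoint and exit with the standard plugin codes. The URL is a placeholder.

```python
# Illustrative sketch of a Nagios/Sensu-style service check: probe a health
# endpoint and use standard plugin exit codes (0 = OK, 2 = CRITICAL).
# The URL is a placeholder.
import sys
import urllib.request

def check_http(url: str, timeout: float = 5.0) -> int:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            if resp.status == 200:
                print(f"OK - {url} returned 200")
                return 0
            print(f"CRITICAL - {url} returned {resp.status}")
            return 2
    except Exception as exc:              # timeouts, DNS failures, HTTP errors, ...
        print(f"CRITICAL - {url}: {exc}")
        return 2

if __name__ == "__main__":
    sys.exit(check_http("https://example.internal/healthz"))
```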
Job posted by
Manasa Rao

Senior Data Engineer

at CoStrategix Technologies

Founded 2006  •  Services  •  100-1000 employees  •  Profitable
Data engineering
Data Structures
Programming
Python
C#
Azure Data Factory
.NET
Data Warehouse (DWH)
Remote, Bengaluru (Bangalore)
4 - 8 yrs
₹10L - ₹28L / yr

 

Job Description - Sr Azure Data Engineer

 

 

Roles & Responsibilities:

  1. Hands-on programming in C# / .NET.
  2. Develop serverless applications using Azure Function Apps (see the sketch after this list).
  3. Write complex SQL queries, stored procedures, and views.
  4. Create data processing pipeline(s).
  5. Develop and manage large-scale data warehousing and data processing solutions.
  6. Provide clean, usable data and recommend improvements to data efficiency, quality, and integrity.
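For illustration only: a minimal HTTP-triggered function in the Azure Functions Python programming model, to show the shape of the serverless work in item 2. The role itself calls for C#/.NET, so treat this purely as a sketch; it also assumes the usual function.json HTTP binding.

```python
# Illustrative sketch only (the role uses C#/.NET; Python shown for brevity):
# an HTTP-triggered Azure Function. Assumes a standard function.json HTTP binding.
import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Received data-processing request")
    dataset = req.params.get("dataset", "unknown")     # placeholder query parameter
    return func.HttpResponse(f"Queued processing for dataset: {dataset}",
                             status_code=202)
```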

 

Skills

  1. Should have working experience with C# / .NET.
  2. Proficient in writing SQL queries, stored procedures, and views.
  3. Should have worked on the Azure cloud stack.
  4. Should have working experience in developing serverless code.
  5. Must have worked on Azure Data Factory (mandatory).

 

Experience 

  1. 4+ years of relevant experience

 

Job posted by
Jayasimha Kulkarni

Data Engineer

at Easebuzz

Founded 2016  •  Product  •  100-500 employees  •  Raised funding
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
Apache Kafka
SQL
Amazon Web Services (AWS)
Big Data
DynamoDB
MongoDB
EMR
Amazon Redshift
ETL
Data architecture
Data modeling
Pune
2 - 4 yrs
₹2L - ₹20L / yr

Company Profile:

 

Easebuzz is a payment solutions (fintech) company that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing innovative new products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a $4M fundraise in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. As an equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing
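As an illustration of real-time ingestion with Kafka (not from the posting), here is a minimal consumer sketch using the kafka-python package; the topic name and broker address are placeholders.

```python
# Illustrative sketch (pip install kafka-python): consume JSON events from a
# Kafka topic for downstream processing. Topic and broker are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payment-events",                              # placeholder topic name
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # hand off to the real-time pipeline (validate, enrich, write to a sink, ...)
    print(event.get("txn_id"), event.get("amount"))
```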

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases such as DynamoDB and MongoDB

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
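To make the pipeline work above concrete, here is a hedged PySpark sketch of one ETL step: read raw JSON events from S3, apply a light transformation, and write partitioned Parquet back to S3. Bucket names and columns are placeholders, and the S3A connector configuration is assumed to be in place.

```python
# Illustrative ETL sketch with PySpark: raw JSON from S3 -> light cleanup ->
# partitioned Parquet. Buckets, paths, and column names are placeholders;
# assumes the S3A connector is configured on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/payments/2021/*/")
clean = (raw
         .withColumn("event_date", F.to_date("created_at"))
         .filter(F.col("status").isNotNull()))

(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-bucket/payments/"))
```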

 

Employment Type

Full-time

 

Job posted by
Amala Baby

Technical Architect - MSBI

at a client company that is into computer software (EC1)

Agency job
via Multi Recruit
Microsoft Business Intelligence (MSBI)
Technical Architecture
Architecture
Software architecture
Data architecture
Business Intelligence (BI)
Azure
SSIS
Bengaluru (Bangalore)
14 - 18 yrs
₹34.8L - ₹35L / yr
  • Provide thought leadership on how best to meet business requirements using the CDM platform 
  • Help determine the architectural direction of the CDM platform and related infrastructure 
  • Collaborate with business analysts and product owners to interpret and synthesize requirements  
  • Work with other team members to determine the best technical solution to meet requirements. 
  • Keep current on industry best practices and help to incorporate relevant updates into solutions. 
  • Analyze existing architecture, identify gaps, and suggest and implement changes to bring solutions to a more stable state.
  • Act as a single point of contact for the team (including dev/QA/support), guiding it to adopt best practices across technology development.

Basic Qualifications: 

  • Bachelor’s degree in computer science, information technology, or related field 
  • Strong problem-solving, analytical, and critical thinking skills 
  • 10+ years working in an IT environment (Preferably on Digital or DWH/ETL applications)
  • 5+ years of Java/.NET (or similar) application design and development experience
  • 4+ years Tibco EBX5 architect/design/development experience
  • 5+ years database design/development experience in MS SQL Server or similar DBs with query tuning and optimization
  • Proficiency in SQL, XML, JSON 
  • Minimum 5 years of working experience in an agile development team
  • Minimum 3 to 5 years of experience with Azure cloud, including services like VMs, Security, Storage, App Services, Service Bus, Active Directory, and Azure SQL (see the sketch after this list)
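Purely as an illustration of the SQL Server / Azure SQL proficiency listed above (not part of the posting), here is a small pyodbc sketch running a parameterized query; the server, database, table, and credential values are hypothetical placeholders and would normally come from a secrets store.

```python
# Illustrative sketch (pip install pyodbc): parameterized query against
# SQL Server / Azure SQL. Server, database, table, and credentials are
# placeholders; real credentials would come from a vault.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=cdm;"
    "UID=app_user;PWD=<from-key-vault>"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    cur.execute(
        "SELECT TOP 10 id, name FROM dbo.master_records WHERE updated_at > ?",
        "2021-01-01",
    )
    for row in cur.fetchall():
        print(row.id, row.name)
```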

Preferred Qualifications: 

  • Experience in ETL, DW/BI, modeling, and manipulating data using Microsoft Access, SQL, Tableau, or similar tools 
  • Knowledge of MDM (Master data management) foundational concepts 
  • Excellent verbal and written communication skills that allow effective communication with all levels of the organization 
  • Experience working in a heterogeneous technology environment 
  • Ability to participate in high-level architecture design as well as hands-on development 
  • Experience developing applications in the Salesforce platform 
  • Exposure to web applications and web portals is an added advantage

 

 

Job posted by
Manjunath Multirecruit

Software Architect/Solution Architect/CTO

at Nexus adwords

Founded 2020  •  Products & Services  •  20-100 employees  •  Raised funding
Java
Python
Javascript
Amazon Web Services (AWS)
Go Programming (Golang)
Web application security
Application lifecycle management
Azure
Ahmedabad
7 - 15 yrs
₹9L - ₹15L / yr
• Job Title:- Software Architect
• Location:- C.G Road, Ahmedabad
• Working days:- 5 days, with alternate Saturdays as holidays.
• Experience:- Minimum 8 years

Job Description:-

We are looking for a Software Architect to drive technology strategy, create the technological vision, and ensure the design and development of software solutions that fulfil the business requirements. You will be a key contributor to architectural decisions for products, drawing on your excellent technical, analytical, and business acumen while effectively communicating with all levels of the organization to build highly scalable and secure solutions.

KRA:-
Articulate the architecture and non-functional requirements (NFRs) for the products and services with high precision. Gather business requirements to analyse, identify, design, and innovate solutions.
• Devise a strategy to implement NFRs
• Validate design and development against the architecture and NFRs
• Act as the technical owner of IT projects
• Design and develop best practices in software development and architecture together with the team.
• Determine overall architectural principles, frameworks, and standards.
• Provide hands-on development wherever appropriate, especially on architecture transformation projects.
• Be involved in unit testing, code reviews, and bug fixing.
• Drive research and case studies on how the latest technologies could be leveraged for software architecture and capabilities such as scalability, fault tolerance, extensibility, maintainability, etc.
• Document designs, estimates, and implementation plans and communicate them to stakeholders.




Requirement:-
• Exposure to and experience in architecting and designing technical solutions, especially in the areas of mobile and cloud, for SaaS capabilities.
• Ability to scale products and to handle large traffic and amounts of data.
• Academic expertise in coding, programming, and software design patterns.
• Passionate about technology and constantly growing your technical expertise.
Great to Have:-
• Coding proficiency in Python and JavaScript.
• Professional certifications like AWS, MS Azure.
• Experience in the insurance domain.
Job posted by
Vinny Patel
Cloud Engineer
Ansible
Docker
Google Cloud Platform (GCP)
Windows Azure
AWS
Amazon Web Services (AWS)
Remote, Coimbatore
4 - 10 yrs
₹10L - ₹20L / yr

We are looking for a Cloud Engineer or Administrator with 4+ years of experience, able to demonstrate in-depth knowledge of the cloud computing market, enterprise, and open-source technologies.

- Azure AZ-104 Certified Azure Administrator, AWS Certification, or GCP Professional Certification required

- Advanced scripting (Bash, PowerShell, Python) and experience in automation, configuration, etc.; ideally has experience with Puppet, Chef, Ansible, or equivalent

- Container technologies (Kubernetes, Rancher, Docker, CoreOS)

- Experience in support and configuration of systems and application monitoring services (CloudWatch, Prometheus, Stackdriver, Azure Monitor) (see the sketch after this list)

- Experience with cloud management and governance technologies (CloudHealth, CloudCheckr)

- Experience using and configuring ITSM systems (Zendesk, ServiceNow)

- Familiarity with running mission-critical infrastructure within a 24x7 production environment
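As a hedged example of the monitoring-service experience above (not from the posting), here is a boto3 sketch that publishes a custom CloudWatch metric from an automation script; the namespace, metric name, and region are placeholders.

```python
# Illustrative sketch: publish a custom CloudWatch metric with boto3.
# Namespace, metric name, and region are placeholders.
import boto3

def publish_queue_depth(depth: int, region: str = "ap-south-1") -> None:
    cloudwatch = boto3.client("cloudwatch", region_name=region)
    cloudwatch.put_metric_data(
        Namespace="Custom/Ops",
        MetricData=[{
            "MetricName": "QueueDepth",
            "Value": float(depth),
            "Unit": "Count",
        }],
    )

if __name__ == "__main__":
    publish_queue_depth(42)
```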

Job posted by
Jino J

Data Engineer

at Codalyze Technologies

Founded 2016  •  Products & Services  •  20-100 employees  •  Profitable
Apache Hive
Hadoop
Scala
Spark
Amazon Web Services (AWS)
Java
Python
Mumbai
3 - 9 yrs
₹5L - ₹12L / yr
Job Overview :

Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties :

- As a Data Engineer, you will be responsible for the development of data pipelines for numerous applications handling all kinds of data (structured, semi-structured, and unstructured). Big data knowledge, especially in Spark and Hive, is highly preferred.

- Work in a team and provide proactive technical oversight; advise development teams to foster re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level :

- Bachelor's degree in Computer Science or equivalent

Experience :

- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience

- Expertise in application, data and infrastructure architecture disciplines

- Expertise in designing data integrations using ETL and other data integration patterns

- Advanced knowledge of architecture, design and business processes

Proficiency in :

- Modern programming languages like Java, Python, Scala

- Big Data technologies Hadoop, Spark, HIVE, Kafka

- Writing decently optimized SQL queries

- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)

- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions

- Knowledge of system development lifecycle methodologies, such as waterfall and AGILE.

- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.

- Experience generating physical data models and the associated DDL from logical data models.

- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping,
and data rationalization artifacts.

- Experience enforcing data modeling standards and procedures.

- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.

- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills :

Must Know :

- Core big-data concepts

- Spark - PySpark/Scala

- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)

- Handling of various file formats

- Cloud platform - AWS/Azure/GCP

- Orchestration tool - Airflow (see the sketch below)
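To illustrate the Airflow requirement (not part of the posting), here is a minimal two-task DAG sketch in the Airflow 2.x style; the DAG id, schedule, and task bodies are placeholders.

```python
# Illustrative sketch: a two-task Airflow DAG (Airflow 2.x). DAG id, schedule,
# and task logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from the landing zone")

def transform():
    print("run the Spark/Hive transformation")

with DAG(
    dag_id="daily_ingest_sketch",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```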
Job posted by
Aishwarya Hire