Google Cloud Platform (GCP) Jobs in Pune


Apply to 50+ Google Cloud Platform (GCP) Jobs in Pune on CutShort.io. Explore the latest Google Cloud Platform (GCP) Job opportunities across top companies like Google, Amazon & Adobe.

Lean Technologies

Posted by Reshika Mendiratta
Pune
10+ yrs
Up to ₹60L / yr (varies)
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+8 more

About Lean Technologies

Lean is on a mission to revolutionize the fintech industry by providing developers with a universal API to access their customers' financial accounts across the Middle East. We’re breaking down infrastructure barriers and empowering the growth of the fintech industry. With Sequoia leading our $33 million Series A round, Lean is poised to expand its coverage across the region while continuing to deliver unparalleled value to developers and stakeholders.

Join us and be part of a journey to enable the next generation of financial innovation. We offer competitive salaries, private healthcare, flexible office hours, and meaningful equity stakes to ensure long-term alignment. At Lean, you'll work on solving complex problems, build a lasting legacy, and be part of a diverse, inclusive, and equal opportunity workplace.


About the role:

Are you a highly motivated and experienced software engineer looking to take your career to the next level? Our team at Lean is seeking a talented engineer to help us build the distributed systems that allow our engineering teams to deploy our platform in multiple geographies across various deployment solutions. You will work closely with functional heads across software, QA, and product teams to deliver scalable and customizable release pipelines.


Responsibilities

  • Distributed systems architecture – understand and manage the most complex systems
  • Continual reliability and performance optimization – enhancing observability stack to improve proactive detection and resolution of issues
  • Employing cutting-edge methods and technologies, continually refining existing tools to enhance performance and drive advancements
  • Problem-solving capabilities – troubleshooting complex issues and proactively reducing toil through automation
  • Experience in technical leadership and setting technical direction for engineering projects
  • Collaboration skills – working across teams to drive change and provide guidance
  • Technical expertise – deep skills and the ability to act as subject-matter expert in one or more of: IaC, observability, coding, reliability, debugging, system design
  • Capacity planning – effectively forecasting demand and reacting to changes
  • Analyze and improve efficiency, scalability, and stability of various system resources
  • Incident response – rapidly detecting and resolving critical incidents. Minimizing customer impact through effective collaboration, escalation (including periodic on-call shifts) and postmortems
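
To make the observability and toil-reduction bullets concrete, here is a minimal, illustrative Python sketch (not Lean's actual stack; `ErrorRateMonitor` and its threshold are invented for this example) of a rolling error-rate check of the kind a proactive-detection pipeline automates:

```python
# Illustrative SLO-style check: flag when the rolling error rate of a
# service exceeds a threshold (hypothetical names, stdlib only).
from collections import deque

class ErrorRateMonitor:
    def __init__(self, window: int = 100, threshold: float = 0.05):
        self.samples = deque(maxlen=window)  # rolling window of success/failure outcomes
        self.threshold = threshold

    def record(self, success: bool) -> None:
        self.samples.append(success)

    def error_rate(self) -> float:
        if not self.samples:
            return 0.0
        failures = sum(1 for ok in self.samples if not ok)
        return failures / len(self.samples)

    def breached(self) -> bool:
        return self.error_rate() > self.threshold

monitor = ErrorRateMonitor(window=50, threshold=0.05)
for i in range(50):
    monitor.record(i % 10 != 0)  # simulate a 10% failure rate
print(monitor.error_rate())  # 0.1
print(monitor.breached())    # True
```

In practice the same logic runs over metrics pulled from the observability stack, with the window and threshold tuned per service-level objective.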


Requirements

  • 10+ years of experience in Systems Engineering, DevOps, or SRE roles running large-scale infrastructure, cloud, or web services
  • Strong background in Linux/Unix Administration and networking concepts
  • We work on OCI but would accept candidates with solid GCP/AWS or other cloud providers’ knowledge and experience
  • 3+ years of experience with managing Kubernetes clusters, Helm, Docker
  • Experience in operating CI/CD pipelines that build and deliver services on the cloud and on-premise
  • Experience with CI/CD tools/services like Jenkins, GitHub Actions, ArgoCD, etc.
  • Experience with configuration management tools either Ansible, Chef, Puppet, or equivalent
  • Infrastructure as Code - Terraform
  • Experience in production environments with both relational and NoSQL databases
  • Coding with one or more of the following: Java, Python, and/or Go


Bonus

  • MultiCloud or Hybrid Cloud experience
  • OCI and GCP


Why Join Us?

At Lean, we value talent, drive, and entrepreneurial spirit. We are constantly on the lookout for individuals who identify with our mission and values, even if they don’t meet every requirement. If you're passionate about solving hard problems and building a legacy, Lean is the right place for you. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, gender, sexual orientation, or disability.

TVARIT GmbH

Posted by Shivani Kawade
Remote, Pune
2 - 6 yrs
₹8L - ₹25L / yr
Azure SQL
Databricks
Python
SQL
ETL
+9 more

TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. Its software products enable customers to make intelligent, well-founded decisions, e.g., in predictive maintenance, OEE improvement, and predictive quality. Renowned reference customers, competent technology, a strong research team from renowned universities, and a prestigious AI award (e.g., under EU Horizon 2020) make TVARIT one of the most innovative AI companies in Germany and Europe.


We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.


We are seeking a skilled and motivated Senior Data Engineer from the manufacturing industry with over four years of experience to join our team. The Senior Data Engineer will oversee the department's data infrastructure, including developing a data model, integrating large amounts of data from different systems, building and enhancing a data lakehouse and the subsequent analytics environment, and writing scripts to facilitate data analysis. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.


Skills Required:


  • Experience in the manufacturing industry (metal industry is a plus)
  • 4+ years of experience as a Data Engineer
  • Experience in data cleaning & structuring and data manipulation
  • Architect and optimize complex data pipelines, leading the design and implementation of scalable data infrastructure, and ensuring data quality and reliability at scale
  • ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
  • Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
  • Experience in SQL and data structures
  • Knowledge of big data technologies such as Spark, Flink, Hadoop, and NoSQL databases.
  • Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
  • Proficient in data management and data governance
  • Strong analytical and problem-solving skills, with the ability to extract actionable insights from raw data to help improve the business.
  • Excellent communication and teamwork abilities.
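
As a small illustration of the ETL skills listed above (a minimal sketch, not TVARIT's pipeline; the table schema and field names are invented), the following Python example extracts rows from CSV, transforms and filters them, and loads them into SQLite:

```python
# Minimal ETL sketch: extract from CSV, transform units and drop bad
# records, load into an in-memory SQLite table (stdlib only).
import csv, io, sqlite3

RAW = """machine_id,temp_c,weight_kg
M1,55.0,120.5
M2,,98.0
M3,61.2,101.3
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    out = []
    for r in rows:
        if not r["temp_c"]:  # drop records missing a sensor reading
            continue
        out.append((r["machine_id"], float(r["temp_c"]) * 9 / 5 + 32, float(r["weight_kg"])))
    return out

def load(rows, conn):
    conn.execute("CREATE TABLE readings (machine_id TEXT, temp_f REAL, weight_kg REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0])  # 2
```

A production pipeline would add schema validation, incremental loads, and orchestration, but the extract/transform/load separation is the same.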


Nice To Have:

  • Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
  • Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field from top-tier Indian Institutes of Information Technology (IIITs).


Benefits and Perks:

  • A culture that fosters innovation, creativity, continuous learning, and resilience
  • Progressive leave policy promoting work-life balance
  • Mentorship opportunities with highly qualified internal resources and industry-driven programs
  • Multicultural peer groups and supportive workplace policies
  • Annual workcation program allowing you to work from various scenic locations
  • Experience the unique environment of a dynamic start-up


Why should you join TVARIT?


Working at TVARIT, a deep-tech German IT startup, offers a unique blend of innovation, collaboration, and growth opportunities. We seek individuals eager to adapt and thrive in a rapidly evolving environment.


If this opportunity excites you and aligns with your career aspirations, we encourage you to apply today!

MangoApps

Posted by Dhhruval Modi
Pune
5 - 9 yrs
₹18L - ₹35L / yr
Microsoft Windows Azure
Google Cloud Platform (GCP)
Terraform
CloudWatch
Linux/Unix

About the job


MangoApps builds enterprise products that make employees at organizations across the globe more effective and productive in their day-to-day work. We seek tech pros, great communicators, collaborators, and efficient team players for this role.


Job Description:


Experience: 5+ yrs (relevant experience as an SRE)


Open positions: 2


Job Responsibilities as an SRE


  • Must have very strong experience in Linux (Ubuntu) administration
  • Strong in network troubleshooting
  • Experienced in handling and diagnosing the root cause of compute and database outages
  • Strong experience required with cloud platforms, specifically Azure or GCP (proficiency in at least one is mandatory)
  • Must have very strong experience in designing, implementing, and maintaining highly available and scalable systems
  • Must have expertise in CloudWatch or similar log systems and troubleshooting using them
  • Proficiency in scripting and programming languages such as Python, Go, or Bash is essential
  • Familiarity with configuration management tools such as Ansible, Puppet, or Chef is required
  • Must possess knowledge of database/SQL optimization and performance tuning.
  • Respond promptly to and resolve incidents to minimize downtime
  • Implement and manage infrastructure using IaC tools like Terraform, Ansible, or CloudFormation
  • Excellent problem-solving skills with a proactive approach to identifying and resolving issues are essential.
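
The log-troubleshooting and root-cause bullets above can be sketched in miniature. This is an illustrative Python example (an assumed log format, not MangoApps' actual CloudWatch setup) that groups ERROR lines by component to surface the likeliest source of an outage:

```python
# Log-triage sketch: count ERROR lines per component to rank
# candidate root causes (hypothetical log format, stdlib only).
import re
from collections import Counter

LOG = """\
2024-05-01T10:00:01Z INFO  api      request ok
2024-05-01T10:00:02Z ERROR db       connection refused
2024-05-01T10:00:03Z ERROR db       connection refused
2024-05-01T10:00:04Z ERROR cache    timeout
2024-05-01T10:00:05Z INFO  api      request ok
"""

LINE = re.compile(r"^\S+ (?P<level>\w+)\s+(?P<component>\S+)\s+(?P<msg>.*)$")

def error_hotspots(log_text):
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return counts.most_common()

print(error_hotspots(LOG))  # [('db', 2), ('cache', 1)]
```

The same aggregation is what a CloudWatch Logs Insights query or similar log system does at scale; scripting it by hand is useful when triaging an unfamiliar system.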


Wissen Technology

Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Mumbai, Pune
7 - 20 yrs
Best in industry
.NET
ASP.NET
C#
Google Cloud Platform (GCP)
Migration

Job Title: .NET Developer with Cloud Migration Experience

Job Description:

We are seeking a skilled .NET Developer with experience in C#, MVC, and ASP.NET to join our team. The ideal candidate will also have hands-on experience with cloud migration projects, particularly in migrating on-premise applications to cloud platforms such as Azure or AWS.

Responsibilities:

  • Develop, test, and maintain .NET applications using C#, MVC, and ASP.NET
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Participate in code reviews and ensure coding best practices are followed
  • Work closely with the infrastructure team to migrate on-premise applications to the cloud
  • Troubleshoot and debug issues that arise during migration and post-migration phases
  • Stay updated with the latest trends and technologies in .NET development and cloud computing

Requirements:

  • Bachelor's degree in Computer Science or related field
  • X+ years of experience in .NET development using C#, MVC, and ASP.NET
  • Hands-on experience with cloud migration projects, preferably with Azure or AWS
  • Strong understanding of cloud computing concepts and principles
  • Experience with database technologies such as SQL Server
  • Excellent problem-solving and communication skills

Preferred Qualifications:

  • Microsoft Azure or AWS certification
  • Experience with other cloud platforms such as Google Cloud Platform (GCP)
  • Familiarity with DevOps practices and tools


Apptware Solutions LLP
Pune
6 - 10 yrs
₹9L - ₹15L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+5 more

Company - Apptware Solutions

Location - Baner, Pune

Team Size - 130+


Job Description -

Cloud Engineer with 8+ yrs of experience


Roles and Responsibilities


● Have 8+ years of strong experience in deployment, management and maintenance of large systems on-premise or cloud

● Experience maintaining and deploying highly-available, fault-tolerant systems at scale

● A drive towards automating repetitive tasks (e.g. scripting via Bash, Python, Ruby, etc)

● Practical experience with Docker containerization and clustering (Kubernetes/ECS)

● Expertise with AWS (e.g. IAM, EC2, VPC, ELB, ALB, Autoscaling, Lambda, VPN)

● Version control system experience (e.g. Git)

● Experience implementing CI/CD (e.g. Jenkins, TravisCI, CodePipeline)

● Operational (e.g. HA/backups) NoSQL experience (e.g. MongoDB, Redis)

● SQL experience (e.g. MySQL)

● Experience with configuration management tools (e.g. Ansible, Chef)

● Experience with infrastructure-as-code (e.g. Terraform, CloudFormation)

● Bachelor's or master’s degree in CS, or equivalent practical experience

● Effective communication skills

● Hands-on experience with cloud providers like MS Azure and Google Cloud

● A sense of ownership and ability to operate independently

● Experience with Jira and one or more Agile SDLC methodologies

● Nice to Have:

○ Sensu and Graphite

○ Ruby or Java

○ Python or Groovy

○ Java Performance Analysis
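
As a small illustration of the fault-tolerance and automation themes above (a hedged sketch, not Apptware's code; `retry` and `flaky` are hypothetical names), exponential-backoff retry is a standard building block when calling unreliable remote services:

```python
# Exponential-backoff retry decorator: re-run a flaky call with
# doubling delays, re-raising only after the final attempt.
import time
from functools import wraps

def retry(max_attempts=4, base_delay=0.01):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts - 1:
                        raise
                    time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s...
        return wrapper
    return decorator

calls = {"n": 0}

@retry(max_attempts=4)
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(flaky())     # ok
print(calls["n"])  # 3
```

Real deployments usually add jitter to the delay and retry only on error classes known to be transient.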


Role: Cloud Engineer

Industry Type: IT-Software, Software Services

Functional Area: IT Software - Application Programming, Maintenance

Employment Type: Full Time, Permanent

Role Category: Programming & Design

Publicis Sapient

Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

As Senior Associate L2 in Data Engineering, you will translate client requirements into technical design and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration, and data wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. You are also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.


Role & Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation
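
The batch-ingestion bullet above can be sketched in miniature. This illustrative Python example (not Publicis Sapient's framework; the field names are invented) normalizes records from two heterogeneous sources, CSV and JSON, into one schema:

```python
# Heterogeneous batch ingestion sketch: map source-specific fields
# into a common record shape before downstream processing.
import csv, io, json

CSV_SRC = "id,amount\n1,10.5\n2,20.0\n"
JSON_SRC = '[{"id": 3, "amt": 5.25}]'

def from_csv(text):
    for row in csv.DictReader(io.StringIO(text)):
        yield {"id": int(row["id"]), "amount": float(row["amount"])}

def from_json(text):
    for row in json.loads(text):
        yield {"id": row["id"], "amount": row["amt"]}  # rename source-specific field

records = sorted([*from_csv(CSV_SRC), *from_json(JSON_SRC)], key=lambda r: r["id"])
print(len(records))                       # 3
print(sum(r["amount"] for r in records))  # 35.75
```

In a real pipeline the per-source adapters would be Spark or NiFi jobs and the common schema would be enforced by a registry, but the normalization step is the same idea.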

Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 5+ years of IT experience with 3+ years in data-related technologies

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines

4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred)

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed, working knowledge of data platform related services on at least one cloud platform, plus IAM and data security


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality

6. Cloud data specialty and other related Big Data technology certifications


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


Publicis Sapient

Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As Senior Associate L1 in Data Engineering, you will translate client requirements into technical design and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

As Senior Associate L1 in Data Engineering, you will do technical design and implement components for the data engineering solution. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and will independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java / Scala / Python, experience in data ingestion, integration, and data wrangling, computation, and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on Design, Development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies

2. Minimum 1.5 years of experience in Big Data technologies

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred)

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality

6. Working knowledge of data platform related services on at least one cloud platform, plus IAM and data security

7. Cloud data specialty and other related Big Data technology certifications



Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes

Arahas Technologies
Posted by Nidhi Shivane
Pune
3 - 8 yrs
₹10L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+3 more


Role Description

This is a full-time hybrid role as a GCP Data Engineer. You will be responsible for managing large sets of structured and unstructured data and for developing processes that convert data into insights, information, and knowledge.

Skill Name: GCP Data Engineer

Experience: 7-10 years

Notice Period: 0-15 days

Location: Pune

If you have a passion for data engineering and possess the following skills, we would love to hear from you:


🔹 7 to 10 years of experience working on Software Development Life Cycle (SDLC)

🔹 At least 4 years of experience with Google Cloud Platform, with a focus on BigQuery

🔹 Proficiency in Java and Python, along with experience in Google Cloud SDK & API Scripting

🔹 Experience in the Finance/Revenue domain would be considered an added advantage

🔹 Familiarity with GCP Migration activities and the DBT Tool would also be beneficial


You will play a crucial role in developing and maintaining our data infrastructure on the Google Cloud platform.

Your expertise in SDLC, BigQuery, Java, Python, and Google Cloud SDK & API scripting will be instrumental in ensuring the smooth operation of our data systems.


Join our dynamic team and contribute to our mission of harnessing the power of data to make informed business decisions.

Ignite Solutions

Posted by Meghana Dhamale
Remote, Pune
5 - 7 yrs
₹15L - ₹20L / yr
Python
LinkedIn
Django
Flask
Amazon Web Services (AWS)
+2 more

We are looking for a hands-on technical expert who has worked with multiple technology stacks and has experience architecting and building scalable cloud solutions with web and mobile frontends. 

 What will you work on?

  •  Interface with clients
  • Recommend tech stacks
  • Define end-to-end logical and cloud-native architectures
  •  Define APIs
  • Integrate with 3rd party systems
  • Create architectural solution prototypes
  • Hands-on coding, team lead, code reviews, and problem-solving

What Makes You A Great Fit?

  • 5+ years of software experience 
  • Experience with architecture of technology systems having hands-on expertise in backend, and web or mobile frontend
  • Solid expertise and hands-on experience in Python with Flask or Django
  • Expertise on one or more cloud platforms (AWS, Azure, Google App Engine)
  • Expertise with SQL and NoSQL databases (MySQL, Mongo, ElasticSearch, Redis)
  • Knowledge of DevOps practices
  • Chatbot, Machine Learning, Data Science/Big Data experience will be a plus
  • Excellent communication skills, verbal and written

The job is for a full-time position at our Pune (Viman Nagar) office (https://goo.gl/maps/o67FWr1aedo).

(Note: We are working remotely at the moment. However, once the COVID situation improves, the candidate will be expected to work from our office.)

DeepIntent

Posted by Indrajeet Deshmukh
Pune
4 - 8 yrs
Best in industry
Data Warehouse (DWH)
Informatica
ETL
SQL
Google Cloud Platform (GCP)
+3 more

Who We Are:

DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data.

What You’ll Do:

We are looking for a Senior Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.  

This role will be in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams
  • Develop and standardize all interface points for analysts to retrieve and analyze data, with a focus on research methodologies and data-based decisioning
  • Optimize queries and data access efficiencies, serve as expert in how to most efficiently attain desired data points
  • Build “mastered” versions of the data for Analytics specific querying use cases
  • Help with data ETL, table performance optimization
  • Establish formal data practice for the Analytics practice in conjunction with rest of DeepIntent
  • Build & operate scalable and robust data architectures
  • Interpret analytics methodology requirements and apply to data architecture to create standardized queries and operations for use by analytics teams
  • Implement DataOps practices
  • Master existing and new Data Pipelines and develop appropriate queries to meet analytics specific objectives
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, analysts
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation

Who You Are:

  • Adept in market research methodologies and using data to deliver representative insights
  • Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases
  • Deep SQL experience is a must
  • Exceptional communication skills, with the ability to collaborate and translate between technical and non-technical audiences
  • English Language Fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable Data pipelines serving high volume and velocity data
  • Experience working with public clouds like GCP/AWS
  • Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies
  • Experience building Data architectures that optimize performance and cost, whether the components are prepackaged or homegrown
  • Proficient with SQL, Python or a JVM-based language, and Bash
  • Experience with any of the Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, Clickhouse, etc.
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious
  • Comfortable to work in EST Time Zone
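
The query-optimization work described above can be shown in miniature. This is an illustrative sketch (stdlib SQLite rather than DeepIntent's warehouse; table and index names are invented) of how adding an index turns a full table scan into an index search, visible in the query plan:

```python
# Query-efficiency sketch: compare SQLite's query plan for the same
# aggregate query before and after adding an index on the filter column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (campaign_id INTEGER, cost REAL)")
conn.executemany("INSERT INTO impressions VALUES (?, ?)",
                 [(i % 100, 0.01) for i in range(10_000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail)
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(cost) FROM impressions WHERE campaign_id = 42"
print(plan(query))  # full table scan, e.g. 'SCAN impressions'

conn.execute("CREATE INDEX idx_campaign ON impressions(campaign_id)")
print(plan(query))  # index search, e.g. 'SEARCH impressions USING INDEX idx_campaign ...'
```

The same habit — inspecting plans before and after a change — carries over to BigQuery's execution details or any warehouse's EXPLAIN output.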


codersbrain

Posted by Tanuj Uppal
Hyderabad, Pune, Noida, Bengaluru (Bangalore), Chennai
4 - 10 yrs
Best in industry
Go Programming (Golang)
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure

Golang Developer

Location: Chennai/ Hyderabad/Pune/Noida/Bangalore

Experience: 4+ years

Notice Period: Immediate/ 15 days

Job Description:

  • Must have at least 3 years of experience working with Golang.
  • Strong cloud experience is required for day-to-day work.
  • Good communication skills are a plus.
  • Skills: AWS, GCP, Azure, Golang
Tredence
Posted by Rohit S
Chennai, Pune, Bengaluru (Bangalore), Gurugram
11 - 16 yrs
₹20L - ₹32L / yr
Data Warehouse (DWH)
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Data engineering
Data migration
+1 more
• Engages with Leadership of Tredence’s clients to identify critical business problems, define the need for data engineering solutions and build strategy and roadmap
• S/he possesses a wide exposure to complete lifecycle of data starting from creation to consumption
• S/he has in the past built repeatable tools / data-models to solve specific business problems
• S/he should have hands-on experience of having worked on projects (either as a consultant or within a company) that needed them to:
o Provide consultation to senior client personnel
o Implement and enhance data warehouses or data lakes
o Work with business teams or be part of the team that implemented process re-engineering driven by data analytics/insights
• Should have deep appreciation of how data can be used in decision-making
• Should have perspective on newer ways of solving business problems. E.g. external data, innovative techniques, newer technology
• S/he must have a solution-creation mindset.
Ability to design and enhance scalable data platforms to address the business need
• Working experience on data engineering tool for one or more cloud platforms -Snowflake, AWS/Azure/GCP
• Engage with technology teams from Tredence and Clients to create last mile connectivity of the solutions
o Should have experience of working with technology teams
• Demonstrated ability in thought leadership – Articles/White Papers/Interviews
Mandatory Skills: Program Management, Data Warehouse, Data Lake, Analytics, Cloud Platform
one of the world's leading multinational investment banks


Agency job
via HiyaMee by Lithin Raj
Pune
9 - 13 yrs
₹10L - ₹15L / yr
Docker
Kubernetes
DevOps
Amazon Web Services (AWS)
Windows Azure
+1 more
  • Hands-on knowledge of various CI/CD tools (Jenkins/TeamCity, Artifactory, UCD, Bitbucket/GitHub, SonarQube), including setting up automated build-deployment pipelines.
  • Very good knowledge of scripting tools and languages such as Shell, Perl or Python, YAML/Groovy, and build tools such as Maven/Gradle.
  • Hands-on knowledge of containerization and orchestration tools such as Docker, OpenShift and Kubernetes.
  • Good knowledge of configuration management tools such as Ansible, Puppet/Chef, and experience setting up monitoring tools (Splunk/Geneos/New Relic/ELK).
  • Expertise in job schedulers/workload automation tools such as Control-M or AutoSys is good to have.
  • Hands-on knowledge of cloud technology (preferably GCP), including various compute services and infrastructure setup using Terraform.
  • Should have a basic understanding of networking, certificate management, Identity and Access Management, and information security/encryption concepts.
  • Should support day-to-day tasks related to platform and environment upkeep, such as upgrades, patching, migration and system/interface integration.
  • Should have experience working in an Agile-based SDLC delivery model, and be able to multi-task and support multiple systems/apps.
  • Big-data and Hadoop ecosystem knowledge is good to have but not mandatory.
  • Should have worked on standard release, change and incident management tools such as ServiceNow/Remedy or similar.
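None of the postings include code, but the scripting expectation above is easy to picture. As a purely illustrative sketch (not from the posting itself), here is the kind of small Python utility a release-pipeline script might contain: a semver tag bumper.

```python
def bump_version(tag: str, part: str = "patch") -> str:
    """Compute the next release tag from a semver-style tag such as 'v1.4.2'.

    A tiny utility of the kind CI/CD pipeline scripting (Jenkins, TeamCity)
    often needs when cutting releases.
    """
    major, minor, patch = (int(x) for x in tag.lstrip("v").split("."))
    if part == "major":
        major, minor, patch = major + 1, 0, 0
    elif part == "minor":
        minor, patch = minor + 1, 0
    else:
        patch += 1
    return f"v{major}.{minor}.{patch}"

assert bump_version("v1.4.2") == "v1.4.3"
assert bump_version("v1.4.2", "minor") == "v1.5.0"
assert bump_version("v1.4.2", "major") == "v2.0.0"
```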
MNC


Agency job
via Bohiyaanam Talent Solutions by Harsha Manglani
Pune
6 - 9 yrs
₹1L - ₹25L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)

We are hiring a DevOps Engineer for a reputed MNC

 

Job Description:

Total experience: 6+ years

Must have:

Minimum 3-4 years hands-on experience in Kubernetes and Docker

Proficiency in AWS Cloud

Kubernetes admin certification is good to have

 

Job Responsibilities:

Responsible for managing Kubernetes cluster

Deploying infrastructure for the project

Build CICD pipeline

 

Looking for Immediate Joiners only

Location: Pune

Salary: As per market standards

Mode: Work from office



 
xpressbees
Alfiya Khan
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is among the fastest-growing
companies in its sector. While we started off rather humbly in the space of
e-commerce B2C logistics, the last 5 years have seen us steadily progress towards
expanding our presence. Our vision to evolve into a strong full-service logistics
organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross
border operations. Our strong domain expertise and constant focus on meaningful
innovation have helped us rapidly evolve as the most trusted logistics partner of
India. We have progressively carved our way towards best-in-class technology
platforms, an extensive network reach, and a seamless last mile management
system. While on this aggressive growth path, we seek to become the one-stop-shop
for end-to-end logistics solutions. Our big focus areas for the very near future
include strengthening our presence as service providers of choice and leveraging the
power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform
and infrastructure to support high-quality and agile decision-making in our supply chain and logistics
workflows. You will define the way we collect and operationalize data (structured/unstructured), and
build production pipelines for our machine learning models and our (RT, NRT, batch) reporting and
dashboarding requirements. In this role, you will use your experience with modern cloud and data
frameworks to build products (with storage and serving systems) that drive optimisation and resilience
in the supply chain via data visibility, intelligent decision-making,
insights, anomaly detection and prediction.

What You Will Do
• Design and develop data platform and data pipelines for reporting, dashboarding and
machine learning models. These pipelines would productionize machine learning models
and integrate with agent review tools.
• Meet data completeness, correctness and freshness requirements.
• Evaluate and identify data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support
business needs. Produce logical and physical database designs across platforms (MPP,
MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured).
Envision and implement the data modelling, physical design and performance
optimization techniques the problem requires.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their
successors.
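As a hedged illustration of the "completeness, correctness and freshness" requirement above (not code from XpressBees, and all thresholds invented), two of the checks such a pipeline might run can be sketched in a few lines of Python:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_load_ts: datetime, max_lag: timedelta) -> bool:
    """Return True if the latest pipeline load is within the allowed lag."""
    return datetime.now(timezone.utc) - last_load_ts <= max_lag

def completeness_ratio(expected_rows: int, loaded_rows: int) -> float:
    """Fraction of expected rows that actually landed in the target table."""
    if expected_rows == 0:
        return 1.0
    return loaded_rows / expected_rows

# A load from five minutes ago is fresh against a one-hour SLA.
recent = datetime.now(timezone.utc) - timedelta(minutes=5)
assert check_freshness(recent, timedelta(hours=1)) is True
assert completeness_ratio(1000, 990) == 0.99
```

In a real pipeline these checks would run after each batch and page an on-call engineer on failure; the SLA values here are placeholders.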

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or related field with 6 to 9 years of technology
experience.
• Knowledge of Relational and NoSQL data stores, stream processing and micro-batching to
make technology & design choices.
• Strong experience in System Integration, Application Development, ETL, Data-Platform
projects. Talented across technologies used in the enterprise space.
• Software development experience, including:
• Expertise in relational and dimensional modelling
• Exposure across the full SDLC process
• Experience in cloud architecture (AWS)
• Proven track record of maintaining existing technical skills and developing new ones, so that
you can make strong contributions to deep architecture discussions around systems and
applications in the cloud (AWS).

• A forward thinker and self-starter who flourishes with new challenges
and adapts quickly to new knowledge
• Ability to work with cross-functional teams of consulting professionals across multiple
projects.
• Knack for helping an organization understand application architectures and integration
approaches, architect advanced cloud-based solutions, and launch the build-out
of those systems
• Passion for educating, training, designing, and building end-to-end systems.
marsdevs.com
Vishvajit Pathak
Posted by Vishvajit Pathak
Remote, Pune
2 - 5 yrs
₹4L - ₹15L / yr
skill iconDjango
skill iconFlask
FastAPI
skill iconPython
skill iconDocker
+3 more

We have an immediate requirement for a Python web developer.

 

You have:

  • At least 2 years of experience developing web applications with Django/Flask/FastAPI
  • Familiarity with Linux
  • Experience in both SQL and NoSQL databases.
  • Experience using Docker and CI/CD
  • A habit of writing tests
  • Experience deploying applications and scaling them on AWS or GCP
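The test-writing expectation in this posting can be illustrated with a minimal, framework-agnostic Python sketch (all names hypothetical): a route handler written as a pure function, so it can be unit-tested without starting Django, Flask or FastAPI.

```python
def greet_handler(params: dict) -> dict:
    """A route handler written as a pure function.

    Keeping framework objects out of the signature makes the logic
    unit-testable in isolation; the web framework only adapts
    request/response objects to and from this function.
    """
    name = params.get("name", "world")
    if not isinstance(name, str) or not name:
        return {"status": 400, "body": {"error": "invalid name"}}
    return {"status": 200, "body": {"greeting": f"Hello, {name}!"}}

# Tests: the happy path plus edge cases.
assert greet_handler({})["body"]["greeting"] == "Hello, world!"
assert greet_handler({"name": "Pune"})["status"] == 200
assert greet_handler({"name": ""})["status"] == 400
```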

 

You are:

  • Eager to work independently without being watched
  • Easy going.
  • Able to handle clients on your own

 

Location: Remote (in India)

 

Anetcorp Ind Pvt Ltd
Jyoti Yadav
Posted by Jyoti Yadav
Remote, Pune
6 - 12 yrs
₹10L - ₹25L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+3 more
  • Essential Skills:
    • Docker
    • Jenkins
    • Python dependency management using conda and pip
  • Base Linux system commands, scripting
  • Docker container build & testing
    • Common knowledge of minimizing container size and layers
    • Inspecting containers for unused / underutilized systems
    • Multiple Linux OS support for virtual systems
  • Experience as a user of Jupyter / JupyterLab to test and fix usability issues in workbenches
  • Templating out various configurations for different use cases (we use Python Jinja2 but are open to other languages / libraries)
  • Jenkins Pipeline
  • Understanding of the GitHub API to trigger builds, tags and releases
  • Artifactory experience
  • Nice to have: Kubernetes, ArgoCD, other deployment automation tool sets (DevOps)
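To illustrate the templating bullet above: the posting names Python Jinja2 but is open to alternatives, so this sketch uses the standard library's string.Template as a stand-in (the preset names and config keys are hypothetical).

```python
from string import Template

# Hypothetical workbench config template; Jinja2 would write {{ image }}
# where string.Template writes $image.
CONFIG_TEMPLATE = Template(
    "image: $image\n"
    "cpu: $cpu\n"
    "memory: $memory\n"
)

def render_config(use_case: str) -> str:
    """Render a per-use-case config from one shared template."""
    presets = {
        "small": {"image": "jupyter/minimal", "cpu": "1", "memory": "2Gi"},
        "gpu":   {"image": "jupyter/tensorflow", "cpu": "4", "memory": "16Gi"},
    }
    return CONFIG_TEMPLATE.substitute(presets[use_case])

assert "cpu: 1" in render_config("small")
assert render_config("gpu").startswith("image: jupyter/tensorflow")
```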
DREAMS Pvt Ltd
Siddhant Malani
Posted by Siddhant Malani
Pune, Bengaluru (Bangalore)
0 - 1 yrs
₹8000 - ₹12000 / mo
skill iconFlutter
DART
User Interface (UI) Design
User Experience (UX) Design
RESTful APIs
+3 more

Role Description for the 3 month internship:-

• Create multi-platform apps for iOS & Android using Google's new Flutter development framework
• Strong OO design and programming skills in Dart and the SDK framework for building Android as well as iOS apps.
• Good expertise in Auto Layout and adding constraints programmatically
• Must have experience with memory management, caching mechanisms, threading and performance tuning.
• Familiarity with RESTful APIs to connect Android & iOS applications to back-end services
• Experience with third-party libraries and APIs
• Collaborate with the team of product managers, developers, to define, design, & deploy new features & functionality
• Build software that ensures the best possible usability, performance, quality, & responsiveness of features
• Work in a team following agile development practices (Scrum)
• Proficient understanding of code versioning tools such as Git, Mercurial, or SVN, and Project Management tool (JIRA)
• Utilize your knowledge of the general mobile landscape, architectures, trends, & emerging technologies
• Apply a solid understanding of the full mobile development life cycle
• Help develop and deploy good-quality UI
• Good written, verbal, organizational and interpersonal skills
• Unit-test code for robustness, including edge cases, usability, and general reliability.
• Excellent debugging and optimization skills
• Strong design, development and debugging skills.

Senwell Solutions

at Senwell Solutions

1 recruiter
Trupti Gholap
Posted by Trupti Gholap
Pune
1 - 3 yrs
₹2L - ₹7L / yr
skill iconAngular (2+)
skill iconAngularJS (1.x)
skill iconAmazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
+3 more

We are looking to hire an experienced Sr. Angular Developer to join our dynamic team. As a lead developer, you will be responsible for creating a top-level code base using Angular best practices. To ensure success as an Angular developer, you should have extensive knowledge of theoretical software engineering, be proficient in TypeScript, JavaScript, HTML, and CSS, and have excellent project management skills. Ultimately, a top-class Angular Developer can design and build a streamlined application to company specifications that perfectly meets the needs of the user.

 

Requirements:

 

  1. Bachelor’s degree in computer science, computer engineering, or similar
  2. 2+ years of previous work experience as an Angular developer.
  3. Proficient in CSS, HTML, and writing cross-browser compatible code
  4. Experience using JavaScript & TypeScript build tools like Gulp or Grunt.
  5. Knowledge of JavaScript MV-VM/MVC frameworks including AngularJS / React.
  6. Excellent project management skills.

 

Responsibilities:

 

  1. Designing and developing user interfaces using Angular best practices.
  2. Adapting interface for modern internet applications using the latest front-end technologies.
  3. Writing TypeScript, JavaScript, CSS, and HTML.
  4. Developing product analysis tasks.
  5. Making complex technical and design decisions for Angular.JS projects.
  6. Developing application codes in Angular, Node.js, and Rest Web Services.
  7. Conducting performance tests.
  8. Consulting with the design team.
  9. Ensuring high performance of applications and providing support.

 

This company provides on-demand cloud computing platforms.


Agency job
via New Era India by Niharica Singh
Remote, Pune, Mumbai, Bengaluru (Bangalore), Gurugram, Hyderabad
15 - 25 yrs
₹35L - ₹55L / yr
skill iconAmazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure
Architecture
skill iconPython
+5 more
  • 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
  • 15+ years of experience as a technical specialist in Customer-facing roles.
  • Ability to travel to client locations as needed (25-50%)
  • Extensive experience architecting, designing and programming applications in an AWS Cloud environment
  • Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
  • Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
  • Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
  • Agile software development expert
  • Experience with continuous integration tools (e.g. Jenkins)
  • Hands-on familiarity with CloudFormation
  • Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
  • Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
  • Strong practical application development experience on Linux and Windows-based systems
  • Extracurricular software development passion (e.g. active open-source contributor)
InFoCusp

at InFoCusp

3 recruiters
Apurva Gayawal
Posted by Apurva Gayawal
Pune, Ahmedabad
3 - 7 yrs
₹7L - ₹27L / yr
skill iconJavascript
Cloud Computing
skill iconReact.js
skill iconPython
skill iconAmazon Web Services (AWS)
+4 more
InFoCusp is a company working in the broad field of Computer Science, Software Engineering,
and Artificial Intelligence (AI). It is headquartered in Ahmedabad, India, having a branch office in
Pune.

We have worked on, and continue to work on, software engineering projects that build
full-fledged products: UI/UX, responsive and blazing-fast front-ends,
platform-specific applications (Android, iOS, web applications, desktop applications), very
large-scale infrastructure, and cutting-edge machine learning and deep learning (AI in general).
The projects/products have wide-ranging applications in finance, healthcare, e-commerce,
legal, HR/recruiting, pharmaceutical, leisure sports and computer gaming domains. All of this
is using core concepts of computer science such as distributed systems, operating systems,
computer networks, process parallelism, cloud computing, embedded systems and the
Internet of Things.

PRIMARY RESPONSIBILITIES:
● Own the design, development, evaluation and deployment of highly-scalable software
products involving front-end and back-end development.
● Maintain quality, responsiveness and stability of the system.
● Design and develop memory-efficient, compute-optimized solutions for the
software.
● Design and administer automated testing tools and continuous integration
tools.
● Produce comprehensive and usable software documentation.
● Evaluate and make decisions on the use of new tools and technologies.
● Mentor other development engineers.

KNOWLEDGE AND SKILL REQUIREMENTS:
● Mastery of one or more back-end programming languages (Python, Java, Scala, C++
etc.)
● Proficiency in front-end programming paradigms and libraries (for example: HTML,
CSS and advanced JavaScript libraries and frameworks such as Angular, Knockout,
React).
● Knowledge of automated and continuous integration testing tools (Jenkins,
TeamCity, CircleCI etc.)
● Proven experience of platform-level development for large-scale systems.
● Deep understanding of various database systems (MySQL, Mongo,
Cassandra).
● Ability to plan and design software system architecture.
● Development experience for mobile, browsers and desktop systems is
desired.
● Knowledge and experience of using distributed systems (Hadoop, Spark)
and cloud environments (Amazon EC2, Google Compute Engine, Microsoft
Azure).
● Experience working in agile development. Knowledge and prior experience of tools
like Jira is desired.
● Experience with version control systems (Git, Subversion or Mercurial).
Publicis Sapient

at Publicis Sapient

10 recruiters
Pooja Singh
Posted by Pooja Singh
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Hyderabad, Pune
4 - 19 yrs
₹1L - ₹15L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Microservices
+7 more
  • Experience building large scale, large volume services & distributed apps., taking them through production and post-production life cycles
  • Experience in Programming Language: Java 8, Javascript
  • Experience in Microservice Development or Architecture
  • Experience with Web Application Frameworks: Spring, Spring Boot, or Micronaut
  • Designing: High-Level/Low-Level Design
  • Development Experience: Agile/Scrum, TDD (Test-Driven Development) or BDD (Behaviour-Driven Development), plus unit testing
  • Infrastructure Experience: DevOps, CI/CD Pipeline, Docker/ Kubernetes/Jenkins, and Cloud platforms like – AWS, AZURE, GCP, etc
  • Experience on one or more Database: RDBMS or NoSQL
  • Experience on one or more Messaging platforms: JMS/RabbitMQ/Kafka/Tibco/Camel
  • Security (Authentication, scalability, performance monitoring)
Abishar Technologies

at Abishar Technologies

1 recruiter
Chandra Goswami
Posted by Chandra Goswami
Pune
6 - 10 yrs
₹8L - ₹23L / yr
DevOps
skill iconKubernetes
skill iconDocker
skill iconAmazon Web Services (AWS)
Windows Azure
+1 more

Role and responsibilities

 

  • Expertise in AWS (most typical services), Docker & Kubernetes.
  • Strong scripting knowledge, strong DevOps automation, good at Linux
  • Hands-on with CI/CD (CircleCI preferred, but any CI/CD tool will do). Strong understanding of GitHub
  • Strong understanding of AWS networking, security and certificates.

Nice-to-have skills

  • Involved in Product Engineering
Leading Payment Solution Company


Agency job
via People First Consultants by Aishwarya KA
Chennai, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai
9 - 16 yrs
Best in industry
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Microsoft Windows Azure
+9 more

About Company:

The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.

  • Role Overview
    • Senior Engineer with a strong background and experience in cloud-related technologies and architectures. Can design target cloud architectures to transform existing architectures together with the in-house team. Can actively configure and build cloud architectures hands-on and guide others.
  • Key Knowledge
    • 3-5+ years of experience in AWS/GCP or Azure technologies
    • Is likely certified on one or more of the major cloud platforms
    • Strong experience from hands-on work with technologies such as Terraform, K8S, Docker and orchestration of containers.
    • Ability to guide and lead internal agile teams on cloud technology
    • Background from the financial services industry or similar critical operational experience
Leading Payment Solution Company


Agency job
Remote, Bengaluru (Bangalore), Chennai, Pune, Hyderabad, Mumbai
3 - 10 yrs
₹8L - ₹28L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+3 more

Experience: 3+ years of experience in Cloud Architecture

About Company:

The company is a global leader in secure payments and trusted transactions. They are at the forefront of the digital revolution that is shaping new ways of paying, living, doing business and building relationships that pass on trust along the entire payments value chain, enabling sustainable economic growth. Their innovative solutions, rooted in a rock-solid technological base, are environmentally friendly, widely accessible and support social transformation.



Cloud Architect / Lead

  • Role Overview
    • Senior Engineer with a strong background and experience in cloud-related technologies and architectures. Can design target cloud architectures to transform existing architectures together with the in-house team. Can actively configure and build cloud architectures hands-on and guide others.
  • Key Knowledge
    • 3-5+ years of experience in AWS/GCP or Azure technologies
    • Is likely certified on one or more of the major cloud platforms
    • Strong experience from hands-on work with technologies such as Terraform, K8S, Docker and orchestration of containers.
    • Ability to guide and lead internal agile teams on cloud technology
    • Background from the financial services industry or similar critical operational experience
 
Intuitive Technology Partners
Aakriti Gupta
Posted by Aakriti Gupta
Remote, Ahmedabad, Pune, Gurugram, Chennai, Bengaluru (Bangalore), india
6 - 12 yrs
Best in industry
DevOps
skill iconKubernetes
skill iconDocker
Terraform
Linux/Unix
+10 more

Intuitive is the fastest-growing top-tier Cloud Solutions and Services company, supporting Global Enterprise Customers across the Americas, Europe and the Middle East.

Intuitive is looking for highly talented hands-on Cloud Infrastructure Architects to help accelerate our growing Professional Services consulting Cloud & DevOps practice. This is an excellent opportunity to join Intuitive’s global world class technology teams, working with some of the best and brightest engineers while also developing your skills and furthering your career working with some of the largest customers.

Job Description :

  • Extensive experience with K8s (EKS/GKE) and k8s ecosystem tooling, e.g., Prometheus, ArgoCD, Grafana, Istio etc.
  • Extensive AWS/GCP core infrastructure skills
  • Infrastructure/IaC automation and integration - Terraform
  • Kubernetes resources engineering and management
  • Experience with DevOps tools, CI/CD pipelines and release management
  • Good at creating documentation (runbooks, design documents, implementation plans)

Linux Experience :

  1. Namespace
  2. Virtualization
  3. Containers

 

Networking Experience

  1. Virtual networking
  2. Overlay networks
  3. Vxlans, GRE

 

Kubernetes Experience :

Should have experience bringing up a Kubernetes cluster manually, without using the kubeadm tool.

 

Observability                              

Experience in observability is a plus

 

Cloud automation :

Familiarity with cloud platforms (primarily AWS) and DevOps tools like Jenkins, Terraform etc.

 

DataMetica

at DataMetica

1 video
7 recruiters
Sayali Kachi
Posted by Sayali Kachi
Pune, Hyderabad
2 - 6 yrs
₹3L - ₹15L / yr
Google Cloud Platform (GCP)
SQL
BQ

Datametica is looking for talented BigQuery engineers

 

Total Experience - 2+ yrs.

Notice Period – 0 - 30 days

Work Location – Pune, Hyderabad

 

Job Description:

  • Sound understanding of Google Cloud Platform; should have worked on BigQuery, Workflows, or Composer
  • Experience with migration to GCP and integration projects in large-scale environments; ETL technical design, development, and support
  • Good SQL and Unix scripting skills; programming experience with Python, Java, or Spark would be desirable.
  • Experience in SOA and services-based data solutions would be advantageous
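The SQL-plus-Python skill mix this posting asks for can be sketched as a parallel-run validation, the pattern Datametica describes for warehouse migrations. In production this would go through the google-cloud-bigquery client; sqlite3 is used here only to keep the sketch self-contained, and the table names are invented.

```python
import sqlite3

# Parallel-run validation: compare a legacy table against its migrated
# counterpart. The simplest check is a row-count comparison.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_orders (id INTEGER, amount REAL);
    CREATE TABLE gcp_orders    (id INTEGER, amount REAL);
    INSERT INTO legacy_orders VALUES (1, 10.0), (2, 20.0);
    INSERT INTO gcp_orders    VALUES (1, 10.0), (2, 20.0);
""")

src, = conn.execute("SELECT COUNT(*) FROM legacy_orders").fetchone()
tgt, = conn.execute("SELECT COUNT(*) FROM gcp_orders").fetchone()
assert src == tgt  # counts must match before cutover
```

Real migrations add checksum and aggregate comparisons on top of row counts, but the shape of the check is the same.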

 

About the Company: 

www.datametica.com

Datametica is among the world's leading Cloud and Big Data analytics companies.

Datametica was founded in 2013 and has grown at an accelerated pace within a short span of 8 years. We provide a broad and capable set of services that encompass a vision of success, driven by innovation and value addition, helping organizations make strategic decisions that influence business growth.

Datametica is the global leader in migrating legacy Data Warehouses to the Cloud. Datametica moves Data Warehouses to the Cloud faster, at a lower cost, and with fewer errors, even running in parallel with full data validation for months.

Datametica's specialized team of Data Scientists has implemented award-winning analytical models for use cases involving both unstructured and structured data.

Datametica has earned the highest level of partnership with Google, AWS, and Microsoft, which enables Datametica to deliver successful projects for clients across industry verticals at a global level, with teams deployed in the USA, EU, and APAC.

 

Recognition:

We are gratified to be recognized as a Top 10 Big Data Global Company by CIO story.

 

If it excites you, please apply.

Information Technology Services


Agency job
via Jobdost by Sathish Kumar
Pune
5 - 9 yrs
₹10L - ₹30L / yr
skill iconDocker
skill iconKubernetes
DevOps
skill iconAmazon Web Services (AWS)
Windows Azure
+8 more
Preferred Education & Experience: 
• Bachelor’s or master’s degree in Computer Engineering,
Computer Science, Computer Applications, Mathematics, Statistics or related technical field or
equivalent practical experience. Relevant experience of at least 3 years in lieu of above if from a
different stream of education.
• Well-versed in DevOps principles & practices and hands-on DevOps
tool-chain integration experience: Release Orchestration & Automation, Source Code & Build
Management, Code Quality & Security Management, Behavior Driven Development, Test Driven
Development, Continuous Integration, Continuous Delivery, Continuous Deployment, and
Operational Monitoring & Management; extra points if you can demonstrate your knowledge with
working examples.
• Hands-on experience with demonstrable working experience with DevOps tools
and platforms viz., Slack, Jira, GIT, Jenkins, Code Quality & Security Plugins, Maven, Artifactory,
Terraform, Ansible/Chef/Puppet, Spinnaker, Tekton, StackStorm, Prometheus, Grafana, ELK,
PagerDuty, VictorOps, etc.
• Well-versed in Virtualization & Containerization; must demonstrate
experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox,
Vagrant, etc.
• Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate
experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in
any categories: Compute or Storage, Database, Networking & Content Delivery, Management &
Governance, Analytics, Security, Identity, & Compliance (or) equivalent demonstrable Cloud
Platform experience.
• Well-versed with demonstrable working experience with API Management,
API Gateway, Service Mesh, Identity & Access Management, Data Protection & Encryption, tools &
platforms.
• Hands-on programming experience in either core Java and/or Python and/or JavaScript
and/or Scala; freshers passing out of college or lateral movers into IT must be able to code in
languages they have studied.
• Well-versed with Storage, Networks and Storage Networking basics
which will enable you to work in a Cloud environment.
• Well-versed with Network, Data, and
Application Security basics which will enable you to work in a Cloud as well as Business
Applications / API services environment.
• Extra points if you are certified in AWS and/or Azure
and/or Google Cloud.
Product-based company specializing in architectural products


Agency job
via Jobdost by Sathish Kumar
Pune, Hyderabad, Gandhinagar
6 - 12 yrs
₹5L - ₹21L / yr
skill iconAmazon Web Services (AWS)
skill iconDocker
skill iconKubernetes
DevOps
Windows Azure
+9 more

Key Skills Required:

 

·         You will be part of the DevOps engineering team, configuring project environments, troubleshooting integration issues in different systems, and building new features for the next generation of cloud recovery services and managed services.

·         You will directly guide the technical strategy for our clients and build out a new capability within the company for DevOps to improve our business relevance for customers. 

·         You will be coordinating with the Cloud and Data teams on their requirements, verifying the configurations required for each production server and coming up with scalable solutions.

·         You will be responsible to review infrastructure and configuration of micro services and packaging and deployment of application

 

To be the right fit, you'll need:

 

·         Expert in Cloud Services like AWS.

·         Experience in Terraform Scripting.

·         Experience in container technology like Docker and orchestration like Kubernetes.

·         Good knowledge of frameworks such as Jenkins CI/CD pipelines, Bamboo etc.

·         Experience with various version control systems like Git, build tools (Maven, Ant, Gradle) and cloud automation tools (Chef, Puppet, Ansible)

Information Technology Services


Agency job
via Jobdost by Sathish Kumar
Pune
5 - 8 yrs
₹10L - ₹30L / yr
skill iconJava
skill iconPython
skill iconJavascript
skill iconScala
skill iconDocker
+5 more
 Sr. DevOps Software Engineer:
Preferred Education & Experience:
Bachelor’s or master’s degree in Computer Engineering,
Computer Science, Computer Applications, Mathematics, Statistics or related technical field or
equivalent practical experience. Relevant experience of at least 3 years in lieu of above if from a different stream of education.

• Well-versed in DevOps principles & practices and hands-on DevOps
tool-chain integration experience: Release Orchestration & Automation, Source Code & Build
Management, Code Quality & Security Management, Behavior Driven Development, Test Driven
Development, Continuous Integration, Continuous Delivery, Continuous Deployment, and
Operational Monitoring & Management; extra points if you can demonstrate your knowledge with
working examples.
• Hands-on experience with demonstrable working experience with DevOps tools
and platforms viz., Slack, Jira, GIT, Jenkins, Code Quality & Security Plugins, Maven, Artifactory,
Terraform, Ansible/Chef/Puppet, Spinnaker, Tekton, StackStorm, Prometheus, Grafana, ELK,
PagerDuty, VictorOps, etc.
• Well-versed in Virtualization & Containerization; must demonstrate
experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox,
Vagrant, etc.
• Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate
experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud in
any categories: Compute or Storage, Database, Networking & Content Delivery, Management &
Governance, Analytics, Security, Identity, & Compliance (or) equivalent demonstrable Cloud
Platform experience.
• Well-versed with demonstrable working experience with API Management,
API Gateway, Service Mesh, Identity & Access Management, Data Protection & Encryption, tools &
platforms.
• Hands-on programming experience in either core Java and/or Python and/or JavaScript
and/or Scala; freshers passing out of college or lateral movers into IT must be able to code in
languages they have studied.
• Well-versed with Storage, Networks and Storage Networking basics
which will enable you to work in a Cloud environment.
• Well-versed with Network, Data, and
Application Security basics which will enable you to work in a Cloud as well as Business
Applications / API services environment.
• Extra points if you are certified in AWS and/or Azure
and/or Google Cloud.
Required Experience: 5+ Years
Job Location: Remote/Pune
MNC Company - Product Based


Agency job
via Bharat Headhunters by Ranjini C. N
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 9 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
skill iconPython
Google Cloud Platform (GCP)
+2 more

Job Responsibilities

  • Design, build & test ETL processes using Python & SQL for the corporate data warehouse
  • Inform, influence, support, and execute our product decisions
  • Maintain advertising data integrity by working closely with R&D to organize and store data in a format that provides accurate data and allows the business to quickly identify issues.
  • Evaluate and prototype new technologies in the area of data processing
  • Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
  • High energy level, strong team player and good work ethic
  • Data analysis, understanding of business requirements and translation into logical pipelines & processes
  • Identification, analysis & resolution of production & development bugs
  • Support the release process including completing & reviewing documentation
  • Configure data mappings & transformations to orchestrate data integration & validation
  • Provide subject matter expertise
  • Document solutions, tools & processes
  • Create & support test plans with hands-on testing
  • Peer reviews of work developed by other data engineers within the team
  • Establish good working relationships & communication channels with relevant departments
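To make the ETL responsibilities above concrete, here is a minimal sketch of a Python & SQL pipeline. The schema, table names, and data are hypothetical, and an in-memory SQLite database stands in for the corporate warehouse (the listing itself targets BigQuery):

```python
import sqlite3

def run_etl(conn):
    """Extract raw ad events, transform (filter bad rows, aggregate by day),
    and load into a reporting table. Schema and names are illustrative only."""
    conn.execute("""
        CREATE TABLE daily_spend AS
        SELECT campaign_id,
               substr(event_ts, 1, 10) AS day,   -- keep YYYY-MM-DD prefix
               SUM(spend) AS total_spend
        FROM raw_events
        WHERE spend IS NOT NULL                  -- data-integrity filter
        GROUP BY campaign_id, day
    """)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (campaign_id TEXT, event_ts TEXT, spend REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [
        ("c1", "2022-01-01T10:00:00", 5.0),
        ("c1", "2022-01-01T11:00:00", 7.5),
        ("c2", "2022-01-02T09:00:00", None),  # NULL spend: filtered out by the ETL
    ],
)
run_etl(conn)
rows = conn.execute("SELECT campaign_id, day, total_spend FROM daily_spend").fetchall()
print(rows)  # [('c1', '2022-01-01', 12.5)]
```

In a production setup the same extract-transform-load shape would typically run as an orchestrated task (e.g. in Cloud Composer/Airflow) against the warehouse rather than SQLite.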

 

Skills and Qualifications we look for

  • University degree 2.1 or higher (or equivalent) in a relevant subject. A Master’s degree in any data subject is a strong advantage.
  • 4-6 years of experience in data engineering.
  • Strong coding ability and software development experience in Python.
  • Strong hands-on experience with SQL and data processing.
  • Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc)
  • Good working experience in at least one ETL tool (Airflow preferred).
  • Strong analytical and problem-solving skills.
  • Good to have: Apache PySpark, CircleCI, Terraform
  • Motivated, self-directed, able to work with ambiguity, and interested in emerging technologies and agile, collaborative processes.
  • Understanding of and experience with agile/scrum delivery methodology

 

For a well-reputed MNC - WFH as of now

Agency job
via Volibits by Hima Bindu
Bengaluru (Bangalore), Pune, Mumbai
4 - 8 yrs
₹2L - ₹15L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Windows Azure
+1 more

Job Description:

 

Mandatory Skills:

Should have strong working experience with Cloud technologies like AWS and Azure.

Should have strong working experience with CI/CD tools like Jenkins and Rundeck.

Must have experience with configuration management tools like Ansible.

Must have working knowledge on tools like Terraform.

Must be good at Scripting Languages like shell scripting and python.

Should be an expert in DevOps practices, with demonstrated ability to apply that knowledge across diverse projects and teams.

 

Preferable skills:

Experience with tools like Docker, Kubernetes, Puppet, JIRA, GitLab and JFrog.

Experience in scripting languages like Groovy.

Experience with GCP

 

Summary & Responsibilities:

 Write build pipelines and IaC (ARM templates, Terraform, or CloudFormation).

 Develop ansible playbooks to install and configure various products.

 Implement Jenkins and Rundeck jobs (and pipelines).

 Must be a self-starter and able to work well in a fast-paced, dynamic environment.

 Work independently and resolve issues with minimal supervision.

 Strong desire to learn new technologies and techniques

 Strong communication (written/verbal) skills
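To illustrate the Python/shell scripting and automation work this role calls for, here is a minimal sketch that summarises pipeline stage results from a build log; the log format and stage names are invented for illustration:

```python
# Hypothetical plain-text build log, one "key=value" record per line.
LOG = """\
stage=checkout status=ok duration=3
stage=build status=ok duration=41
stage=test status=fail duration=12
stage=deploy status=skipped duration=0
"""

def summarise(log: str) -> dict:
    """Count stage outcomes and collect the names of failed stages."""
    counts = {"ok": 0, "fail": 0, "skipped": 0}
    failed = []
    for line in log.splitlines():
        # Split "stage=build status=ok duration=41" into a field dict.
        fields = dict(kv.split("=", 1) for kv in line.split())
        counts[fields["status"]] += 1
        if fields["status"] == "fail":
            failed.append(fields["stage"])
    return {"counts": counts, "failed_stages": failed}

result = summarise(LOG)
print(result)  # {'counts': {'ok': 2, 'fail': 1, 'skipped': 1}, 'failed_stages': ['test']}
```

The same pattern (parse tool output, aggregate, report) is the bread and butter of CI/CD glue scripts around Jenkins or Rundeck jobs.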

 

Qualification:

Bachelor's degree in Computer Science or equivalent.

4+ years of experience in DevOps and AWS.

2+ years of experience in Python, Shell scripting and Azure.

 

 

Market Intelligence Research Platform

Agency job
via Bohiyaanam by Greta R
Pune
2 - 5 yrs
₹4L - ₹15L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Windows Azure
+1 more

We are seeking a passionate DevOps Engineer to help create the next big thing in data analysis and search solutions.

You will join our Cloud infrastructure team supporting our developers. As a DevOps Engineer, you’ll automate our environment setup and develop infrastructure as code to create a scalable, observable, fault-tolerant and secure environment. You’ll incorporate open-source tools, automation, and Cloud Native solutions, and will empower our developers with this knowledge.

We will pair you up with world-class talent in cloud and software engineering and provide a position and environment for continuous learning.

Numerator

at Numerator

4 recruiters
Ketaki Kambale
Posted by Ketaki Kambale
Remote, Pune
5 - 10 yrs
₹10L - ₹30L / yr
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Windows Azure
+1 more
This role requires a balance between hands-on infrastructure-as-code deployments as well as involvement in operational architecture and technology advocacy initiatives across the Numerator portfolio.
 
Responsibilities
  • Selects, develops, and evaluates local personnel to ensure the efficient operation of the function

  • Leads and mentors local DevOps team members throughout the organisation.

  • Stays current with industry standards.

  • Work across engineering teams to help define scope and task assignments

  • Participate in code reviews, design work and troubleshooting across business functions, multiple teams and product groups to help communicate, document and address infrastructure issues.

  • Look for innovative ways to improve observability and monitoring of large-scale systems across a variety of technologies throughout the Numerator organization.

  • Participate in the creation of training material, helping teams embrace a culture of DevOps with self-healing and self-service ecosystems. This includes discovery, testing and integration of third party solutions in product roadmaps.

  • Lead by example and evangelize DevOps best practices within the team and within the organization and product teams in Numerator.

 

Technical Skills

  • 2+ years of experience in cloud-based systems, in an SRE or DevOps position

  • Professional and positive approach, self-motivated, strong in building relationships, team player, dynamic, creative with the ability to work on own initiatives.

  • Excellent oral and written communication skills.

  • Availability to participate in after-hours on-call support with your fellow engineers and help improve a team’s on-call process where necessary.

  • Strong analytical and problem solving mindset combined with experience of troubleshooting large-scale systems.

  • Working knowledge of networking, operating systems and packaging/build systems, e.g. Amazon Linux, Ubuntu, pip and npm, Terraform, Ansible, etc.

  • Strong working knowledge of Serverless and Kubernetes based environments in AWS, Azure and Google Cloud Platform (GCP).

  • Experience in managing highly redundant data stores, file systems and services both in the cloud and on-premise including both data transfer, redundancy and cost-management.

  • Ability to quickly stand up AWS or other cloud-based platform services in isolation or within product environments to test out a variety of solutions or concepts before developing production-ready solutions with the product teams.

  • Bachelor’s, Master’s, or doctorate in Computer Science or a related field, or equivalent work experience.

Horizontal Integration
Remote, Bengaluru (Bangalore), Hyderabad, Vadodara, Pune, Jaipur, Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
6 - 15 yrs
₹10L - ₹25L / yr
Amazon Web Services (AWS)
Windows Azure
Microsoft Windows Azure
Google Cloud Platform (GCP)
Docker
+2 more

Position Summary

DevOps is a Department of Horizontal Digital, within which we have 3 different practices.

  1. Cloud Engineering
  2. Build and Release
  3. Managed Services

This opportunity is for the Cloud Engineering practice and requires some experience with infrastructure migrations. It is a completely hands-on job focused on migrating client workloads to the cloud, reporting to the Solution Architect/Team Lead; you are also expected to work on projects building out Sitecore infrastructure from scratch.

We are a Sitecore Platinum Partner, and the majority of the infrastructure work we do is for Sitecore.

Sitecore is a .NET-based, enterprise-level web CMS that can be deployed on-prem or on IaaS, PaaS, and containers.

So, most of our DevOps work is currently planning, architecting and deploying infrastructure for Sitecore.
 

Key Responsibilities:

  • This role includes ownership of technical, commercial and service elements related to cloud migration and infrastructure deployments.
  • The person selected for this position will ensure high customer satisfaction while delivering infrastructure and migration projects.
  • The candidate should expect to work across multiple projects in parallel and must have a fully flexible approach to working hours.
  • The candidate should keep up to date with the rapid technological advancements and developments taking place in the industry.
  • The candidate should also have working knowledge of Infrastructure as Code, Kubernetes, AKS/EKS, Terraform, Azure DevOps, and CI/CD pipelines.

Requirements:

  • Bachelor’s degree in computer science or equivalent qualification.
  • Total work experience of 6 to 8 Years.
  • Total migration experience of 4 to 6 Years.
  • Multiple Cloud Background (Azure/AWS/GCP)
  • Implementation knowledge of VMs, VNets
  • Know-how of Cloud Readiness and Assessment
  • Good Understanding of 6 R's of Migration.
  • Detailed understanding of the cloud offerings
  • Ability to Assess and perform discovery independently for any cloud migration.
  • Working Exp. on Containers and Kubernetes.
  • Good Knowledge of Azure Site Recovery/Azure Migrate/Cloud Endure
  • Understanding on vSphere and Hyper-V Virtualization.
  • Working experience with Active Directory.
  • Working experience with AWS Cloud formation/Terraform templates.
  • Working experience with VPN / ExpressRoute / peering / Network Security Groups / Route Tables / NAT Gateway, etc.
  • Experience working with CI/CD tools like Octopus, TeamCity, CodeBuild, CodeDeploy, Azure DevOps, and GitHub Actions.
  • High Availability and Disaster Recovery implementations, taking RTO and RPO aspects into consideration.
  • Candidates with AWS/Azure/GCP Certifications will be preferred.
Searce Inc

at Searce Inc

64 recruiters
Yashodatta Deshapnde
Posted by Yashodatta Deshapnde
Pune, Noida, Bengaluru (Bangalore), Mumbai, Chennai
3 - 10 yrs
₹5L - ₹20L / yr
DevOps
Kubernetes
Google Cloud Platform (GCP)
Terraform
Jenkins
+2 more
Role & Responsibilities :
• At least 4 years of hands-on experience with cloud infrastructure on GCP
• Hands-on experience with Kubernetes is a mandate
• Exposure to configuration management and orchestration tools at scale (e.g. Terraform, Ansible, Packer)
• Knowledge of and hands-on experience with DevOps tools (e.g. Jenkins, Groovy, and Gradle)
• Knowledge of and hands-on experience with various platforms (e.g. GitLab, CircleCI and Spinnaker)
• Familiarity with monitoring and alerting tools (e.g. CloudWatch, ELK stack, Prometheus)
• Proven ability to work independently or as an integral member of a team

Preferable Skills:
• Familiarity with standard IT security practices such as encryption,
credentials and key management.
• Proven experience in various coding languages (Java, Python) to support DevOps operations and cloud transformation
• Familiarity and knowledge of web standards (e.g. REST APIs, web security mechanisms)
• Hands-on experience with GCP
• Experience in performance tuning, service outage management and troubleshooting.

Attributes:
• Good verbal and written communication skills
• Exceptional leadership, time management, and organizational skills; ability to operate independently and make decisions with little direct supervision
Yojito Software Private Limited
Tushar Khairnar
Posted by Tushar Khairnar
Pune
1 - 4 yrs
₹4L - ₹8L / yr
DevOps
Docker
Kubernetes
Python
SQL
+4 more

We are looking for people with programming skills in Python, SQL, and Cloud Computing. Candidates should have experience with at least one of the major cloud-computing platforms - AWS/Azure/GCP - and professional experience handling applications and databases in the cloud using VMs and Docker images. They should have the ability to design and develop applications for the cloud.

 

You will be responsible for

  • Leading the DevOps strategy and development of SAAS Product Deployments
  • Leading and mentoring other computer programmers.
  • Evaluating student work and providing guidance in the online courses in programming and cloud computing.

 

Desired experience/skills

Qualifications: Graduate degree in Computer Science or related field, or equivalent experience.

 

Skills:

  • Strong programming skills in Python, SQL,
  • Cloud Computing

 

Experience:

2+ years of programming experience including Python, SQL, and Cloud Computing. Familiarity with command line working environment.

 

Note: A strong programming background, in any language and cloud computing platform, is required. We are flexible about the degree of familiarity needed for the specific environments (Python, SQL). If you have extensive experience in one of the cloud computing platforms and less in others, you should still consider applying.

 

Soft Skills:

  • Good interpersonal, written, and verbal communication skills; including the ability to explain the concepts to others.
  • A strong understanding of algorithms and data structures, and their performance characteristics.
  • Awareness of and sensitivity to the educational goals of a multicultural population would also be desirable.
  • Detail oriented and well organized.   
Zeni

at Zeni

2 recruiters
Parnita Pathak
Posted by Parnita Pathak
Pune
2 - 10 yrs
₹10L - ₹60L / yr
Python
RESTful APIs
Java
Google Cloud Platform (GCP)
Web Development
+1 more

We are looking for people that take quality as a point of pride. You will be a key member of the engineering staff working on our innovative FinTech product that simplifies the domain of finance management.

 

At Zeni.ai, we provide an AI-powered finance team with a real-time dashboard to manage all the finance functions for startups on one platform - bookkeeping, yearly taxes, bill pay & invoicing, financial projections & budgeting, employee reimbursements and more. We are headquartered in Palo Alto, California, with an engineering lab in Pune. Zeni was founded by Snehal Shinde and Swapnil Shinde (twins), serial entrepreneurs for whom Zeni is their third startup. Before Zeni, they built Mezi.com, which they sold to American Express for $120 million in merely two years. Zeni is very well funded; details can be disclosed when we talk.

 

The details about this position are as below:

Responsibilities:

  • You must be, or aspire to be, a jack of all trades
  • Design and build fault-tolerant, high-performance, scalable systems
  • Design and maintain the core software components that support the Zeni platform
  • Improve the scalability, resilience, observability, and efficiency of our core systems
  • Code primarily in Python.
  • Work closely with, and incorporate feedback from, product management, platform architects and senior engineers.
  • Fail fast, fix fast. Rapidly fix bugs and solve problems
  • Proactively look for ways to make the Zeni platform better
  • Speed, Speed, Speed - must be a performance freak!
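One small building block behind "fault-tolerant" systems and third-party API integrations of the kind listed here is retry with exponential backoff. A minimal sketch follows; the `flaky_api` function and its failure behaviour are hypothetical:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01, retry_on=(ConnectionError,)):
    """Call fn(), retrying transient failures with exponential backoff.
    On the final attempt, re-raise instead of swallowing the error."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

# Simulated flaky third-party dependency: fails twice, then succeeds.
calls = {"n": 0}
def flaky_api():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return {"status": "ok"}

print(with_retries(flaky_api))  # {'status': 'ok'} after two retried failures
```

Real services would layer timeouts, jitter, and circuit breakers on top of this, but the retry loop is the core of the pattern.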

Requirements:

  • B.E. / B.Tech in Computer Science.
  • 2 to 5 years of commercial software development experience
  • You have built some impressive, non-trivial web applications by hand
  • Excellent programming skills in Python (Object Oriented is a BIG plus)
  • Google App engine experience a huge plus
  • Disciplined approach to testing and quality assurance
  • Good understanding of web technologies (HTTP, Apache) and familiarity with Unix/Linux
  • Good understanding of data structures, algorithms and design patterns
  • Great written communication and documentation abilities
  • Comfortable in a small, intense and high-growth start-up environment
  • You know and can admit when something is not great.
  • You can recognise that something you've done needs improvement
  • Past participation in hackathons is a big plus
  • Startup experience or product company experience is a MUST.
  • Experience integrating with 3rd party APIs
  • Experience with Agile product development methodology
  • Good at maintaining servers and troubleshooting
  • Understanding of database query processing and indexing are preferred
  • Experience with OAuth
  • Experience with Google Cloud and/or Google App Engine platforms
  • Experience writing unit tests
  • Experience with distributed version control systems (eg: Git)
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
Apurva kalsotra
Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus 
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Sibros

at Sibros

1 recruiter
Alisha Rodrigues
Posted by Alisha Rodrigues
Pune
6 - 15 yrs
₹10L - ₹50L / yr
Terraform
CI/CD
DevOps
IaC
Infrastructure as Code
+6 more

About the Role

  • Own the end-to-end infrastructure of Sibros Cloud
  • Define and introduce security best practices, identify gaps in infrastructure and come up with solutions
  • Design and implement tools and software to manage Sibros’ infrastructure
  • Stay hands-on, write and review code and documentation, debug and root cause issues in production environment

Minimum Qualifications

  • Experience in Infrastructure as Code (IaC) to manage multi-cloud environments using cloud agnostic tools like Terraform or Ansible
  • Passionate about security and have good understanding of industry best practices
  • Experience in programming languages like Python and Golang, and enjoyment of automating everything using code
  • Good skills and intuition on root cause issues in production environment

Preferred Qualifications

  • Experience in database and network management
  • Experience in defining security policies and best practices
  • Experience in managing a large-scale multi-cloud environment
  • Knowledge of SOC, GDPR or ISO 27001 security compliance standards is a plus

Equal Employment Opportunity

Sibros is committed to a policy of equal employment opportunity. We recruit, employ, train, compensate, and promote without regard to race, color, age, sex, ancestry, marital status, religion, national origin, disability, sexual orientation, veteran status, present or past history of mental disability, genetic information or any other classification protected by state or federal law.

Aureus Tech Systems

at Aureus Tech Systems

3 recruiters
Krishna Kanth
Posted by Krishna Kanth
Hyderabad, Bengaluru (Bangalore), Chennai, Visakhapatnam, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
6 - 14 yrs
₹18L - ₹25L / yr
.NET
C#
ASP.NET
Web API
LINQ
+3 more

Title : .Net Developer with Cloud 

Locations: Hyderabad, Chennai, Bangalore, Pune and New Delhi (Remote).

Job Type: Full Time


.Net Job Description:

Required experience on below skills:

Azure experienced (Mandatory)
.Net programming (Mandatory)
DevSecOps capabilities (Desired)
Scripting skills (Desired)
Docker (Desired)
Data lake management (Desired)
· Minimum of 5+ years of application development experience

· Experience with MS Azure: App Service, Functions, Cosmos DB and Active Directory

· Deep understanding of C#, .NET Core, ASP.NET Web API 2, MVC

· Experience with MS SQL Server

· Strong understanding of object-oriented programming

· Experience working in an Agile environment.

· Strong understanding of code versioning tools such as Git or Subversion

· Usage of automated build and/or unit testing and continuous integration systems

· Excellent communication, presentation, influencing, and reasoning skills.

· Capable of building relationships with colleagues and key individuals.

· Must have the capability to learn new technologies.

DataMetica

at DataMetica

1 video
7 recruiters
Nikita Aher
Posted by Nikita Aher
Pune, Hyderabad
3 - 12 yrs
₹5L - ₹25L / yr
Apache Kafka
Big Data
Hadoop
Apache Hive
Java
+1 more

Summary
Our Kafka developer has a combination of technical skills, communication skills and business knowledge. The developer should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.

 

Must Have Skills

  • Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
  • Leading in the identification, isolation, resolution and communication of problems within the production environment.
  • Leading developer applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift or Snowflake (optional)
  • Design and recommend the best approach for data movement from different sources to the Cloud EDW using Apache/Confluent Kafka
  • Performs independent functional and technical analysis for major projects supporting several corporate initiatives.
  • Communicate and work with IT partners and the user community at various levels, from senior management to individual developers to business SMEs, for project definition.
  • Works on multiple platforms and multiple projects concurrently.
  • Performs code and unit testing for complex scope modules, and projects
  • Provide expertise and hands on experience working on Kafka connect using schema registry in a very high volume environment (~900 Million messages)
  • Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStream and Kafka Control Center.
  • Provide expertise and hands on experience working on AvroConverters, JsonConverters, and StringConverters.
  • Provide expertise and hands on experience working on Kafka connectors such as MQ connectors, Elastic Search connectors, JDBC connectors, File stream connector,  JMS source connectors, Tasks, Workers, converters, Transforms.
  • Provide expertise and hands on experience on custom connectors using the Kafka core concepts and API.
  • Working knowledge on Kafka Rest proxy.
  • Ensure optimum performance, high availability and stability of solutions.
  • Create topics, setup redundancy cluster, deploy monitoring tools, alerts and has good knowledge of best practices.
  • Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms. Leverage Hadoop ecosystem knowledge to design and develop capabilities to deliver our solutions using Spark, Scala, Python, Hive, Kafka and other tools in the Hadoop ecosystem.
  • Use automation tools like provisioning using Jenkins, Udeploy or relevant technologies
  • Ability to perform data related benchmarking, performance analysis and tuning.
  • Strong skills in In-memory applications, Database Design, Data Integration.
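As a sketch of the connector work described above, this is the general shape of a JDBC source connector definition one would POST to the Kafka Connect REST API (`POST /connectors`) to stream a database table into a topic. The connector name, database URL, table, and schema-registry address are all hypothetical:

```python
import json

# Hypothetical JDBC source connector definition for Kafka Connect.
connector = {
    "name": "orders-jdbc-source",  # illustrative connector name
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db:5432/shop",
        "mode": "incrementing",                 # pick up new rows by a monotonic id
        "incrementing.column.name": "order_id",
        "table.whitelist": "orders",
        "topic.prefix": "pg-",                  # rows land on topic "pg-orders"
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
        "tasks.max": "1",
    },
}

payload = json.dumps(connector, indent=2)
print(payload)
```

In practice this payload would be submitted to a Connect worker (e.g. with `curl -X POST -H "Content-Type: application/json" --data @connector.json http://connect:8083/connectors`), and the choice of converters is where the AvroConverter/JsonConverter/StringConverter experience mentioned above comes in.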
Channel 3

at Channel 3

1 recruiter
HR Shubhangi
Posted by HR Shubhangi
Pune, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹12L / yr
Technical Architecture
Solution architecture
Information architecture
Architecture
PowerBI
+6 more
Solution architect/designer
Data warehousing architect/designer
Data migration architect/designer
Consulting and Product Engineering Company

Consulting and Product Engineering Company

Agency job
via Exploro Solutions by Sapna Prabhudesai
Hyderabad, Bengaluru (Bangalore), Pune, Chennai
8 - 12 yrs
₹7L - ₹30L / yr
DevOps
Terraform
Docker
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
+1 more

Job Description:

 

○ Develop best practices for the team; responsible for architecture, solutions and documentation operations in order to meet the engineering department's quality standards

○ Participate in production outages, handle complex issues and work towards resolution

○ Develop custom tools and integrations with existing tools to increase engineering productivity

 

 

Required Experience and Expertise

 

○ Good knowledge of Terraform; someone who has worked on large TF code bases.

○ Deep understanding of Terraform best practices and writing TF modules.

○ Hands-on experience with GCP and AWS, and knowledge of AWS services like VPC and VPC-related services (route tables, VPC endpoints, PrivateLink), EKS, S3 and IAM. A cost-aware mindset towards cloud services.

○ Deep understanding of Kernel, Networking and OS fundamentals

NOTICE PERIOD: Max 30 days

DataMetica

at DataMetica

1 video
7 recruiters
Nikita Aher
Posted by Nikita Aher
Pune
12 - 20 yrs
₹20L - ₹35L / yr
Data Warehouse (DWH)
ETL
Big Data
Business Intelligence (BI)
Project Management
+1 more

Job Description

Experience : 10+ Years

Location : Pune


Job Requirements:

  • Minimum of 10+ years of experience with a proven record of increased responsibility
  • Hands-on experience in design, development and managing Big Data, Cloud, Data warehousing and Business Intelligence projects
  • Experience of managing projects in Big Data, Cloud, Data warehousing and Business Intelligence using open source or top-of-the-line tools and technologies
  • Good knowledge of Dimensional Modeling
  • Experience of working with any ETL and BI Reporting tools
  • Experience of managing medium to large projects, preferably on Big Data
  • Proven experience in project planning, estimation, execution and implementation of medium to large projects
  • Should be able to effectively communicate in English
  • Strong management and leadership skills, with proven ability to develop and manage client relationships
  • Proven problem-solving skills from both technical and managerial perspectives
  • Attention to detail and a commitment to excellence and high standards
  • Excellent interpersonal and communication skills, both verbal and written
  • Position is remote with occasional travel to other offices, client sites, conventions, training locations, etc.
  • Bachelor’s degree in Computer Science, Business/Economics, or a related field, or demonstrated equivalent/practical knowledge or experience

Job Responsibilities:

  • Day-to-day project management, scrum and agile management, including project planning, delivery and execution of Big Data and BI projects
  • Primary point of contact for the customer for all project engagements, delivery and project escalations
  • Design the right architecture and technology stack depending on business requirements for Cloud / Big Data and BI related technologies, both on-premise and on cloud
  • Liaise with key stakeholders to define the Cloud / Big Data solutions roadmap and prioritize the deliverables
  • Responsible for end-to-end project delivery of Cloud / Big Data solutions from a project estimation, project planning, resourcing and monitoring perspective
  • Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
  • Support & assist the team in resolving issues during testing and when the system is in production
  • Involved in the full customer lifecycle with a goal to make customers successful and increase revenue and retention
  • Interface with the offshore engineering team to solve customer issues
  • Develop programs that meet customer needs with respect to functionality, performance, scalability, reliability, schedule, principles and recognized industry standards
  • Requirement analysis and documentation
  • Manage day-to-day operational aspects of a project and scope
  • Prepare for engagement reviews and quality assurance procedures
  • Visit and/or host clients to strengthen business relationships
Fast paced Startup

Fast paced Startup

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
3 - 6 yrs
₹15L - ₹22L / yr
Big Data
Data engineering
Hadoop
Spark
Apache Hive
+6 more

Years of Exp: 3-6+ Years 
Skills: Scala, Python, Hive, Airflow, Spark

Languages: Java, Python, Shell Scripting

GCP: Bigtable, Dataproc, BigQuery, GCS, Pub/Sub

OR
AWS: Athena, Glue, EMR, S3, Redshift

MongoDB, MySQL, Kafka

Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time 

Searce Inc

at Searce Inc

64 recruiters
Mishita Juneja
Posted by Mishita Juneja
Pune
3 - 6 yrs
₹8L - ₹14L / yr
Project Management
Project manager
Project coordination
Team building
Team Management
+8 more

Project Manager

Who we are?

Searce is a niche cloud consulting business with a futuristic tech DNA. We do new-age tech to realise the “Next” in the “Now” for our clients. We specialise in Cloud Data Engineering, AI/Machine Learning and advanced cloud infra tech such as Anthos and Kubernetes. We are one of the top and fastest-growing partners for Google Cloud and AWS globally, with over 2,500 clients successfully moved to the cloud.

What we believe?

  • Best practices are overrated
      • Implementing best practices can only make one n .
  • Honesty and Transparency
      • We believe in naked truth. We do what we tell and tell what we do.
  • Client Partnership
    • Client - Vendor relationship: No. We partner with clients instead. 
    • And our sales team comprises 100% of our clients.

How we work?

It’s all about being Happier first. And rest follows. Searce work culture is defined by HAPPIER.

  • Humble: Happy people don’t carry ego around. We listen to understand; not to respond.
  • Adaptable: We are comfortable with uncertainty. And we accept changes well. As that’s what life's about.
  • Positive: We are super positive about work & life in general. We love to forget and forgive. We don’t hold grudges. We don’t have time or adequate space for it.
  • Passionate: We are as passionate about the great street-food vendor across the street as about Tesla’s new model and so on. Passion is what drives us to work and makes us deliver the quality we deliver.
  • Innovative: Innovate or Die. We love to challenge the status quo.
  • Experimental: We encourage curiosity & making mistakes.
  • Responsible: Driven. Self motivated. Self governing teams. We own it.

Are you the one? Quick self-discovery test:

  1. Love for cloud: When was the last time your dinner entailed an act on “How would ‘Jerry Seinfeld’ pitch Cloud platform & products to this prospect” and your friend did the ‘Sheldon’ version of the same thing.
  2. Passion for sales: When was the last time you stopped at a remote gas station while on vacation, and ended up helping the owner SaaSify his 7 gas stations across other geographies?
  3. Compassion for customers: You listen more than you speak.  When you do speak, people feel the need to listen.
  4. Humor for life: When was the last time you told a concerned CEO, ‘If Elon Musk can attempt to take humanity to Mars, why can’t we take your business to run on cloud?’

Do you cloud?

  1. Expertise driving infrastructure engineering projects and transformation initiatives, from deep-tech startups to enterprises loaded with 42,000U (1,000 42U racks).
  2. Have you tandem-jumped at least 50 times taking along a trusted business having on-prem workloads to safe-land onto a public cloud with a soft touchdown?
    1. Do you understand the innards of Apache web servers on Linux and Sharepoint Server Farms alike?
    2. Do you speak CloudFormation? Terraform? Do you speak JSON? 
    3. Do you love automating everything possible leveraging Python / Powershell / Bash?
  3. Are you a voracious reader fascinated by the latest & greatest innovations on public cloud?
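If "automating everything" is your reflex, here is a minimal, hypothetical Python sketch of the habit: a helper that renders a Terraform-style JSON block for a compute instance. The resource shape loosely mirrors Terraform's `google_compute_instance`, but the field set here is illustrative, not a complete provider schema.

```python
import json

def make_instance_config(name, machine_type="e2-small", zone="us-central1-a"):
    """Build a minimal GCE-style instance block as a plain dict.

    Field names are illustrative; a real config would carry disks,
    networks, and service-account settings as well.
    """
    return {
        "resource": {
            "google_compute_instance": {
                name: {
                    "name": name,
                    "machine_type": machine_type,
                    "zone": zone,
                }
            }
        }
    }

# Render Terraform-compatible JSON for a fleet of web instances.
configs = [make_instance_config(f"web-{i}") for i in range(2)]
print(json.dumps(configs[0], indent=2))
```

Generating config as data like this is what makes fleets reproducible: the same function stamps out one instance or a hundred.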

Introduction

We welcome *really unconventional* creative thinkers who can work in an agile, flexible environment. We are a flat organization with unlimited growth opportunities, and small team sizes – wherein flexibility is a must, mistakes are encouraged, creativity is rewarded, and excitement is required.

  1. This is an entrepreneurial project management position that challenges the client status quo by helping them re-imagine the art of what is possible, using a consulting mindset and business strategy skills.
  2. This position requires a fanatic iterative-improvement ability: the ability to coordinate with Sales, Presales, Delivery and MS teams and get project delivery done.
  3. This position is for a hard-core-geek-turned-engineer-turned-tech-enthusiast-turned-Project-Manager.

What we are NOT looking for: Buzzword Bozos (BB) or Certification Chasers (CC).

What we seek is an AA (Awesome Attitude to Learn, Improve & Coach).

Not just a BB or CC? Quick self-discovery test:

When was the last time you thought about how a GAN (Generative Adversarial Network) can help auto-create new building designs for an architect or new jewellery designs for a designer, OR passionately convinced a friend that Google Video Intelligence API or AWS Rekognition can now auto-video-screen an applicant with a 90+% confidence level, automating 75+% of a recruiter's effort, OR leveraged Google Vision API to do e-KYC? If this is what you constantly get blamed for, and your friends have made a strip for you on xkcd, you are probably the right fit for what we are looking for.

Your bucket of undertakings:

This position will be responsible for consulting with clients and proposing architectural solutions to help move and improve infrastructure from on-premise to cloud, or to help optimize cloud spend when moving from one public cloud to another.

  1. Be the first one to experiment with new-age cloud offerings; help define best practices as a thought leader for cloud, automation & DevOps; be a solution visionary and technology expert across multiple channels.
  2. Continually augment skills and learn new tech as the technology and client needs evolve.
  3. Use your experience in Google Cloud Platform, AWS or Microsoft Azure to build hybrid-cloud solutions for customers.
  4. Provide leadership to project teams, and facilitate the definition of project deliverables around core Cloud-based technology and methods.
  5. Define tracking mechanisms and ensure IT standards and methodology are met; deliver quality results.
  6. Participate in technical reviews of requirements, designs, code and other artifacts.
  7. Identify and keep abreast of new technical concepts in Google Cloud Platform.

Education, Experience, etc.

  1. Is Education overrated? Yes. We believe so. However, there is no way to locate you otherwise. So unfortunately we might have to look for a Bachelor's or Master's degree in engineering from a reputed institute, or you should have been programming since the age of 12. And the latter is better. We will find you faster if you specify the latter in some manner. Not just degrees; we are not too thrilled by tech certifications either ... :)
    • To reiterate: Passion for tech-awesomeness, an insatiable desire to learn the latest new-age cloud tech, a highly analytical aptitude and a strong ‘desire to deliver’ outlive those fancy degrees!
  2. 3 - 5 years of experience, with at least 1 - 2 years of hands-on experience in delivering Cloud Consulting (AWS/GCP/Azure) projects in a global enterprise environment.
  3. Good analytical, communication, problem solving, and learning skills.
  4. Knowledge of programming against cloud platforms such as Google Cloud Platform, and of lean development methodologies.

A quick self-discovery test below:

  1. How you treat yourself & others?
    1. You listen more than you speak.  When you do speak, people feel the need to listen.
    2. You have ‘one’ life - no work life or personal life. You are the same at both places.
    3. You are generally happy and passionate about life. When shit does happen you know how to tell your heart ‘All is well’.
    4. You are compassionate to yourself, you love your work, your company, your country, and are generally a person people like to be around.
  2. How you work & live?
    1. You make difficult and complex decisions in an environment filled with uncertainty and a lack of well-defined constraints.
    2. You are able to admit to your team that you were shit scared while making those decisions.
    3. You are able to juggle conflicting priorities and remain composed as the client keeps on changing requirements. :)
    4. You are genuinely passionate about developing great software, learning a lot, helping others learn and  having loads of fun while doing so.
  3. What you love?
    1. You love things. You are passionate. You care for your self, family, country and Big Bang Theory (and this is a must!).
    2. You love to organize, index, and improve things around you - Yes you are Sheldon’ish’ at times and ‘Leonard’ish’ the other times.
    3. You are passionate about improving processes and you truly feel satisfied by making things better.
    4. You love Google. And AWS. And Terraform. And CI - CD pipelines. And linux.

 

Searce Inc

at Searce Inc

64 recruiters
Mishita Juneja
Posted by Mishita Juneja
Pune
3 - 6 yrs
₹8L - ₹14L / yr
DevOps
Kubernetes
Docker
Terraform
Cloud Computing
+11 more

Senior DevOps Engineer



Who are we?

Searce is a niche Cloud Consulting business with futuristic tech DNA. We do new-age tech to realise the “Next” in the “Now” for our Clients. We specialise in Cloud Data Engineering, AI/Machine Learning and Advanced Cloud infra tech such as Anthos and Kubernetes. We are one of the top and fastest-growing partners for Google Cloud and AWS globally, with over 2,500 clients successfully moved to the cloud.

What do we believe?

  • Best practices are overrated
      • Implementing best practices can only make one ‘average’.
  • Honesty and Transparency
      • We believe in naked truth. We do what we tell and tell what we do.
  • Client Partnership
    • Client - Vendor relationship: No. We partner with clients instead. 
    • And our sales team comprises 100% of our clients.

How do we work?

It’s all about being Happier first. And rest follows. Searce work culture is defined by HAPPIER.

  • Humble: Happy people don’t carry ego around. We listen to understand; not to respond.
  • Adaptable: We are comfortable with uncertainty. And we accept changes well. As that’s what life's about.
  • Positive: We are super positive about work & life in general. We love to forget and forgive. We don’t hold grudges. We don’t have time or adequate space for it.
  • Passionate: We are as passionate about the great street-food vendor across the street as about Tesla’s new model and so on. Passion is what drives us to work and makes us deliver the quality we deliver.
  • Innovative: Innovate or Die. We love to challenge the status quo.
  • Experimental: We encourage curiosity & making mistakes.
  • Responsible: Driven. Self motivated. Self governing teams. We own it.

Are you the one? Quick self-discovery test:

  1. Love for cloud: When was the last time your dinner entailed an act on “How would ‘Jerry Seinfeld’ pitch Cloud platform & products to this prospect” and your friend did the ‘Sheldon’ version of the same thing?
  2. Passion for sales: When was the last time you went to a remote gas station while on vacation, and ended up helping the gas station owner SaaS-ify his 7 gas stations across other geographies?
  3. Compassion for customers: You listen more than you speak. When you do speak, people feel the need to listen.
  4. Humor for life: When was the last time you told a concerned CEO, ‘If Elon Musk can attempt to take humanity to Mars, why can’t we take your business to run on cloud?’

Introduction

When was the last time you thought about rebuilding your smart phone charger using solar panels on your backpack OR changed the sequencing of switches in your bedroom (on your own, of course) to make it more meaningful OR pointed out an engineering flaw in the sequencing of traffic signal lights to a fellow passenger, while he gave you a blank look? If the last time this happened was more than 6 months ago, you are a dinosaur for our needs. If it was less than 6 months ago, did you act on it? If yes, then let’s talk.

We are quite keen to meet you if:

  • You eat, dream, sleep and play with Cloud Data Store & engineering your processes on cloud architecture
  • You have an insatiable thirst for exploring improvements, optimizing processes, and motivating people.
  • You like experimenting, taking risks and thinking big.

3 things this position is NOT about:

  1. This is NOT just a job; this is a passionate hobby for the right kind.
  2. This is NOT a boxed position. You will code, clean, test, build and recruit & energize.
  3. This is NOT a position for someone who likes to be told what needs to be done.

3 things this position IS about:

  1. Attention to detail matters.
  2. Roles, titles and egos do not matter; getting things done matters; getting things done quicker and better matters the most.
  3. Are you passionate about learning new domains & architecting solutions that could save a company millions of dollars?

Roles and Responsibilities

This is an entrepreneurial Cloud/DevOps Lead position that evolves to Director - Cloud Engineering. This position requires a fanatic iterative-improvement ability: architect a solution, code, research, understand customer needs, research more, rebuild and re-architect, you get the drift. We are seeking hard-core-geeks-turned-successful-techies who are interested in seeing their work used by millions of users the world over.


Responsibilities:

  • Consistently strive to acquire new skills on Cloud, DevOps, Big Data, AI and ML technologies
  • Design, deploy and maintain Cloud infrastructure for Clients – Domestic & International
  • Develop tools and automation to make platform operations more efficient, reliable and reproducible
  • Manage container orchestration (Kubernetes, Docker), strive for fully automated solutions, and ensure the up-time and security of all cloud platform systems and infrastructure
  • Stay up to date on relevant technologies, plug into user groups, and ensure our clients are using the best techniques and tools
  • Provide business, application, and technology consulting in feasibility discussions with technology team members, customers and business partners
  • Take the initiative to lead, drive and solve challenging scenarios

Requirements:

  • 3+ years of experience in Cloud Infrastructure and Operations domains
  • Experience with Linux systems, RHEL/CentOS preferred
  • Specialize in one or two cloud deployment platforms: AWS, GCP, Azure
  • Hands on experience with AWS services (EC2, VPC, RDS, DynamoDB, Lambda)
  • Experience with one or more programming languages (Python, JavaScript, Ruby, Java, .Net)
  • Good understanding of Apache Web Server, Nginx, MySQL, MongoDB, Nagios
  • Knowledge on Configuration Management tools such as Ansible, Terraform, Puppet, Chef
  • Experience working with deployment and orchestration technologies (such as Docker, Kubernetes, Mesos)
  • Deep experience in customer facing roles with a proven track record of effective verbal and written communications
  • Dependable and good team player
  • Desire to learn and work with new technologies
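To make the orchestration requirement concrete, here is a small, hypothetical Python sketch that builds a minimal Kubernetes Deployment manifest as a dict. Only the essential fields are shown; a production manifest would add probes, resource limits, and labels per your conventions.

```python
import json

def deployment_manifest(app, image, replicas=2):
    """Return a minimal apps/v1 Deployment as a dict.

    The selector labels must match the pod template labels, so both
    are derived from the same dict here.
    """
    labels = {"app": app}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": app, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": app, "image": image}]},
            },
        },
    }

print(json.dumps(deployment_manifest("api", "nginx:1.25"), indent=2))
```

The output can be fed to `kubectl apply -f -` as-is, since Kubernetes accepts JSON manifests as well as YAML.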

Key Success Factors

  • Are you
    • Likely to forget to eat, drink or pee when you are coding?
    • Willing to learn, re-learn, research, break, fix, build, re-build and deliver awesome code to solve real business/consumer needs?
    • An open source enthusiast?
    • Absolutely technology agnostic, believing that business processes define and dictate which technology to use?
  • Ability to think on your feet, and follow-up with multiple stakeholders to get things done
  • Excellent interpersonal communication skills
  • Superior project management and organizational skills
  • Logical thought process; ability to grasp customer requirements rapidly and translate the same into technical as well as layperson terms
  • Ability to anticipate potential problems, determine and implement solutions
  • Energetic, disciplined, with a results-oriented approach
  • Strong ethics and transparency in dealings with clients, vendors, colleagues and partners
  • Attitude of ‘give me 5 sharp freshers and 6 months and I will rebuild the way people communicate over the internet.’
  • You are customer-centric, and feel strongly about building scalable, secure, quality software. You thrive and succeed in delivering high quality technology products in a growth environment where priorities shift fast. 
Healthcare MNC

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
7 - 13 yrs
₹20L - ₹50L / yr
Data Science
R Programming
Python
Amazon Web Services (AWS)
Windows Azure
+3 more
Position summary

As an experienced Data Scientist you’ll join a team of data scientists, analysts, and software engineers working to push the boundaries of data science in health care. We like to experiment, iterate, and innovate with technology, from developing new algorithms specific to health care’s challenges, to bringing the latest machine learning practices and applications developed in other industries into the health care world. We know that algorithms are only valuable when powered by the right data, so we focus on fully understanding the problems we need to solve, and truly understanding the data behind them before launching into solutions, ensuring that the solutions we do land on are impactful and powerful.

Essential functions

• Research, conceptualize, and implement analytical approaches and predictive modeling to evaluate scenarios, predict utilization and clinical outcomes, and recommend actions to impact results.
• Manage and execute the entire model development process, including scope definition, hypothesis formation, data cleaning and preparation, feature selection, model implementation in production, and validation and iteration, using multiple data sources.
• Provide guidance on the data and software infrastructure capabilities needed to deliver a scalable solution across partners, and support the implementation of the team’s algorithms and models.
• Contribute to development and publication in major journals and conferences, showcasing leadership in healthcare data science.
• Work closely and collaborate with Data Scientists, Machine Learning Engineers, IT teams and business stakeholders spread across various locations in the US and India to achieve business goals.
• Provide guidance to other Data Scientists and Machine Learning Engineers.
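The validation-and-iteration step above can be sketched with a tiny, pure-stdlib cross-validation scaffold. This is illustrative only; real pipelines would typically reach for scikit-learn's `KFold` instead.

```python
import random

def k_fold_indices(n, k=5, seed=42):
    """Split range(n) into k folds for cross-validation.

    Returns a list of (train_idx, val_idx) pairs covering all n samples,
    with each sample appearing in exactly one validation fold.
    """
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]
        # Training set = every index not in the current validation fold.
        train = [j for f in folds if f is not folds[i] for j in f]
        splits.append((train, val))
    return splits

splits = k_fold_indices(10, k=5)
print(len(splits))  # 5 folds
```

Fixing the seed keeps splits reproducible across iterations, which matters when comparing model variants on the same clinical dataset.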
Espressif Systems India Pvt Ltd
Anuja Pawar
Posted by Anuja Pawar
Pune
4 - 13 yrs
₹12L - ₹35L / yr
Java
Go Programming (Golang)
Python
Amazon Web Services (AWS)
Microsoft Windows Azure
+1 more

About Company

Espressif Systems (688018) is a public multinational, fabless semiconductor company established in 2008, with headquarters in Shanghai and offices in Greater China, India, and Europe. We have a passionate team of engineers and scientists from all over the world, focused on developing cutting-edge WiFi-and-Bluetooth, low-power IoT solutions. We have created the popular ESP8266 and ESP32 series of chips, modules, and development boards. By leveraging wireless computing, we provide green, versatile, and cost-effective chipsets. We have always been committed to offering IoT solutions that are secure, robust, and power-efficient. By open-sourcing our technology, we aim to enable developers to use Espressif’s technology globally and build smart connected devices. In July 2019, Espressif made its Initial Public Offering on the Sci-Tech Innovation Board (STAR) of the Shanghai Stock Exchange (SSE).

Espressif has a technology center in Pune. The focus is on embedded software engineering and IoT solutions for our growing customers.


About the Role

Espressif’s https://rainmaker.espressif.com/ is a paradigm-shifting IoT cloud platform that provides seamless connectivity between IoT devices and mobile apps, voice assistants, and other services. It is designed with scalability, security, reliability, and operational cost at the center. We are looking for senior cloud engineers who can significantly contribute to this platform by means of architecture, design, and implementation. It is highly desirable that the candidate has prior experience working on large-scale cloud product development and understands the responsibilities and challenges well. Strong hands-on experience in writing code in Go, Java, or Python is a must.

This is an individual contributor role.


Minimum Qualifications

  • BE/B.Tech in Computer Science with 5-10 years of experience.

  • Strong Computer Science Fundamentals.

  • Extensive programming experience in one of these programming languages (Java, Go, Python) is a must.

  • Good working experience of any of the Cloud Platforms - AWS, Azure, Google Cloud Platform.

  • Certification in any of these cloud platforms will be an added advantage.

  • Good experience in the development of RESTful APIs, handling the security and performance aspects.

  • Strong debugging and troubleshooting skills.

  • Experience working with an RDBMS (MySQL, Oracle) or a NoSQL database like DynamoDB.

  • Working knowledge of CI/CD tools (Maven/Gradle, Jenkins), and experience in a Linux (or Unix) based environment.
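To make the RESTful API requirement concrete, here is a toy, hypothetical Python sketch of resource routing with status codes and JSON bodies. The `devices` resource and its fields are invented for illustration; a real service would add authentication, validation, and pagination.

```python
import json

# In-memory store standing in for a database.
DEVICES = {}

def handle_request(method, path, body=None):
    """Route a request to the 'devices' resource.

    Returns (status_code, json_payload), mirroring typical REST
    semantics: 200 for reads, 201 for creation, 404 otherwise.
    """
    if method == "GET" and path == "/devices":
        return 200, json.dumps(list(DEVICES.values()))
    if method == "POST" and path == "/devices":
        device = json.loads(body)
        DEVICES[device["id"]] = device
        return 201, json.dumps(device)
    return 404, json.dumps({"error": "not found"})

status, payload = handle_request("POST", "/devices", '{"id": "esp32-1", "state": "online"}')
print(status, payload)
```

Keeping routing as a pure function makes the status-code behaviour trivially testable before any web framework is wired in.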


Desired Qualifications


  • Exposure to Serverless computing frameworks like AWS Lambda, Google Cloud Functions, Azure Functions

  • Some exposure to front-end development tools - HTML5, CSS, Javascript, React.js/Angular.js

  • Working knowledge of Docker, Jenkins.
  • Prior experience working in the IoT domain will be an added advantage.
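To illustrate the serverless exposure mentioned above, here is a minimal, hypothetical handler in the AWS Lambda style. The `(event, context)` signature mirrors Lambda's Python convention; the event fields and greeting logic are invented for this sketch, and deployment wiring (API Gateway, IAM) is out of scope.

```python
import json

def handler(event, context=None):
    """Lambda-style entry point returning an API Gateway-shaped response.

    Reads an optional 'name' query parameter and echoes a greeting.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

resp = handler({"queryStringParameters": {"name": "esp32"}})
print(resp["body"])
```

Because the handler is a plain function, it can be unit-tested locally with dict events before ever touching a cloud console.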

What to expect from our interview process


  • The first step is to email your resume or apply to the relevant open position, along with a sample of something you have worked on such as a public GitHub repo or side project, etc.

  • Next, after shortlisting your profile, a recruiter will get in touch with you via a mechanism that works for you, e.g. email or phone. This will be a short chat to learn more about your background and interests, to share more about the job and Espressif, and to answer any initial questions you have.

  • Successful candidates will then be invited to 2 to 3 rounds of technical interviews, based on feedback from the previous round.

  • Finally, successful candidates will have interviews with HR.

What you offer us

  • Ability to provide technical solutions and support that foster collaboration and innovation.
  • Ability to balance a variety of technical needs and priorities according to Espressif’s growing needs.

What we offer


  • An open-minded, collaborative culture of enthusiastic technologists.
  • Competitive salary
  • 100% company paid medical/dental/vision/life coverage
  • Frequent training by experienced colleagues and chances to take international trips, attend exhibitions, technical meetups, and seminars.
 