AWS (Amazon Web Services) Jobs in Bangalore (Bengaluru)

50+ AWS (Amazon Web Services) Jobs in Bangalore (Bengaluru) | AWS (Amazon Web Services) Job openings in Bangalore (Bengaluru)

Apply to 50+ AWS (Amazon Web Services) Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest AWS (Amazon Web Services) Job opportunities across top companies like Google, Amazon & Adobe.

IT Company

Agency job
via Jobdost by Saida Jabbar
Pune, Bengaluru (Bangalore), Hyderabad
7 - 12 yrs
₹25L - ₹30L / yr
NodeJS (Node.js)
Amazon Web Services (AWS)
Migration
Python
AWS services
+3 more

Job Title: Senior Node.js and Python Azure Developer (AWS to Azure Migration Expert)

 

Experience: 7-10 Yrs.

 

Primary Skills:

 

Node.js and Python

 

Hands-on experience with Azure, Serverless (Azure Functions)

 

AWS to Azure Cloud Migration (Preferred)

 

Scope of Work:

  • Hands-on experience in migrating Node.js and Python applications from AWS to an Azure environment.
  • Analyse the source architecture, source code, and AWS service dependencies to identify code remediation scenarios.
  • Perform the code remediation/refactoring and configuration changes required to deploy the application on Azure, including remediation of Azure service dependencies and other application dependencies in the source code.
  • 7+ years of experience in application development with Node.js and Python.
  • Experience in unit testing, application testing support, and troubleshooting on Azure.
  • Experience with application deployment scripts/pipelines, App Service, APIM, AKS/microservices/containerized apps, Kubernetes, and Helm charts.
  • Hands-on experience in developing apps for AWS and Azure (must have).
  • Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments.
  • Understanding of the Azure infrastructure services required for hosting applications on Azure PaaS or serverless.

Tech stack details (an illustrative SDK-conversion sketch follows below):

  • Confluent Kafka AWS S3 Sink Connector
  • Azure Blob Storage
  • AWS Lambda to Azure Functions (serverless) – Python or Node.js
  • Node.js REST API
  • S3 to Azure Blob Storage
  • AWS to Azure SDK conversion (must have)
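For context on the kind of AWS-to-Azure SDK conversion listed in the tech stack above, here is a minimal illustrative sketch (not taken from the role) of an AWS Lambda handler that reads an object from S3 with boto3, followed by a rough Azure Functions equivalent that reads the same data from Blob Storage. The bucket, container, and blob names, and the use of the AzureWebJobsStorage connection string, are assumptions.

# Illustrative AWS-to-Azure conversion sketch; all names below are placeholders.
import json
import os

# --- Before: AWS Lambda handler reading an object from S3 with boto3 ---
import boto3


def lambda_handler(event, context):
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="example-bucket", Key="reports/latest.json")
    payload = json.loads(obj["Body"].read())
    return {"statusCode": 200, "body": json.dumps(payload)}


# --- After: Azure Function (HTTP trigger) reading the same data from Blob Storage ---
import azure.functions as func
from azure.storage.blob import BlobServiceClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    service = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
    blob = service.get_blob_client(container="example-container", blob="reports/latest.json")
    payload = json.loads(blob.download_blob().readall())
    return func.HttpResponse(json.dumps(payload), mimetype="application/json")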

 

 

Educational qualification:

 

B.E/B.Tech/MCA

 


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore)
6 - 9 yrs
₹8L - ₹20L / yr
Java
Spring Boot
Microservices
Amazon Web Services (AWS)
Docker
+1 more

Role: Sr. Java Developer

Experience: 6+ Years

Location: Bangalore (Whitefield)

Work Mode: Hybrid (2-3 days WFO)

Shift Timing: Regular Morning Shift


About the Role:

We are looking for a seasoned Java Developer with 6+ years of experience to join our growing engineering team. The ideal candidate should have strong expertise in Java, Spring Boot, Microservices, and cloud-based deployment using AWS or DevOps tools. This is a hybrid role based out of our Whitefield, Bangalore location.


Key Responsibilities:

  • Participate in agile development processes and scrum ceremonies.
  • Translate business requirements into scalable and maintainable technical solutions.
  • Design and develop applications using Java, Spring Boot, and Microservices architecture.
  • Ensure robust and reliable code through full-scale unit testing and TDD/BDD practices.
  • Contribute to CI/CD pipeline setup and cloud deployments.
  • Work independently and as an individual contributor on complex features.
  • Troubleshoot production issues and optimize application performance.


Mandatory Skills:

  • Strong Core Java and Spring Boot expertise.
  • Proficiency in AWS or DevOps (Docker & Kubernetes).
  • Experience with relational and/or non-relational databases (SQL, NoSQL).
  • Sound understanding of Microservices architecture and RESTful APIs.
  • Containerization experience using Docker and orchestration via Kubernetes.
  • Familiarity with Linux-based development environments.
  • Exposure to modern SDLC tools – Maven, Git, Jenkins, etc.
  • Good understanding of CI/CD pipelines and cloud-based deployment.


Soft Skills:

  • Self-driven, proactive, and an individual contributor.
  • Strong problem-solving and analytical skills.
  • Excellent communication and interpersonal abilities.
  • Able to plan, prioritize, and manage tasks independently.


Nice-to-Have Skills:

  • Exposure to frontend technologies like Angular, JavaScript, HTML5, and CSS.


Read more
appscrip

at appscrip

2 recruiters
Nilam Surti
Posted by Nilam Surti
Bengaluru (Bangalore)
0 - 0 yrs
₹3L - ₹4L / yr
DevOps
Kubernetes
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
Terraform

Looking for Fresher developers


Responsibilities:

  • Implement integrations requested by customers
  • Deploy updates and fixes
  • Provide Level 2 technical support
  • Build tools to reduce occurrences of errors and improve customer experience
  • Develop software to integrate with internal back-end systems
  • Perform root cause analysis for production errors
  • Investigate and resolve technical issues
  • Develop scripts to automate visualization
  • Design procedures for system troubleshooting and maintenance


Requirements and skill:

Knowledge of DevOps engineering or a similar software engineering role

Good knowledge of Terraform and Kubernetes

Working knowledge of AWS and Google Cloud



You can directly contact me on nine three one six one two zero one three two

Read more
Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

  • 5+ years of IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake.
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise-scale Snowflake implementations on any of the hyperscalers.
  • Strong experience building productionized data ingestion and data pipelines in Snowflake.
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities.
  • Good experience with Snowflake RBAC and data security.
  • Strong experience with Snowflake features, including newly released ones.
  • Good experience in Python/PySpark.
  • Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF).
  • Experience/knowledge of orchestration and scheduling tools such as Airflow.
  • Good understanding of ETL or ELT processes and ETL tools.
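To make the Snowflake items above concrete, a minimal sketch using the snowflake-connector-python package is shown below; it exercises Zero-Copy Cloning and Time Travel. The account, credentials, warehouse, and table names are placeholders, not details from this posting.

import os
import snowflake.connector

# Connect using placeholder credentials pulled from the environment.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # Zero-Copy Cloning: create a writable copy of a table without duplicating storage.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_DEV CLONE ORDERS")

    # Time Travel: query the table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT (OFFSET => -3600)")
    print("Row count one hour ago:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()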

Read more
Zenius IT Services Pvt Ltd
Bengaluru (Bangalore), Chennai, Hyderabad
4 - 9 yrs
₹6L - ₹18L / yr
Apache Kafka
DynamoDB
Redis
Amazon Web Services (AWS)
Windows Azure
+7 more

About the Role

We are looking for a skilled Backend Engineer with strong experience in building scalable microservices, integrating with distributed data systems, and deploying web APIs that serve UI applications in the cloud. You’ll work on high-performance systems involving Kafka, DynamoDB, Redis, and other modern backend technologies.
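As a small illustration of the Redis-over-DynamoDB caching mentioned above (a sketch under assumed host, region, table, and key names, not code from this role), a cache-aside read might look like this:

import json
import boto3
import redis

# Placeholder connection details and table name.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
table = boto3.resource("dynamodb", region_name="ap-south-1").Table("user_profiles")


def get_user_profile(user_id: str, ttl_seconds: int = 300) -> dict | None:
    """Cache-aside read: try Redis first, fall back to DynamoDB, then populate the cache."""
    cached = cache.get(f"user:{user_id}")
    if cached is not None:
        return json.loads(cached)

    item = table.get_item(Key={"user_id": user_id}).get("Item")
    if item is not None:
        cache.setex(f"user:{user_id}", ttl_seconds, json.dumps(item, default=str))
    return item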


Responsibilities

  • Design, develop, and deploy backend microservices and APIs that power UI applications.
  • Implement event-driven architectures using Apache Kafka or similar messaging platforms.
  • Build scalable and highly available systems using NoSQL databases (e.g., DynamoDB, MongoDB).
  • Optimize backend systems using caching layers like Redis to enhance performance.
  • Ensure seamless deployment and operation of services in cloud environments (AWS, GCP, or Azure).
  • Write clean, maintainable, and well-tested code; contribute to code reviews and architecture discussions.
  • Collaborate closely with frontend, DevOps, and product teams to deliver integrated solutions.
  • Monitor and troubleshoot production issues and participate in on-call rotations as needed.


Required Qualifications

  • 3–7 years of professional experience in backend development.
  • Strong programming skills in one or more languages: Java, Python, Go, Node.js.
  • Hands-on experience with microservices architecture and API design (REST/gRPC).
  • Practical experience with Kafka, RabbitMQ, or other event streaming/message queue systems.
  • Solid knowledge of NoSQL databases, especially DynamoDB or equivalents.
  • Experience using Redis or Memcached for caching or pub/sub mechanisms.
  • Proficiency with cloud platforms (preferably AWS – e.g., Lambda, ECS, EKS, API Gateway).
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines.


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Hyderabad
5 - 10 yrs
₹10L - ₹18L / yr
Data Analytics
SQL
Databricks
Amazon Web Services (AWS)
Windows Azure
+4 more

Position : Senior Data Analyst

Experience Required : 5 to 8 Years

Location : Hyderabad or Bangalore (Work Mode: Hybrid – 3 Days WFO)

Shift Timing : 11:00 AM – 8:00 PM IST

Notice Period : Immediate Joiners Only


Job Summary :

We are seeking a highly analytical and experienced Senior Data Analyst to lead complex data-driven initiatives that influence key business decisions.

The ideal candidate will have a strong foundation in data analytics, cloud platforms, and BI tools, along with the ability to communicate findings effectively across cross-functional teams. This role also involves mentoring junior analysts and collaborating closely with business and tech teams.


Key Responsibilities :

  • Lead the design, execution, and delivery of advanced data analysis projects.
  • Collaborate with stakeholders to identify KPIs, define requirements, and develop actionable insights.
  • Create and maintain interactive dashboards, reports, and visualizations.
  • Perform root cause analysis and uncover meaningful patterns from large datasets.
  • Present analytical findings to senior leaders and non-technical audiences.
  • Maintain data integrity, quality, and governance in all reporting and analytics solutions.
  • Mentor junior analysts and support their professional development.
  • Coordinate with data engineering and IT teams to optimize data pipelines and infrastructure.

Must-Have Skills :

  • Strong proficiency in SQL and Databricks
  • Hands-on experience with cloud data platforms (AWS, Azure, or GCP)
  • Sound understanding of data warehousing concepts and BI best practices

Good-to-Have :

  • Experience with AWS
  • Exposure to machine learning and predictive analytics
  • Industry-specific analytics experience (preferred but not mandatory)
Read more
The Alter Office

at The Alter Office

2 candid answers
Harsha Ravindran
Posted by Harsha Ravindran
Bengaluru (Bangalore)
1 - 4 yrs
₹6L - ₹10L / yr
NodeJS (Node.js)
MongoDB
Mongoose
Express
Amazon Web Services (AWS)
+6 more

Job Title: Backend Developer

Location: In-Office, Bangalore, Karnataka, India


Job Summary:

We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.


Annual Compensation: 6-10 LPA


Responsibilities:

  • Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
  • Promoting a culture of collaboration, knowledge sharing, and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and contribute to architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.


Requirements:

  • Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
  • Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
  • Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
  • Practical experience with Redis and caching mechanisms to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.


Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹12L / yr (Varies)
DevOps
Amazon Web Services (AWS)
CI/CD
Docker
Kubernetes
+3 more

Role Overview

We are seeking a DevOps Engineer with 2 years of experience to join our innovative team. The ideal candidate will bridge the gap between development and operations, implementing and maintaining our cloud infrastructure while ensuring secure deployment pipelines and robust security practices for our client projects.


Responsibilities:

  • Design, implement, and maintain CI/CD pipelines.
  • Containerize applications using Docker and orchestrate deployments
  • Manage and optimize cloud infrastructure on AWS and Azure platforms
  • Monitor system performance and implement automation for operational tasks to ensure optimal performance, security, and scalability.
  • Troubleshoot and resolve infrastructure and deployment issues
  • Create and maintain documentation for processes and configurations
  • Collaborate with cross-functional teams to gather requirements, prioritise tasks, and contribute to project completion.
  • Stay informed about emerging technologies and best practices within the fields of DevOps and cloud computing.


Requirements:

  • 2+ years of hands-on experience with AWS cloud services
  • Strong proficiency in CI/CD pipeline configuration
  • Expertise in Docker containerisation and container management
  • Proficiency in shell scripting (Bash/PowerShell)
  • Working knowledge of monitoring and logging tools
  • Knowledge of network security and firewall configuration
  • Strong communication and collaboration skills, with the ability to work effectively within a team environment
  • Understanding of networking concepts and protocols in AWS and/or Azure
Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹12L / yr (Varies)
Python
Django
FastAPI
SQL
NoSQL Databases
+3 more

About Role

We are seeking a skilled Backend Engineer with 2+ years of experience to join our dynamic team, focusing on building scalable web applications using Python frameworks (Django/FastAPI) and cloud technologies. You'll be instrumental in developing and maintaining our cloud-native backend services.


Responsibilities:

  1. Design and develop scalable backend services using Django and FastAPI
  2. Create and maintain RESTful APIs
  3. Implement efficient database schemas and optimize queries
  4. Implement containerisation using Docker and container orchestration
  5. Design and implement cloud-native solutions using microservices architecture
  6. Participate in technical design discussions, code reviews and maintain coding standards
  7. Document technical specifications and APIs
  8. Collaborate with cross-functional teams to gather requirements, prioritise tasks, and contribute to project completion.

Requirements:

  1. Experience with Django and/or FastAPI (2+ years)
  2. Proficiency in SQL and ORM frameworks
  3. Docker containerisation and orchestration
  4. Proficiency in shell scripting (Bash/PowerShell)
  5. Understanding of microservices architecture
  6. Experience building serverless backends
  7. Knowledge of deployment and debugging on cloud platforms (AWS/Azure)
Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹10L - ₹24L / yr
Python
FastAPI
Flask
API management
RESTful APIs
+8 more

Job Title : Python Developer – API Integration & AWS Deployment

Experience : 5+ Years

Location : Bangalore

Work Mode : Onsite


Job Overview :

We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.

The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.


Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.
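As a rough sketch of the API layer this role describes, the snippet below exposes a FastAPI endpoint that hands a request to a simulation function. run_contingency_study is a hypothetical stand-in for the actual psspy-based automation, and the request fields are assumptions, not part of the posting.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="PSS(R)E Simulation API (illustrative)")


class StudyRequest(BaseModel):
    case_file: str          # path to a saved case, e.g. on EFS (placeholder field)
    contingency_set: str    # name of the contingency definition to run (placeholder field)


def run_contingency_study(case_file: str, contingency_set: str) -> dict:
    """Hypothetical wrapper around the psspy automation; returns a summary."""
    # In the real service this would load the case with psspy and run the study.
    return {"case_file": case_file, "contingency_set": contingency_set, "status": "queued"}


@app.post("/studies")
def create_study(request: StudyRequest) -> dict:
    """Accept a study request and hand it to the simulation layer."""
    return run_contingency_study(request.case_file, request.contingency_set)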


Key Responsibilities :

Python Development & API Integration :

  • Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
  • Automate simulations and workflows using the PSS®E Python API (psspy).
  • Implement robust bulk case processing, result extraction, and automated reporting systems.


AWS Cloud Deployment :

  • Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
  • Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
  • Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.


Required Skills :

  • 5+ Years of professional experience in Python development.
  • Hands-on experience with RESTful API development (FastAPI/Flask).
  • Solid experience working with PSS®E and its psspy Python API.
  • Strong understanding of AWS services, deployment, and best practices.
  • Proficiency in automation, scripting, and report generation.
  • Knowledge of cloud security and monitoring tools like IAM and CloudWatch.

Good to Have :

  • Experience in power system simulation and electrical engineering concepts.
  • Familiarity with CI/CD tools for AWS deployments.
Read more
I-Stem

at I-Stem

2 candid answers
Sahil Garg
Posted by Sahil Garg
Bengaluru (Bangalore)
2 - 4 yrs
₹20L - ₹25L / yr
Python
PyTorch
TensorFlow
Docker
Kubernetes
+2 more

You will:

  • Collaborate with the I-Stem Voice AI team and CEO to design, build and ship new agent capabilities
  • Develop, test and refine end-to-end voice agent models (ASR, NLU, dialog management, TTS)
  • Stress-test agents in noisy, real-world scenarios and iterate for improved robustness and low latency
  • Research and prototype cutting-edge techniques (e.g. robust speech recognition, adaptive language understanding)
  • Partner with backend and frontend engineers to seamlessly integrate AI components into live voice products
  • Monitor agent performance in production, analyze failure cases, and drive continuous improvement
  • Occasionally demo our Voice AI solutions at industry events and user forums


You are:

  • An AI/Software Engineer with hands-on experience in speech-centric ML (ASR, NLU or TTS)
  • Skilled in building and tuning transformer-based speech models and handling real-time audio pipelines
  • Obsessed with reliability: you design experiments to push agents to their limits and root-cause every error
  • A clear thinker who deconstructs complex voice interactions from first principles
  • Passionate about making voice technology inclusive and accessible for diverse users
  • Comfortable moving fast in a small team, yet dogged about code quality, testing and reproducibility


Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Hyderabad, Bengaluru (Bangalore), Chennai
4.5 - 6 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Microservices
Amazon Web Services (AWS)
J2EE

Job Summary:

We are seeking a skilled and experienced Java Developer with hands-on expertise in AWS, Spring Boot, and Microservices architecture. As a core member of our backend development team, you will design and build scalable cloud-native applications that support high-performance systems and business logic.

Key Responsibilities:

  • Design, develop, and maintain backend services using Java (Spring Boot).
  • Build and deploy microservices-based architectures hosted on AWS.
  • Collaborate with DevOps and architecture teams to ensure scalable and secure cloud solutions.
  • Write clean, efficient, and well-documented code.
  • Optimize application performance and troubleshoot production issues.
  • Participate in code reviews, technical discussions, and architecture planning.

Must-Have Skills:

  • 4.5+ years of experience in Java development.
  • Strong proficiency in Spring Boot and RESTful APIs.
  • Proven hands-on experience with AWS services (EC2, S3, Lambda, RDS, etc.).
  • Solid understanding of microservices architecture, CI/CD, and containerization tools.
  • Experience with version control (Git), and deployment tools.


Read more
Root Node India
Remote, Bengaluru (Bangalore)
3 - 4 yrs
₹12L - ₹14L / yr
Java
Spring Boot
RESTful APIs
Amazon Web Services (AWS)
MongoDB
+1 more

About Root Node

We’re an early-stage startup building intelligent tools for planning, scheduling, and optimization—starting with timetabling and warehouse logistics. Backed by deep domain expertise and a growing customer pipeline, we’re now building our core tech team. This is not just a coding job — it's a chance to build something meaningful from the ground up.


About the job

  • Design and implement robust backend systems and APIs using Java or similar backend language and Spring Boot or equivalent frameworks
  • Integrate backend services with existing custom ERP systems 
  • Work closely with the founder on product architecture, feature prioritization, and go-to-market feedback
  • Take full ownership of features — from system design and development to deployment and iterative improvements
  • Help shape our engineering culture and technical foundations


You're a Great Fit If You:

  • Have 3+ years of experience in backend development
  • Are strong in Java or similar languages (e.g., Kotlin, Go, Node.js)
  • Have solid experience with Spring Boot or equivalent backend frameworks
  • Have integrated with ERP or enterprise systems in production environments
  • Are comfortable with both SQL (PostgreSQL) and NoSQL (MongoDB)
  • Understand REST API development, authentication, Docker
  • Have an entrepreneurial mindset — you're excited about ownership, ambiguity, and making decisions that shape the product and company
  • Want more than just a job — you want to build, solve, and learn rapidly


What We Offer

  • Competitive salary
  • High degree of ownership and autonomy
  • Ability to shape the tech and product direction from Day 1
  • Transparent and fast decision-making culture
  • A builder’s environment — solve real-world problems with real impact
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Hyderabad
4 - 8 yrs
₹10L - ₹24L / yr
Python
Data engineering
Amazon Web Services (AWS)
RESTful APIs
Microservices
+9 more

Job Title : Python Data Engineer

Experience : 4+ Years

Location : Bangalore / Hyderabad (On-site)


Job Summary :

We are seeking a skilled Python Data Engineer to work on cloud-native data platforms and backend services.

The role involves building scalable APIs, working with diverse data systems, and deploying containerized services using modern cloud infrastructure.


Mandatory Skills : Python, AWS, RESTful APIs, Microservices, SQL/PostgreSQL/NoSQL, Docker, Kubernetes, CI/CD (Jenkins/GitLab CI/AWS CodePipeline)


Key Responsibilities :

  • Design, develop, and maintain backend systems using Python.
  • Build and manage RESTful APIs and microservices architectures.
  • Work extensively with AWS cloud services for deployment and data storage.
  • Implement and manage SQL, PostgreSQL, and NoSQL databases.
  • Containerize applications using Docker and orchestrate with Kubernetes.
  • Set up and maintain CI/CD pipelines using Jenkins, GitLab CI, or AWS CodePipeline.
  • Collaborate with teams to ensure scalable and reliable software delivery.
  • Troubleshoot and optimize application performance.


Must-Have Skills :

  • 4+ years of hands-on experience in Python backend development.
  • Strong experience with AWS cloud infrastructure.
  • Proficiency in building microservices and APIs.
  • Good knowledge of relational and NoSQL databases.
  • Experience with Docker and Kubernetes.
  • Familiarity with CI/CD tools and DevOps processes.
  • Strong problem-solving and collaboration skills.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
5 - 8 yrs
₹10L - ₹24L / yr
Drupal
PHP
JavaScript
Custom Module & Theming Development
Amazon Web Services (AWS)
+5 more

Job Title : Full Stack Drupal Developer

Experience : Minimum 5 Years

Location : Hyderabad / Bangalore / Mumbai / Pune / Chennai / Gurgaon (Hybrid or On-site)

Notice Period : Immediate to 15 Days Preferred


Job Summary :

We are seeking a skilled and experienced Full Stack Drupal Developer with a strong background in Drupal (version 8 and above) for both front-end and back-end development. The ideal candidate will have hands-on experience in AWS deployments, Drupal theming and module development, and a solid understanding of JavaScript, PHP, and core Drupal architecture. Acquia certifications and contributions to the Drupal community are highly desirable.


Mandatory Skills :

Drupal 8+, PHP, JavaScript, Custom Module & Theming Development, AWS (EC2, Lightsail, S3, CloudFront), Acquia Certified, Drupal Community Contributions.


Key Responsibilities :

  • Develop and maintain full-stack Drupal applications, including both front-end (theming) and back-end (custom module) development.
  • Deploy and manage Drupal applications on AWS using services like EC2, Lightsail, S3, and CloudFront.
  • Work with the Drupal theming layer and module layer to build custom and reusable components.
  • Write efficient and scalable PHP code integrated with JavaScript and core JS concepts.
  • Collaborate with UI/UX teams to ensure high-quality user experiences.
  • Optimize performance and ensure high availability of applications in cloud environments.
  • Contribute to the Drupal community and utilize contributed modules effectively.
  • Follow best practices for code versioning, documentation, and CI/CD deployment processes.


Required Skills & Qualifications :

  • Minimum 5 Years of hands-on experience in Drupal development (Drupal 8 onwards).
  • Strong experience in front-end (theming, JavaScript, HTML, CSS) and back-end (custom module development, PHP).
  • Experience with Drupal deployment on AWS, including services such as EC2, Lightsail, S3, and CloudFront.
  • Proficiency in JavaScript, core JS concepts, and PHP coding.
  • Acquia certifications such as:
  • Drupal Developer Certification
  • Site Management Certification
  • Acquia Certified Developer (preferred)
  • Experience with contributed modules and active participation in the Drupal community is a plus.
  • Familiarity with version control (Git), Agile methodologies, and modern DevOps tools.


Preferred Certifications :

  • Acquia Certified Developer.
  • Acquia Site Management Certification.
  • Any relevant AWS certifications are a bonus.
Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3 - 5 yrs
Upto ₹20L / yr (Varies)
Java
Spring Boot
RESTful APIs
Large Language Models (LLM)
Generative AI
+3 more

Key Responsibilities

  • Develop and maintain backend services and APIs using Java (Spring Boot preferred).
  • Integrate Large Language Models (LLMs) and Generative AI models (e.g., OpenAI, Hugging Face, LangChain) into applications.
  • Collaborate with data scientists to build data pipelines and enable intelligent application features.
  • Design scalable systems to support AI model inference and deployment.
  • Work with cloud platforms (AWS, GCP, or Azure) for deploying AI-driven services.
  • Write clean, maintainable, and well-tested code.
  • Participate in code reviews and technical discussions.


Required Skills

  • 3–5 years of experience in Java development (preferably with Spring Boot).
  • Experience working with RESTful APIs, microservices, and cloud-based deployments.
  • Exposure to LLMs, NLP, or GenAI tools (OpenAI, Cohere, Hugging Face, LangChain, etc.).
  • Familiarity with Python for data science/ML integration is a plus.
  • Good understanding of software engineering best practices (CI/CD, testing, etc.).
  • Ability to work collaboratively in cross-functional teams.
Read more
NeoGenCode Technologies Pvt Ltd
Shivank Bhardwaj
Posted by Shivank Bhardwaj
Bengaluru (Bangalore)
6 - 9 yrs
₹15L - ₹30L / yr
NodeJS (Node.js)
Relational Database (RDBMS)
React.js
Angular (2+)
SQL
+8 more

Role overview


  • Overall 5 to 7 years of experience. Node.js experience is a must.
  • At least 3+ years of experience, or a couple of large-scale products delivered, on microservices.
  • Strong design skills in microservices and AWS platform infrastructure.
  • Excellent programming skills in Python, Node.js, and Java.
  • Hands-on development of REST APIs.
  • Good understanding of the nuances of distributed systems, scalability, and availability.


What would you do here


  • Work as a Backend Developer developing cloud web applications.
  • Be part of the team working on various types of web applications related to Mortgage Finance.
  • Solve the real-world problem of implementing, designing, and helping develop a new enterprise-class product from the ground up.
  • Apply expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases), along with Docker and containerized solutions/applications.
  • Work with relational and NoSQL databases and scalable designs.
  • Solve challenging problems by developing elegant, maintainable code.
  • Deliver rapid iterations of software based on user feedback and metrics.
  • Help the team make key decisions on our product and technology direction.
  • Actively contribute to the adoption of frameworks, standards, and new technologies.
Read more
hirezyai
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹12L - ₹25L / yr
ArgoCD
Kubernetes
Docker
Helm
Terraform
+9 more

Job Summary:

We are seeking a skilled DevOps Engineer to design, implement, and manage CI/CD pipelines, containerized environments, and infrastructure automation. The ideal candidate should have hands-on experience with ArgoCD, Kubernetes, and Docker, along with a deep understanding of cloud platforms and deployment strategies.

Key Responsibilities:

  • CI/CD Implementation: Develop, maintain, and optimize CI/CD pipelines using ArgoCD, GitOps, and other automation tools.
  • Container Orchestration: Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
  • Infrastructure as Code (IaC): Automate infrastructure provisioning with Terraform, Helm, or Ansible.
  • Monitoring & Logging: Implement and maintain observability tools like Prometheus, Grafana, ELK, or Loki.
  • Security & Compliance: Ensure best security practices in containerized and cloud-native environments.
  • Cloud & Automation: Manage cloud infrastructure on AWS, Azure, or GCP with automated deployments.
  • Collaboration: Work closely with development teams to optimize deployments and performance.

Required Skills & Qualifications:

  • Experience: 5+ years in DevOps, Site Reliability Engineering (SRE), or Infrastructure Engineering.
  • Tools & Tech: Strong knowledge of ArgoCD, Kubernetes, Docker, Helm, Terraform, and CI/CD pipelines.
  • Cloud Platforms: Experience with AWS, GCP, or Azure.
  • Programming & Scripting: Proficiency in Python, Bash, or Go.
  • Version Control: Hands-on with Git and GitOps workflows.
  • Networking & Security: Knowledge of ingress controllers, service mesh (Istio/Linkerd), and container security best practices.

Nice to Have:

  • Experience with Kubernetes Operators, Kustomize, or FluxCD.
  • Exposure to serverless architectures and multi-cloud deployments.
  • Certifications in CKA, AWS DevOps, or similar.


Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
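For context on the Glue/PySpark work described here, a skeletal Glue job might look like the sketch below; the catalog database, table, column mappings, and S3 path are placeholders rather than details of this team's pipelines.

import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job boilerplate.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (placeholder database/table names).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Simple transformation: rename/cast columns.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amount", "string", "amount", "double")],
)

# Write partitioned Parquet to S3 (placeholder bucket).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()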

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Pune, Kolkata
4 - 6 yrs
₹7L - ₹24L / yr
Python
Amazon Web Services (AWS)
NumPy
pandas

Key Technical Skillsets-

  • Design, develop, and maintain scalable applications using AWS services, Python, and Boto3.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement best practices for cloud architecture and application development.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot and resolve issues in development, test, and production environments.
  • Write clean, maintainable, and efficient code.
  • Participate in code reviews and contribute to team knowledge sharing.
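As a small, hypothetical example of the Python/Boto3 work listed above (the bucket and key names are placeholders), reading a CSV object from S3 into pandas for downstream analysis could look like this:

import io
import boto3
import pandas as pd


def load_csv_from_s3(bucket: str, key: str) -> pd.DataFrame:
    """Fetch a CSV object from S3 and return it as a DataFrame."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    return pd.read_csv(io.BytesIO(body))


if __name__ == "__main__":
    df = load_csv_from_s3("example-data-bucket", "input/transactions.csv")
    print(df.describe())  # quick pandas/NumPy-backed summary of numeric columns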


Read more
HeyCoach
DeepanRaj R
Posted by DeepanRaj R
Bengaluru (Bangalore)
4 - 12 yrs
₹0.1L - ₹0.1L / yr
Python
NodeJS (Node.js)
React.js
Data Structures
Natural Language Processing (NLP)
+5 more


Tech Lead (Fullstack) – Nexa (Conversational Voice AI Platform)

Location: Bangalore · Type: Full-time

Experience: 4+ years (preferably in early-stage startups)

Tech Stack: Python (core), Node.js, React.js

 

 

About Nexa

Nexa is a new venture by the founders of HeyCoach, Pratik Kapasi and Aditya Kamat, on a mission to build the most intuitive voice-first AI platform. We’re rethinking how humans interact with machines using natural, intelligent, and fast conversational interfaces.

We're looking for a Tech Lead to join us at the ground level. This is a high-ownership, high-speed role for builders who want to move fast and go deep.

 

What You’ll Do

●     Design, build, and scale backend and full-stack systems for our voice AI engine

●     Work primarily with Python (core logic, pipelines, model integration), and support full-stack features using Node.js and React.js

●     Lead projects end-to-end—from whiteboard to production deployment

●     Optimize systems for performance, scale, and real-time processing

●     Collaborate with founders, ML engineers, and designers to rapidly prototype and ship features

 ●     Set engineering best practices, own code quality, and mentor junior team members as we grow

 

✅ Must-Have Skills

●     4+ years of experience in Python, building scalable production systems

●     Has led projects independently, from design through deployment

●     Excellent at executing fast without compromising quality

●     Strong foundation in system design, data structures and algorithms

●     Hands-on experience with Node.js and React.js in a production setup

●     Deep understanding of backend architecture—APIs, microservices, data flows

●     Proven success working in early-stage startups, especially during 0→1 scaling phases

●     Ability to debug and optimize across the full stack

●     High autonomy—can break down big problems, prioritize, and deliver without hand-holding

  

🚀 What We Value

●     Speed > Perfection: We move fast, ship early, and iterate

●     Ownership mindset: You act like a founder-even if you're not one

●     Technical depth: You’ve built things from scratch and understand what’s under the hood

●     Product intuition: You don’t just write code—you ask if it solves the user’s problem

●     Startup muscle: You’re scrappy, resourceful, and don’t need layers of process

●     Bias for action: You unblock yourself and others. You push code and push thinking

●     Humility and curiosity: You challenge ideas, accept better ones, and never stop learning

 

💡 Nice-to-Have

●     Experience with NLP, speech interfaces, or audio processing

●     Familiarity with cloud platforms (GCP/AWS), CI/CD, Docker, Kubernetes

●     Contributions to open-source or technical blogs

●     Prior experience integrating ML models into production systems

 

Why Join Nexa?

●     Work directly with founders on a product that pushes boundaries in voice AI

●     Be part of the core team shaping product and tech from day one

●     High-trust environment focused on output and impact, not hours

●     Flexible work style and a flat, fast culture

Read more
Gruve
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore), Pune
3 - 6 yrs
Upto ₹40L / yr (Varies)
Java
Spring Boot
Amazon Web Services (AWS)
Windows Azure
DevOps
+1 more

We are seeking an experienced and highly skilled Technical Lead with a strong background in Java, SaaS architectures, firewalls, and cybersecurity products, including SIEM and SOAR platforms. The ideal candidate will lead technical initiatives, design and implement scalable systems, and drive best practices across the engineering team. This role requires deep technical expertise, leadership abilities, and a passion for building secure and high-performing security solutions.


Key Roles & Responsibilities:

  • Lead the design and development of scalable and secure software solutions using Java.
  • Architect and build SaaS-based cybersecurity applications, ensuring high availability, performance, and reliability.
  • Provide technical leadership, mentoring, and guidance to the development team.
  • Ensure best practices in secure coding, threat modeling, and compliance with industry standards.
  • Collaborate with cross-functional teams, including Product Management, Security, and DevOps to deliver high-quality security solutions.
  • Design and implement security analytics, automation workflows and ITSM integrations.
  •  Drive continuous improvements in engineering processes, tools, and technologies.
  • Troubleshoot complex technical issues and lead incident response for critical production systems.


Basic Qualifications:

  • A bachelor’s or master’s degree in computer science, electronics engineering or a related field
  • 3-6 years of software development experience, with expertise in Java.
  • Strong background in building SaaS applications with cloud-native architectures (AWS, GCP, or Azure).
  • In-depth understanding of microservices architecture, APIs, and distributed systems.
  • Experience with containerization and orchestration tools like Docker and Kubernetes.
  • Knowledge of DevSecOps principles, CI/CD pipelines, and infrastructure as code (Terraform, Ansible, etc.).
  • Strong problem-solving skills and ability to work in an agile, fast-paced environment.
  • Excellent communication and leadership skills, with a track record of mentoring engineers.

 

Preferred Qualifications:

  • Experience with cybersecurity solutions, including SIEM (e.g., Splunk, ELK, IBM QRadar) and SOAR (e.g., Palo Alto XSOAR, Swimlane).
  • Knowledge of zero-trust security models and secure API development.
  • Hands-on experience with machine learning or AI-driven security analytics.
Read more
hirezyai
Aardra Suresh
Posted by Aardra Suresh
Bengaluru (Bangalore)
9 - 15 yrs
₹20L - ₹30L / yr
Amazon Web Services (AWS)
Kubernetes
MySQL
Oracle
Amazon S3
+1 more

Job description

● Design effective, scalable architectures on top of cloud technologies such as AWS and Kubernetes

● Mentor other software engineers, including actively participating in peer code and architecture review

● Participate in all parts of the development lifecycle from design to coding to deployment to maintenance and operations

● Kickstart new ideas, build proof of concepts and jumpstart newly funded projects

● Demonstrate ability to work independently with minimal supervision

● Embed with other engineering teams on challenging initiatives and time sensitive projects

● Collaborate with other engineering teams on challenging initiatives and time sensitive projects



Education and Experience

● BS degree in Computer Science or related technical field or equivalent practical experience.

● 9+ years of professional software development experience focused on payments and/or billing and customer accounts. Worked with worldwide payments, billing systems, PCI Compliance & payment gateways.

Technical and Functional

● Extensive knowledge of microservice development using Spring, Spring Boot, and Java, built on top of Kubernetes and public cloud computing such as AWS, Lambda, and S3.

● Experience with relational databases (MySQL, DB2 or Oracle) and NoSQL databases

● Experience with unit testing and test driven development

Technologies at Constant Contact

Working on the Constant Contact platform provides our engineers with an opportunity to produce high impact work inside of our multifaceted platform (Email, Social, SMS, E-Commerce, CRM, Customer Data Platform, ML-Based Recommendations & Insights, and more).

As a member of our team, you'll be utilizing the latest technologies and frameworks (React/SPA, JavaScript/TypeScript, Swift, Kotlin, GraphQL, etc) and deploying code to our cloud-first microservice infrastructure (declarative CI/CD, GitOps managed kubernetes) with regular opportunities to level up your skills.

● Past experience of working with and integrating payment gateways and processors, online payment methods, and billing systems.

● Familiar with integrating Stripe/Plaid/PayPal/Adyen/Cybersource or similar systems along with PCI compliance.

● International software development and payments experience is a plus.

● Knowledge of DevOps and CI/CD, automated test and build tools ( Jenkins & Gradle/Maven)

● Experience integrating with sales tax engines is a plus.

● Familiar with tools like Splunk, New Relic, or similar tools like Datadog, Elastic ELK, Amazon CloudWatch.


● Good to have - Experience with React, Backbone, Marionette or other front end frameworks.


Cultural

● Strong verbal and written communication skills.

● Flexible attitude and willingness to frequently move between different teams, software architectures and priorities.

● Desire to collaborate with our other product teams to think strategically about how to solve problems.

Our team

● We focus on cross-functional team collaboration where engineers, product managers, and designers all work together to solve customer problems and build exciting features.

● We love new ideas and are eager to see what your experiences can bring to help influence our technical and product vision.

● Collaborate/Overlap with the teams in Eastern Standard Time (EST), USA.


Read more
Talent Pro
Bengaluru (Bangalore)
6 - 8 yrs
₹20L - ₹45L / yr
PHP
NodeJS (Node.js)
Java
Amazon Web Services (AWS)
RabbitMQ
+2 more

Strong Senior Backend Engineer profile

Mandatory (Experience 1) - Must have more than 6+ YOE in Software Development

Mandatory (Experience 2) - Should have strong backend development experience in any backend language - Java, Javascript (NodeJS), Go, PHP (PHP experience is preferred)

Mandatory (Core Skill 1) - Must have Experience in any databases - MySQL / PostgreSQL / Postgres / Oracle / SQL Server / DB2 / SQL

Mandatory (Core skill 2) - Experience with async workflows and messaging queues such as RabbitMQ, Kafka, message brokers/queues, Google Pub/Sub, Kinesis, etc.

Mandatory (Core Skills 3) - Experience in Cloud - AWS / Google Cloud / Azure

Mandatory (Company) - Product Companies only

Mandatory ( Education) - BE / BTECH / MCA

Read more
appscrip

at appscrip

2 recruiters
Kanika Gaur
Posted by Kanika Gaur
Bengaluru (Bangalore), Surat
3 - 5 yrs
₹4.8L - ₹11L / yr
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)

Job Title: Lead DevOps Engineer

Experience Required: 4 to 5 years in DevOps or related fields

Employment Type: Full-time


About the Role:

We are seeking a highly skilled and experienced Lead DevOps Engineer. This role will focus on driving the design, implementation, and optimization of our CI/CD pipelines, cloud infrastructure, and operational processes. As a Lead DevOps Engineer, you will play a pivotal role in enhancing the scalability, reliability, and security of our systems while mentoring a team of DevOps engineers to achieve operational excellence.


Key Responsibilities:

Infrastructure Management: Architect, deploy, and maintain scalable, secure, and resilient cloud infrastructure (e.g., AWS, Azure, or GCP).

CI/CD Pipelines: Design and optimize CI/CD pipelines, to improve development velocity and deployment quality.

Automation: Automate repetitive tasks and workflows, such as provisioning cloud resources, configuring servers, managing deployments, and implementing infrastructure as code (IaC) using tools like Terraform, CloudFormation, or Ansible.

Monitoring & Logging: Implement robust monitoring, alerting, and logging systems for enterprise and cloud-native environments using tools like Prometheus, Grafana, ELK Stack, NewRelic or Datadog.

Security: Ensure the infrastructure adheres to security best practices, including vulnerability assessments and incident response processes.

Collaboration: Work closely with development, QA, and IT teams to align DevOps strategies with project goals.

Mentorship: Lead, mentor, and train a team of DevOps engineers to foster growth and technical expertise.

Incident Management: Oversee production system reliability, including root cause analysis and performance tuning.


Required Skills & Qualifications:

Technical Expertise:

Strong proficiency in cloud platforms like AWS, Azure, or GCP.

Advanced knowledge of containerization technologies (e.g., Docker, Kubernetes).

Expertise in IaC tools such as Terraform, CloudFormation, or Pulumi.

Hands-on experience with CI/CD tools, particularly Bitbucket Pipelines, Jenkins, GitLab CI/CD, Github Actions or CircleCI.

Proficiency in scripting languages (e.g., Python, Bash, PowerShell).

Soft Skills:

Excellent communication and leadership skills.

Strong analytical and problem-solving abilities.

Proven ability to manage and lead a team effectively.

Experience:

4+ years of experience in DevOps or Site Reliability Engineering (SRE).

4+ years in a leadership or team lead role, with proven experience managing distributed teams, mentoring team members, and driving cross-functional collaboration.

Strong understanding of microservices, APIs, and serverless architectures.


Nice to Have:

Certifications like AWS Certified Solutions Architect, Kubernetes Administrator, or similar.

Experience with GitOps tools such as ArgoCD or Flux.

Knowledge of compliance standards (e.g., GDPR, SOC 2, ISO 27001).


Perks & Benefits:

Competitive salary and performance bonuses.

Comprehensive health insurance for you and your family.

Professional development opportunities and certifications, including sponsored certifications and access to training programs to help you grow your skills and expertise.

Flexible working hours and remote work options.

Collaborative and inclusive work culture.


Join us to build and scale world-class systems that empower innovation and deliver exceptional user experiences.


You can directly contact us: Nine three one six one two zero one three two

Read more
Bengaluru (Bangalore)
2 - 4 yrs
₹7L - ₹12L / yr
Java
Spring Boot
Amazon Web Services (AWS)
NoSQL Databases

Backend (Primary Focus)

  • Strong knowledge and experience in Object-Oriented Programming (OOP) concepts.
  • Strong understanding of Java, Spring Boot, and REST API development.
  • Experience in Test-Driven Development (TDD) with Spring Boot.
  • Proficiency in developing APIs using Redis and relational databases (MySQL preferred).
  • Strong understanding of the AWS cloud platform, with experience using services like S3 and Lambda.
  • Good understanding of code versioning tools (Git) and bug-tracking systems (JIRA, etc.).
  • Knowledge of DocumentDB or any other NoSQL document database is a plus.

Frontend (Good to Have / Preferred for Full-Stack Evaluation)

  • Hands-on experience with React.js, including state management (e.g., Redux, Context API).
  • Experience with modern UI development, including CSS frameworks (Tailwind, Material-UI, Bootstrap, etc.).
  • Understanding of REST API integration and handling API calls efficiently in React.
  • Familiarity with component-driven development and frontend testing (e.g., Jest, React Testing Library).

Read more
TCS

Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Chennai, Kochi (Cochin)
6 - 9 yrs
₹7L - ₹15L / yr
Amazon Web Services (AWS)
SageMaker
Machine Learning (ML)
Docker
Python
  • Design, develop, and maintain data pipelines and ETL workflows on AWS platform
  • Work with AWS services like S3, Glue, Lambda, Redshift, EMR, and Athena for data ingestion, transformation, and analytics
  • Collaborate with Data Scientists, Analysts, and Business teams to understand data requirements
  • Optimize data workflows for performance, scalability, and reliability
  • Troubleshoot data issues, monitor jobs, and ensure data quality and integrity
  • Write efficient SQL queries and automate data processing tasks
  • Implement data security and compliance best practices
  • Maintain technical documentation and data pipeline monitoring dashboards
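For illustration of the Athena/SQL automation implied by the responsibilities above (a sketch with placeholder region, database, query, and output location, not this team's code), a helper that submits a query and waits for completion might look like this:

import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")


def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Start an Athena query, poll until it finishes, and return the execution id."""
    execution_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    while True:
        status = athena.get_query_execution(QueryExecutionId=execution_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return execution_id


# Example usage with placeholder names.
# run_athena_query("SELECT COUNT(*) FROM events", "analytics_db", "s3://example-athena-results/")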
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
AWS Glue
Python
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
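For context on the Glue work described above, a minimal sketch of a Glue job script is shown below; the catalog database, table, and output bucket are placeholders and would differ per project.

    # Hedged sketch of an AWS Glue job script skeleton (names are placeholders).
    import sys
    from pyspark.context import SparkContext
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a table registered in the Glue Data Catalog (hypothetical database/table).
    orders = glue_context.create_dynamic_frame.from_catalog(
        database="sales_db", table_name="raw_orders"
    ).toDF()

    # Basic cleanup, then write partitioned Parquet for Athena / Redshift Spectrum.
    (orders.dropDuplicates(["order_id"])
           .write.mode("overwrite")
           .partitionBy("order_date")
           .parquet("s3://example-curated-bucket/orders/"))

    job.commit()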

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Deqode

at Deqode

1 recruiter
Sneha Jain
Posted by Sneha Jain
Bengaluru (Bangalore), Mumbai, Pune, Hyderabad
4 - 7 yrs
₹10L - ₹18L / yr
skill iconSpring Boot
skill iconJava
skill iconAmazon Web Services (AWS)

Job Summary:


We are looking for an experienced Java Developer with 4+ years of hands-on experience to join our dynamic team. The ideal candidate will have a strong background in Java development, problem-solving skills, and the ability to work independently as well as part of a team. You will be responsible for designing, developing, and maintaining high-performance and scalable applications.


Key Responsibilities:

  • Design, develop, test, and maintain Java-based applications.
  • Write well-designed, efficient, and testable code following best software development practices.
  • Troubleshoot and resolve technical issues during development and production support.
  • Collaborate with cross-functional teams including QA, DevOps, and Product teams.
  • Participate in code reviews and provide constructive feedback.
  • Maintain proper documentation for code, processes, and configurations.
  • Support deployment and post-deployment monitoring during night shift hours.


Required Skills:

  • Strong programming skills in Java 8 or above.
  • Experience with Spring Framework (Spring Boot, Spring MVC, etc.).
  • Proficiency in RESTful APIs, Microservices Architecture, and Web Services.
  • Familiarity with SQL and relational databases like MySQL, PostgreSQL, or Oracle.
  • Hands-on experience with version control systems like Git.
  • Understanding of Agile methodologies.
  • Experience with build tools like Maven/Gradle.
  • Knowledge of unit testing frameworks (JUnit/TestNG).


Preferred Skills (Good to Have):

  • Experience with cloud platforms (AWS, Azure, or GCP).
  • Familiarity with CI/CD pipelines.
  • Basic understanding of frontend technologies like JavaScript, HTML, CSS.


Read more
Alpha

at Alpha

2 candid answers
Yash Makhecha
Posted by Yash Makhecha
Remote, Bengaluru (Bangalore)
1 - 6 yrs
₹4L - ₹12L / yr
skill iconPython
skill iconNodeJS (Node.js)
skill iconReact.js
TypeScript
skill iconDocker
+10 more

Full Stack Engineer

Location: Remote (India preferred) · Type: Full-time · Comp: Competitive salary + early-stage stock



About Alpha

Alpha is building the simplest way for anyone to create AI agents that actually get work done. Our platform turns messy prompt chaining, data schemas, and multi-tool logic into a clean, no-code experience. We’re backed, funded, and racing toward our v1 launch. Join us on the ground floor and shape the architecture, the product, and the culture.



The Role

We’re hiring two versatile full-stack engineers. One will lean infra/back-end, the other front-end/LLM integration, but both will ship vertical slices end-to-end.


You will:

  • Design and build the agent-execution runtime (LLMs, tools, schemas).
  • Stand up secure VPC deployments with Docker, Terraform, and AWS or GCP.
  • Build REST/GraphQL APIs, queues, Postgres/Redis layers, and observability.
  • Create a React/Next.js visual workflow editor with drag-and-drop blocks.
  • Build the Prompt Composer UI, live testing mode, and cost dashboard.
  • Integrate native tools: search, browser, CRM, payments, messaging, and more.
  • Ship fast—design, code, test, launch—and own quality (no separate QA team).
  • Talk to early users and fold feedback into weekly releases.



What We’re Looking For


  • 3–6 years building production web apps at startup pace.
  • Strong TypeScript + Node.js or Python.
  • Solid React/Next.js and modern state management.
  • Comfort with AWS or GCP, Docker, and CI/CD.
  • Bias for ownership from design to deploy.


Nice but not required: Terraform or CDK, IAM/VPC networking, vector DBs or RAG pipelines, LLM API experience, React-Flow or other canvas libs, GraphQL or event streaming, prior dev-platform work.


We don’t expect every box ticked—show us you learn fast and ship.



What You’ll Get


• Meaningful equity at the earliest stage.

• A green-field codebase you can architect the right way.

• Direct access to the founder—instant decisions, no red tape.

• Real customers from day one; your code goes live, not to backlog.

• Stipend for hardware, LLM credits, and professional growth.



Come build the future of work—where AI agents handle the busywork and people do the thinking.

Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore)
3 - 4 yrs
₹5L - ₹18L / yr
skill iconAmazon Web Services (AWS)
Terraform
skill iconKubernetes
Migration

About the Role:

We are looking for a skilled AWS DevOps Engineer to join our Cloud Operations team in Bangalore. This hybrid role is ideal for someone with hands-on experience in AWS and a strong background in application migration from on-premises to cloud environments. You'll play a key role in driving cloud adoption, optimizing infrastructure, and ensuring seamless cloud operations.

Key Responsibilities:

  • Manage and maintain AWS cloud infrastructure and services.
  • Lead and support application migration projects from on-prem to cloud.
  • Automate infrastructure provisioning using Infrastructure as Code (IaC) tools.
  • Monitor cloud environments and optimize cost, performance, and reliability.
  • Collaborate with development, operations, and security teams to implement DevOps best practices.
  • Troubleshoot and resolve infrastructure and deployment issues.

Required Skills:

  • 3–5 years of experience in AWS cloud environment.
  • Proven experience with on-premises to cloud application migration.
  • Strong understanding of AWS core services (EC2, VPC, S3, IAM, RDS, etc.).
  • Solid scripting skills (Python, Bash, or similar).
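As a small illustration of the scripting expected in this role (not a prescribed solution), the boto3 snippet below stops EC2 instances carrying a hypothetical Environment=dev tag, the sort of cost-control automation referred to above; the tag, region, and filters are assumptions.

    # Illustrative cost-control script: stop running EC2 instances tagged as dev.
    import boto3

    ec2 = boto3.client("ec2", region_name="ap-south-1")

    # Find running instances carrying the (hypothetical) dev tag.
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:Environment", "Values": ["dev"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    instance_ids = [
        inst["InstanceId"]
        for res in reservations
        for inst in res["Instances"]
    ]

    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
        print(f"Stopping: {instance_ids}")
    else:
        print("No running dev instances found.")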

Good to Have:

  • Experience with Terraform for Infrastructure as Code.
  • Familiarity with Kubernetes for container orchestration.
  • Exposure to CI/CD tools like Jenkins, GitLab, or AWS CodePipeline.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Bengaluru (Bangalore), Mumbai
5 - 10 yrs
Best in industry
skill iconJava
06692
Microservices
skill iconAmazon Web Services (AWS)
Apache Kafka
+1 more

Job Description: We are looking for a talented and motivated Software Engineer with expertise in both Windows and Linux operating systems and solid experience in Java technologies. The ideal candidate should be proficient in data structures and algorithms, as well as frameworks like Spring MVC, Spring Boot, and Hibernate. Hands-on experience working with MySQL databases is also essential for this role.


Responsibilities:

● Design, develop, test, and maintain software applications using Java technologies.
● Implement robust solutions using Spring MVC, Spring Boot, and Hibernate frameworks.
● Develop and optimize database operations with MySQL.
● Analyze and solve complex problems by applying knowledge of data structures and algorithms.
● Work with both Windows and Linux environments to develop and deploy solutions.
● Collaborate with cross-functional teams to deliver high-quality products on time.
● Ensure application security, performance, and scalability.
● Maintain thorough documentation of technical solutions and processes.
● Debug, troubleshoot, and upgrade legacy systems when required.

Requirements:

● Operating Systems: Expertise in Windows and Linux environments.
● Programming Languages & Technologies: Strong knowledge of Java (Core Java, Java 8+).
● Frameworks: Proficiency in Spring MVC, Spring Boot, and Hibernate.
● Algorithms and Data Structures: Good understanding and practical application of DSA concepts.
● Databases: Experience with MySQL – writing queries, stored procedures, and performance tuning.
● Version Control Systems: Experience with tools like Git.
● Deployment: Knowledge of CI/CD pipelines and tools such as Jenkins, Docker (optional)

Read more
Gruve
Bengaluru (Bangalore), Pune
5 - 9 yrs
Upto ₹60L / yr (Varies)
Generative AI
Retrieval Augmented Generation (RAG)
Chatbot
skill iconAmazon Web Services (AWS)
Windows Azure
+2 more

We are seeking a talented Engineer to join our AI team. You will technically lead experienced software and machine learning engineers to develop, test, and deploy AI-based solutions, with a primary focus on large language models and other machine learning applications. This is an excellent opportunity to apply your software engineering skills in a dynamic, real-world environment and gain hands-on experience in cutting-edge AI technology.


Key Roles & Responsibilities: 

  • Design and Develop AI-Powered Solutions: Architect and implement scalable AI/ML systems, focusing on Large Language Models (LLMs) and other deep learning applications.
  • End-to-End Model Development: Lead the entire lifecycle of AI models—from data collection and preprocessing to training, fine-tuning, evaluation, and deployment.
  • Fine-Tuning & Customization: Leverage techniques like LoRA (Low-Rank Adaptation) and Q-LoRA to efficiently fine-tune large models for specific business applications.
  • Reasoning Model Implementation: Work with advanced reasoning models such as DeepSeek-R1, exploring their applications in enterprise AI workflows.
  • Data Engineering & Dataset Creation: Design and curate high-quality datasets optimized for fine-tuning AI models, ensuring robust training and validation processes.
  • Performance Optimization & Efficiency: Optimize model inference, computational efficiency, and resource utilization for large-scale AI applications.
  • MLOps & CI/CD Pipelines: Implement best practices for MLOps, ensuring automated training, deployment, monitoring, and continuous improvement of AI models.
  • Cloud & Edge AI Deployment: Deploy and manage AI solutions in cloud environments (AWS, Azure, GCP) and explore edge AI deployment where applicable.
  • API Development & Microservices: Develop RESTful APIs and microservices to integrate AI models seamlessly into enterprise applications.
  • Security, Compliance & Ethical AI: Ensure AI solutions comply with industry standards, data privacy laws (e.g., GDPR, HIPAA), and ethical AI guidelines.
  • Collaboration & Stakeholder Engagement: Work closely with product managers, data engineers, and business teams to translate business needs into AI-driven solutions.
  • Mentorship & Technical Leadership: Guide and mentor junior engineers, fostering best practices in AI/ML development, model fine-tuning, and software engineering.
  • Research & Innovation: Stay updated with emerging AI trends, conduct experiments with cutting-edge architectures and fine-tuning techniques, and drive innovation within the team.
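For candidates less familiar with LoRA, a minimal setup sketch is shown below, assuming the Hugging Face transformers and peft libraries; the base model name and target modules are placeholders chosen for illustration, not the stack used on this team.

    # Minimal LoRA fine-tuning setup sketch (model name and modules are placeholders).
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    base_model = "meta-llama/Llama-2-7b-hf"  # hypothetical choice of base model
    tokenizer = AutoTokenizer.from_pretrained(base_model)  # used to build the fine-tuning dataset
    model = AutoModelForCausalLM.from_pretrained(base_model)

    # Low-rank adapters keep the trainable parameter count small.
    lora_config = LoraConfig(
        r=8,
        lora_alpha=16,
        lora_dropout=0.05,
        target_modules=["q_proj", "v_proj"],  # depends on the model architecture
        task_type="CAUSAL_LM",
    )

    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()  # only the adapter weights are trainable
    # ...training would proceed with transformers.Trainer or a custom loop.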

Basic Qualifications: 

  • A master's degree or PhD in Computer Science, Data Science, Engineering, or a related field 
  • Experience: 5-8 Years 
  • Strong programming skills in Python and Java 
  • Good understanding of machine learning fundamentals 
  • Hands-on experience with Python and common ML libraries (e.g., PyTorch, TensorFlow, scikit-learn) 
  • Familiar with frontend development and frameworks like React 
  • Basic knowledge of LLMs and transformer-based architectures is a plus.

Preferred Qualifications  

  • Excellent problem-solving skills and an eagerness to learn in a fast-paced environment 
  • Strong attention to detail and ability to communicate technical concepts clearly


Read more
Gruve
Pune, Bengaluru (Bangalore)
3 - 5 yrs
Upto ₹30L / yr (Varies)
Retrieval Augmented Generation (RAG)
Generative AI
Chatbot
skill iconAmazon Web Services (AWS)
Windows Azure
+3 more

We are seeking a talented Engineer to join our AI team. You will technically lead experienced software and machine learning engineers to develop, test, and deploy AI-based solutions, with a primary focus on large language models and other machine learning applications. This is an excellent opportunity to apply your software engineering skills in a dynamic, real-world environment and gain hands-on experience in cutting-edge AI technology.


Key Roles & Responsibilities: 

  • Design and implement software solutions that power machine learning models, particularly in LLMs 
  • Create robust data pipelines, handling data preprocessing, transformation, and integration for machine learning projects 
  • Collaborate with the engineering team to build and optimize machine learning models, particularly LLMs, that address client-specific challenges 
  • Partner with cross-functional teams, including business stakeholders, data engineers, and solutions architects to gather requirements and evaluate technical feasibility 
  • Design and implement scalable infrastructure for developing and deploying GenAI solutions
  • Support model deployment and API integration to ensure interaction with existing enterprise systems.
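As an illustration of the model deployment and API integration mentioned above (not the team's actual service), a thin FastAPI wrapper might look like the sketch below; the route, payload shape, and stubbed model call are assumptions.

    # Illustrative inference API wrapper; the model call is stubbed out.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="genai-inference-service")

    class GenerateRequest(BaseModel):
        prompt: str
        max_tokens: int = 256

    class GenerateResponse(BaseModel):
        completion: str

    @app.post("/v1/generate", response_model=GenerateResponse)
    def generate(req: GenerateRequest) -> GenerateResponse:
        # A real service would call the deployed LLM endpoint here; the stub
        # keeps the sketch self-contained and runnable.
        return GenerateResponse(completion=f"[stubbed completion for: {req.prompt[:40]}]")

    # Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)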

Basic Qualifications: 

  • A master's degree or PhD in Computer Science, Data Science, Engineering, or a related field 
  • Experience: 3-5 Years 
  • Strong programming skills in Python and Java 
  • Good understanding of machine learning fundamentals 
  • Hands-on experience with Python and common ML libraries (e.g., PyTorch, TensorFlow, scikit-learn) 
  • Familiar with frontend development and frameworks like React 
  • Basic knowledge of LLMs and transformer-based architectures is a plus.

Preferred Qualifications 

  • Excellent problem-solving skills and an eagerness to learn in a fast-paced environment 
  • Strong attention to detail and ability to communicate technical concepts clearly 
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Mumbai, Chennai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconPython
PySpark
skill iconAmazon Web Services (AWS)
aws
Amazon Redshift
+1 more

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
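To illustrate the Athena-facing side of this role (illustrative only; the database, table, and result location are placeholders), a query can be submitted and polled from Python with boto3 as sketched below.

    # Submit an Athena query and poll for completion (simplified sketch).
    import time
    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")

    query = """
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM curated.orders
        GROUP BY order_date
        ORDER BY order_date
    """

    execution = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": "curated"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until the query finishes; production code would add timeouts and backoff.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    print("Query finished with state:", state)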

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Bengaluru and chennai based tech startup

Bengaluru and chennai based tech startup

Agency job
via Recruit Square by Priyanka choudhary
Bengaluru (Bangalore), Chennai
6 - 12 yrs
₹19L - ₹35L / yr
Linux/Unix
TCP/IP
Windows Azure
skill iconAmazon Web Services (AWS)
SaaS
+2 more

Has substantial hands-on expertise in Linux OS, HTTPS, proxies, and Perl/Python scripting.

Is responsible for the identification and selection of appropriate network solutions to design and deploy in environments based on business objectives and requirements.

Is skilled in developing, deploying, and troubleshooting network deployments, with deep technical knowledge, especially around bootstrapping, Squid Proxy, HTTPS, and related scripting. Will further align the network to meet the Company’s objectives through continuous development, improvement, and automation.

Preferably 10+ years of experience in network design and delivery of technology centric, customer-focused services.

Preferably 3+ years in modern software-defined network and preferably, in cloud-based environments.

Diploma or bachelor’s degree in engineering, Computer Science/Information Technology, or its equivalent.

Preferably possess a valid RHCE (Red Hat Certified Engineer) certification

Preferably possess any vendor Proxy certification (Forcepoint/ Websense/ bluecoat / equivalent)

Must possess advanced knowledge in TCP/IP concepts and fundamentals.  Good understanding and working knowledge of Squid proxy, Https protocol / Certificate management.

Fundamental understanding of proxy & PAC file.

Integration experience and knowledge between modern networks and cloud service providers such as AWS, Azure and GCP will be advantageous.

Knowledge in SaaS, IaaS, PaaS, and virtualization will be advantageous.

Coding skills such as Perl, Python, Shell scripting will be advantageous.

Excellent technical knowledge, troubleshooting, problem analysis, and outside-the-box thinking.

Excellent communication skills – oral, written and presentation, across various types of target audiences.

Strong sense of personal ownership and responsibility in accomplishing the organization’s goals and objectives. Exudes confidence, able to cope under pressure and will roll-up his/her sleeves to drive a project to success in a challenging environment.

Read more
NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore)
6 - 15 yrs
₹15L - ₹32L / yr
DBA
MySQL DBA
skill iconMongoDB
skill iconPostgreSQL
Oracle DBA
+11 more

Position Title : Senior Database Administrator (DBA)

📍 Location : Bangalore (Near Silk Board)

🏢 Work Mode : Onsite, 5 Days a Week

💼 Experience : 6+ Years

⏱️ Notice Period : Immediate to 1 Month


Job Summary :

We’re looking for an experienced Senior DBA to manage and optimize databases like MySQL, MongoDB, PostgreSQL, Oracle, and Redis. You’ll ensure performance, security, and availability of databases across our systems and work closely with engineering teams for support and improvement.


Key Responsibilities :

  • Manage and maintain MySQL, MongoDB, PostgreSQL, Oracle, and Redis databases.
  • Handle backups, restores, upgrades, and replication.
  • Optimize query performance and troubleshoot issues.
  • Ensure database security and access control.
  • Work on disaster recovery and high availability.
  • Support development teams with schema design and tuning.
  • Automate tasks using scripting (Python, Bash, etc.).
  • Collaborate with DevOps and Cloud (AWS) teams.
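As an example of the task automation mentioned above (a sketch only; the host, database, and bucket names are placeholders, and credentials would come from a secrets manager in practice), a nightly MySQL backup script might look like this:

    # Dump a MySQL database, compress it, and upload the archive to S3.
    import gzip
    import os
    import shutil
    import subprocess
    from datetime import datetime

    import boto3

    db_host = os.environ.get("DB_HOST", "localhost")
    db_name = "appdb"                      # hypothetical database
    bucket = "example-db-backups"          # hypothetical bucket
    stamp = datetime.utcnow().strftime("%Y%m%d%H%M%S")
    dump_file = f"/tmp/{db_name}-{stamp}.sql"

    # 1. Dump the database (mysqldump reads credentials from ~/.my.cnf here).
    with open(dump_file, "w") as out:
        subprocess.run(["mysqldump", "-h", db_host, db_name], stdout=out, check=True)

    # 2. Compress the dump.
    with open(dump_file, "rb") as src, gzip.open(dump_file + ".gz", "wb") as dst:
        shutil.copyfileobj(src, dst)

    # 3. Upload to S3 and clean up locally.
    boto3.client("s3").upload_file(dump_file + ".gz", bucket, f"{db_name}/{stamp}.sql.gz")
    os.remove(dump_file)
    os.remove(dump_file + ".gz")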


Must-Have Skills :

  • 6+ Years as a DBA in production environments.
  • Strong hands-on with MySQL, MongoDB, PostgreSQL, Oracle, Redis.
  • Performance tuning and query optimization.
  • Backup/recovery and disaster recovery planning.
  • Experience with AWS (RDS/EC2).
  • Scripting knowledge (Python/Bash).
  • Good understanding of database security.


Good to Have :

  • Experience with MSSQL.
  • Knowledge of tools like pgAdmin, Compass, Workbench.
  • Database certifications.
Read more
Deqode

at Deqode

1 recruiter
Mokshada Solanki
Posted by Mokshada Solanki
Bengaluru (Bangalore), Mumbai, Pune, Gurugram
4 - 5 yrs
₹4L - ₹20L / yr
SQL
skill iconAmazon Web Services (AWS)
Migration
PySpark
ETL

Job Summary:

Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.


Key Responsibilities:

  • Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
  • Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
  • Work on data migration tasks in AWS environments.
  • Monitor and improve database performance; automate key performance indicators and reports.
  • Collaborate with cross-functional teams to support data integration and delivery requirements.
  • Write shell scripts for automation and manage ETL jobs efficiently.
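For the 100+ million record workloads mentioned above, one common pattern (shown here as a hedged sketch; the JDBC URL, table, and bounds are placeholders) is a partitioned JDBC read so the extract runs in parallel rather than through a single connection:

    # Read a large MySQL table in parallel with Spark's JDBC source.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("mysql-extract").getOrCreate()

    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mysql://example-host:3306/sales")
        .option("driver", "com.mysql.cj.jdbc.Driver")
        .option("dbtable", "orders")
        .option("user", "etl_user")           # in practice, pull from a secret store
        .option("password", "***")
        # Split the read into 32 parallel partitions on the numeric primary key.
        .option("partitionColumn", "order_id")
        .option("lowerBound", "1")
        .option("upperBound", "120000000")
        .option("numPartitions", "32")
        .load()
    )

    orders.write.mode("overwrite").parquet("s3://example-staging/orders/")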


Required Skills:

  • Strong experience with MySQL, complex SQL queries, and stored procedures.
  • Hands-on experience with AWS Glue, PySpark, and ETL processes.
  • Good understanding of AWS ecosystem and migration strategies.
  • Proficiency in shell scripting.
  • Strong communication and collaboration skills.


Nice to Have:

  • Working knowledge of Python.
  • Experience with AWS RDS.



Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL
redshift

Profile: AWS Data Engineer

Mode- Hybrid

Experience- 5 to 7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
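As a small, illustrative example of the serverless processing listed above (the bucket and key come from the event; everything else is an assumption), an S3-triggered Lambda handler might look like this:

    # Minimal S3-triggered Lambda handler: inspect each new object and log it.
    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            head = s3.head_object(Bucket=bucket, Key=key)
            size_mb = head["ContentLength"] / (1024 * 1024)

            # In a real pipeline this is where a Glue job or Step Function
            # would be started; here we just log what arrived.
            print(json.dumps({"bucket": bucket, "key": key, "size_mb": round(size_mb, 2)}))

        return {"status": "ok"}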


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
Read more
Gruve
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore), Pune
5yrs+
Upto ₹50L / yr (Varies)
skill iconPython
SQL
Data engineering
Apache Spark
PySpark
+6 more

About the Company:

Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.

 

Why Gruve:

At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.

Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.

 

Position summary:

We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. 

Key Roles & Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
  • Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
  • Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
  • Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
  • Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
  • Implement data governance, security, and compliance best practices.
  • Build and maintain data models, transformations, and data marts for analytics and reporting.
  • Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
  • Automate infrastructure and deployments using Terraform, Airflow, or dbt.
  • Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
  • Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
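To give a concrete flavour of the Lakehouse work described above (a sketch only, assuming a Databricks or delta-spark-enabled Spark session; the paths and join key are placeholders), an incremental Delta Lake upsert looks roughly like this:

    # Delta Lake merge: apply an incremental batch to a target table as an upsert.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

    updates = spark.read.parquet("s3://example-staging/orders_incremental/")
    target = DeltaTable.forPath(spark, "s3://example-lakehouse/orders/")

    # Update matching rows and insert new ones in a single ACID transaction.
    (target.alias("t")
     .merge(updates.alias("s"), "t.order_id = s.order_id")
     .whenMatchedUpdateAll()
     .whenNotMatchedInsertAll()
     .execute())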


Basic Qualifications:

  • Bachelor’s or Master’s Degree in Computer Science or Data Science.
  • 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
  • Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
  • Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
  • Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
  • Proficiency in SQL, Python, or Scala for data transformation and analytics.
  • Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
  • Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
  • Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
  • Strong understanding of data governance, access control, and encryption strategies.
  • Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.


Preferred Qualifications:

  • Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
  • Experience in BI and analytics tools (Tableau, Power BI, Looker).
  • Familiarity with data observability tools (Monte Carlo, Great Expectations).
  • Experience with machine learning feature engineering pipelines in Databricks.
  • Contributions to open-source data engineering projects.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
Glue semantics
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
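As an illustration of the automated validation mentioned above (the paths and thresholds are assumptions, not project specifics), a simple PySpark data-quality gate might look like this:

    # Validate a staged batch before loading it into the warehouse.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-validation").getOrCreate()
    batch = spark.read.parquet("s3://example-staging/orders/")

    row_count = batch.count()
    null_keys = batch.filter(F.col("order_id").isNull()).count()
    dupes = row_count - batch.dropDuplicates(["order_id"]).count()

    errors = []
    if row_count == 0:
        errors.append("batch is empty")
    if null_keys > 0:
        errors.append(f"{null_keys} rows with null order_id")
    if dupes > 0:
        errors.append(f"{dupes} duplicate order_id values")

    if errors:
        # A hard failure keeps bad batches out of the warehouse.
        raise ValueError("Validation failed: " + "; ".join(errors))

    print(f"Batch of {row_count} rows passed validation.")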

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune, Mumbai, Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹5L - ₹10L / yr
ETL
SQL
skill iconAmazon Web Services (AWS)
PySpark
KPI

Role - ETL Developer

Work Mode - Hybrid

Experience- 4+ years

Location - Pune, Gurgaon, Bengaluru, Mumbai

Required Skills - AWS, AWS Glue, Pyspark, ETL, SQL

Required Skills:

  • 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
  • Experience in Pyspark, AWS, AWS Glue
  • Experience in AWS migration
  • Experience with automated scripting and tracking KPIs/metrics for database performance
  • Proficiency in shell scripting and ETL.
  • Strong communication skills and a collaborative team player
  • Knowledge of Python and AWS RDS is a plus


Read more
hirezyai
Aardra Suresh
Posted by Aardra Suresh
Bengaluru (Bangalore)
3 - 6 yrs
₹9L - ₹11L / yr
AWS
DevOps
Linux administration
skill iconAmazon Web Services (AWS)
skill iconPostgreSQL

Key Responsibilities:

Cloud Management:

  • Manage and troubleshoot Linux environments.
  • Create and manage Linux users on EC2 instances.
  • Handle AWS services, including ECR, EKS, EC2, SNS, SES, S3, RDS, Lambda, DocumentDB, IAM, ECS, EventBridge, ALB, and SageMaker.
  • Perform start/stop operations for SageMaker and EC2 instances.
  • Solve IAM permission issues.

Containerization and Deployment:

  • Create and manage ECS services.
  • Implement IP whitelisting for enhanced security.
  • Configure target mapping for load balancers and manage Glue jobs.
  • Create load balancers (as needed).

CI/CD Setup:

  • Set up and maintain CI/CD pipelines using AWS CodeCommit, CodeBuild, CodeDeploy, and CodePipeline.

Database Management:

  • Manage PostgreSQL RDS instances, ensuring optimal performance and security.
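For reference, the start/stop operations listed under Cloud Management can be automated with boto3 roughly as sketched below; the region, instance IDs, and notebook name are placeholders, not real resources.

    # Start/stop a set of EC2 instances and a SageMaker notebook instance.
    import boto3

    REGION = "ap-south-1"                                  # assumed region
    EC2_INSTANCE_IDS = ["i-0123456789abcdef0"]             # placeholder instance IDs
    NOTEBOOK_NAME = "example-notebook"                     # placeholder notebook name

    ec2 = boto3.client("ec2", region_name=REGION)
    sagemaker = boto3.client("sagemaker", region_name=REGION)

    def stop_dev_resources():
        """Stop the EC2 instances and the SageMaker notebook outside working hours."""
        ec2.stop_instances(InstanceIds=EC2_INSTANCE_IDS)
        sagemaker.stop_notebook_instance(NotebookInstanceName=NOTEBOOK_NAME)

    def start_dev_resources():
        """Bring the same resources back up at the start of the day."""
        ec2.start_instances(InstanceIds=EC2_INSTANCE_IDS)
        sagemaker.start_notebook_instance(NotebookInstanceName=NOTEBOOK_NAME)

    if __name__ == "__main__":
        stop_dev_resources()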


 

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Minimum 3.5 years of experience in AWS and DevOps roles.
  • Strong experience with Linux administration.
  • Proficient in AWS services, particularly ECR, EKS, EC2, SNS, SES, S3, RDS, DocumentDB, IAM, ECS, EventBridge, ALB, and SageMaker.
  • Experience with CI/CD tools (AWS CodeCommit, CodeBuild, CodeDeploy, CodePipeline).
  • Familiarity with PostgreSQL and database management.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad, Pune, Jaipur, Kolkata, Indore
4 - 6 yrs
₹5L - ₹18L / yr
skill icon.NET
skill iconC#
skill iconAngular (2+)
Windows Azure
skill iconAmazon Web Services (AWS)

Job Description:

Deqode is seeking a skilled .NET Full Stack Developer with expertise in .NET Core, Angular, and C#. The ideal candidate will have hands-on experience with either AWS or Azure cloud platforms. This role involves developing robust, scalable applications and collaborating with cross-functional teams to deliver high-quality software solutions.

Key Responsibilities:

  • Develop and maintain web applications using .NET Core, C#, and Angular.
  • Design and implement RESTful APIs and integrate with front-end components.
  • Collaborate with UI/UX designers, product managers, and other developers to deliver high-quality products.
  • Deploy and manage applications on cloud platforms (AWS or Azure).
  • Write clean, scalable, and efficient code following best practices.
  • Participate in code reviews and provide constructive feedback.
  • Troubleshoot and debug applications to ensure optimal performance.
  • Stay updated with emerging technologies and propose improvements to existing systems.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Minimum of 4 years of professional experience in software development.
  • Proficiency in .NET Core, C#, and Angular.
  • Experience with cloud services (either AWS or Azure).
  • Strong understanding of RESTful API design and implementation.
  • Familiarity with version control systems like Git.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently and collaboratively in a team environment.

Preferred Qualifications:

  • Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
  • Knowledge of CI/CD pipelines and DevOps practices.
  • Familiarity with Agile/Scrum methodologies.
  • Strong communication and interpersonal skills.

What We Offer:

  • Competitive salary and performance-based incentives.
  • Flexible working hours and remote work options.
  • Opportunities for professional growth and career advancement.
  • Collaborative and inclusive work environment.
  • Access to the latest tools and technologies.


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Indore, Bengaluru (Bangalore), Nagpur, Hyderabad, Noida, Mumbai, Jaipur, Ahmedabad, Kolkata
4 - 6 yrs
₹5L - ₹13.5L / yr
skill icon.NET
.net core
skill iconAngular (2+)
skill iconAngularJS (1.x)
skill iconReact.js
+3 more

Job Title: .NET Developer

Location: Pan India (Hybrid)

Employment Type: Full-Time

Join Date: Immediate / Within 15 Days

Experience: 4+ Years

Deqode is looking for a skilled and passionate Senior .NET Developer to join our growing tech team. The ideal candidate is an expert in building scalable web applications and has hands-on experience with cloud platforms and modern front-end technologies.


Key Responsibilities:

  • Design, develop, and maintain scalable web applications using .NET Core.
  • Work on RESTful APIs and integrate third-party services.
  • Collaborate with UI/UX designers and front-end developers using Angular or React.
  • Deploy, monitor, and maintain applications on AWS or Azure.
  • Participate in code reviews, technical discussions, and architecture planning.
  • Write clean, well-structured, and testable code following best practices.

Must-Have Skills:

  • 4+ years of experience in software development using .NET Core.
  • Proficiency with Angular or React for front-end development.
  • Strong working knowledge of AWS or Microsoft Azure.
  • Experience with SQL/NoSQL databases.
  • Excellent communication and team collaboration skills.

Education:

  • Bachelor’s/Master’s degree in Computer Science, Information Technology, or a related field.
Read more
Deqode

at Deqode

1 recruiter
Naincy Jain
Posted by Naincy Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Pune, Indore, Jaipur, Kolkata, Hyderabad
4 - 6 yrs
₹3L - ₹30L / yr
DevOps
Terraform
skill iconKubernetes
skill iconAmazon Web Services (AWS)
AWS Lambda
+1 more

Required Skills:


  • Experience in systems administration, SRE or DevOps focused role
  • Experience in handling production support (on-call)
  • Good understanding of the Linux operating system and networking concepts.
  • Demonstrated competency with the following AWS services: ECS, EC2, EBS, EKS, S3, RDS, ELB, IAM, Lambda.
  • Experience with Docker containers and containerization concepts
  • Experience with managing and scaling Kubernetes clusters in a production environment
  • Experience building scalable infrastructure in AWS with Terraform.
  • Strong knowledge of Protocol-level such as HTTP/HTTPS, SMTP, DNS, and LDAP
  • Experience monitoring production systems
  • Expertise in leveraging Automation / DevOps principles, experience with operational tools, and able to apply best practices for infrastructure and software deployment (Ansible).
  • HAProxy, Nginx, SSH, MySQL configuration and operation experience
  • Ability to work seamlessly with software developers, QA, project managers, and business development
  • Ability to produce and maintain written documentation


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Hanisha Pralayakaveri
Posted by Hanisha Pralayakaveri
Bengaluru (Bangalore), Mumbai
5 - 9 yrs
Best in industry
skill iconPython
skill iconAmazon Web Services (AWS)
PySpark
Data engineering

Job Description: Data Engineer 

Position Overview:

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.

· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).

· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.

· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.

· Ensure data quality and consistency by implementing validation and governance practices.

· Work on data security best practices in compliance with organizational policies and regulations.

· Automate repetitive data engineering tasks using Python scripts and frameworks.

· Leverage CI/CD pipelines for deployment of data workflows on AWS.
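As one illustrative example of the load step in such pipelines (the cluster, database, IAM role, and bucket names are placeholders, not this team's resources), curated S3 data can be copied into Redshift through the Redshift Data API:

    # Submit a COPY statement to Redshift via the Redshift Data API.
    import boto3

    redshift_data = boto3.client("redshift-data", region_name="ap-south-1")

    copy_sql = """
        COPY analytics.orders
        FROM 's3://example-curated-bucket/orders/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
        FORMAT AS PARQUET;
    """

    response = redshift_data.execute_statement(
        ClusterIdentifier="example-cluster",
        Database="analytics",
        DbUser="etl_user",
        Sql=copy_sql,
    )

    # The call is asynchronous; an orchestration job would poll the returned
    # statement id with describe_statement().
    print("Submitted COPY, statement id:", response["Id"])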

Read more
The Alter Office

at The Alter Office

2 candid answers
Harsha Ravindran
Posted by Harsha Ravindran
Bengaluru (Bangalore)
1 - 4 yrs
₹6L - ₹10L / yr
skill iconNodeJS (Node.js)
MySQL
SQL
skill iconMongoDB
skill iconExpress
+9 more

Job Title: Backend Developer

Location: In-Office, Bangalore, Karnataka, India


Job Summary:

We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.


Annual Compensation: 6-10 LPA


Responsibilities:

  • Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
  • Architect and implement complex backend solutions, ensuring high availability and performance.
  • Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
  • Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
  • Promoting a culture of collaboration, knowledge sharing, and continuous improvement.
  • Implement and enforce best practices for code quality, security, and performance optimization.
  • Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
  • Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
  • Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
  • Conduct system design reviews and contribute to architectural discussions.
  • Stay updated with industry trends and emerging technologies to drive innovation within the team.
  • Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
  • Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.


Requirements:

  • Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
  • Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
  • Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
  • Practical experience with Redis and caching mechanisms to enhance application performance.
  • Proficient in RESTful API design and development, with a strong understanding of API security best practices.
  • In-depth knowledge of asynchronous programming and event-driven architecture.
  • Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
  • Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
  • Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
Read more
TechMynd Consulting

at TechMynd Consulting

2 candid answers
Suraj N
Posted by Suraj N
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 8 yrs
₹10L - ₹24L / yr
skill iconData Science
skill iconPostgreSQL
skill iconPython
Apache
skill iconAmazon Web Services (AWS)
+5 more

Senior Data Engineer


Location: Bangalore, Gurugram (Hybrid)


Experience: 4-8 Years


Type: Full Time | Permanent


Job Summary:


We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.


This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.


Key Responsibilities:

PostgreSQL & Data Modeling
  • Design and optimize complex SQL queries, stored procedures, and indexes
  • Perform performance tuning and query plan analysis
  • Contribute to schema design and data normalization

Data Migration & Transformation
  • Migrate data from multiple sources to cloud or ODS platforms
  • Design schema mapping and implement transformation logic
  • Ensure consistency, integrity, and accuracy in migrated data

Python Scripting for Data Engineering
  • Build automation scripts for data ingestion, cleansing, and transformation
  • Handle file formats (JSON, CSV, XML), REST APIs, and cloud SDKs (e.g., Boto3)
  • Maintain reusable script modules for operational pipelines

Data Orchestration with Apache Airflow
  • Develop and manage DAGs for batch/stream workflows (see the sketch after this section)
  • Implement retries, task dependencies, notifications, and failure handling
  • Integrate Airflow with cloud services, data lakes, and data warehouses
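A minimal sketch of such a DAG is shown below, assuming Airflow 2.4+; the task callables, schedule, and retry settings are placeholders chosen for illustration.

    # Minimal Airflow DAG: two dependent tasks with automatic retries.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders(**context):
        print("extract step: pull data from the source system")

    def load_orders(**context):
        print("load step: write the cleaned batch to the warehouse")

    default_args = {
        "owner": "data-eng",
        "retries": 2,                        # automatic retries on failure
        "retry_delay": timedelta(minutes=5),
    }

    with DAG(
        dag_id="orders_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_orders", python_callable=load_orders)

        extract >> load                      # task dependency: extract before load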


Cloud Platforms (AWS / Azure / GCP)
  • Manage data storage (S3, GCS, Blob), compute services, and data pipelines
  • Set up permissions, IAM roles, encryption, and logging for security
  • Monitor and optimize cost and performance of cloud-based data operations

Data Marts & Analytics Layer
  • Design and manage data marts using dimensional models
  • Build star/snowflake schemas to support BI and self-serve analytics
  • Enable incremental load strategies and partitioning

Modern Data Stack Integration
  • Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
  • Support modular pipeline design and metadata-driven frameworks
  • Ensure high availability and scalability of the stack

BI & Reporting Tools (Power BI / Superset / Supertech)
  • Collaborate with BI teams to design datasets and optimize queries
  • Support development of dashboards and reporting layers
  • Manage access, data refreshes, and performance for BI tools


Required Skills & Qualifications:

  • 4–6 years of hands-on experience in data engineering roles
  • Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
  • Advanced Python scripting skills for automation and ETL
  • Proven experience with Apache Airflow (custom DAGs, error handling)
  • Solid understanding of cloud architecture (especially AWS)
  • Experience with data marts and dimensional data modeling
  • Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
  • Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
  • Version control (Git) and CI/CD pipeline knowledge is a plus
  • Excellent problem-solving and communication skills

Read more