Python Jobs in Bangalore (Bengaluru)

50+ Python Jobs in Bangalore (Bengaluru) | Python Job openings in Bangalore (Bengaluru)

Apply to 50+ Python Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Pace Wisdom Solutions
Bengaluru (Bangalore)
2 - 5 yrs
₹5L - ₹12L / yr
Odoo (OpenERP)
Python
JavaScript
HTML/CSS

Location: Bengaluru/Mangaluru 

Experience required: 2-5 years 

Key skills:  Odoo Development, Python, Frontend Technologies 

Designation: SE L1/L2/L3/ ATL 

 

Job Summary:  

We are seeking a skilled and proactive Odoo Developer to join our dynamic team. The ideal candidate will have hands-on experience in customizing, developing, and maintaining Odoo modules, with a deep understanding of Python and business processes. You will play a key role in requirement gathering, technical design, development, testing, and deployment.  
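For illustration, here is a minimal sketch of the kind of Odoo module customization this role involves; the module, model, and field names are hypothetical and the tax rule is a made-up example, not Pace Wisdom's actual code.

```python
# Minimal illustrative sketch of a custom Odoo model (names are hypothetical).
# In a real module this file would sit alongside __manifest__.py and XML views.
from odoo import api, fields, models


class ServiceRequest(models.Model):
    _name = "pace.service.request"          # hypothetical technical name
    _description = "Customer Service Request"

    name = fields.Char(required=True)
    partner_id = fields.Many2one("res.partner", string="Customer")
    amount = fields.Float()
    amount_with_tax = fields.Float(compute="_compute_amount_with_tax", store=True)

    @api.depends("amount")
    def _compute_amount_with_tax(self):
        # Example business rule: apply a flat 18% tax (illustrative only).
        for record in self:
            record.amount_with_tax = record.amount * 1.18
```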


Key Responsibilities:  

  • Develop, customize, and maintain Odoo modules as per business requirements.  
  • Analyze, design, and develop new modules and features in Odoo ERP.  
  • Troubleshoot, debug, and upgrade existing Odoo modules.  
  • Integrate Odoo with third-party platforms using APIs/web services.  
  • Provide technical support and training to end-users.  
  • Collaborate with functional consultants and stakeholders to gather requirements and deliver scalable ERP solutions.  
  • Write clean, reusable, and efficient Python code and maintain technical documentation.  


Required Skills & Qualifications:  

  • 2-5 years of proven experience as an Odoo Developer.  
  • Strong knowledge of Python, PostgreSQL, and Odoo framework (ORM, QWeb, XML).  
  • Experience in Odoo custom module development and Odoo standard modules   
  • Good understanding of Odoo backend and frontend (JavaScript, HTML, CSS).  
  • Experience with Odoo APIs and web services (REST/SOAP).  
  • Familiarity with Linux environments, Git version control.  
  • Ability to work independently and in a team with minimal supervision.  
  • Good analytical and problem-solving skills.  
  • Strong verbal and written communication skills.  
  • Knowledge of Odoo deployment (Linux, Docker, Nginx, Odoo.sh) is a plus. 

 

About the Company:   


Pace Wisdom Solutions is a deep-tech product engineering and consulting firm. We have offices in San Francisco, Bengaluru, and Singapore. We specialize in designing and developing bespoke software solutions that solve niche business problems.  


We engage with our clients at various stages:  


  • Right from the idea stage to scope out business requirements.  
  • Design & architect the right solution and define tangible milestones.  
  • Set up dedicated and on-demand tech teams for agile delivery.  
  • Take accountability for successful deployments to ensure efficient go-to-market implementations. 
Read more
GK

at GK

ashok s
Posted by ashok s
Coimbatore, Kerala, Bengaluru (Bangalore)
1 - 8 yrs
₹3.5L - ₹5L / yr
Python
C++
Cloud Computing
Big Data
HTML/CSS
+4 more


CSE graduates can choose from a variety of impactful roles, ranging from hands-on technical positions to strategy-driven leadership, depending on their expertise, interests, and projects.

Read more
Intellikart Ventures LLP
ramandeep intellikart
Posted by ramandeep intellikart
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹25L / yr
LangChain
LangGraph
Linux kernel
LLMs
Prompt engineering
+3 more

Job Summary:

We are hiring a Data Scientist – Gen AI with hands-on experience in developing Agentic AI applications using frameworks like LangChain, LangGraph, Semantic Kernel, or Microsoft Copilot. The ideal candidate will be proficient in Python, LLMs, and prompt engineering techniques such as RAG and Chain-of-Thought prompting.
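For illustration, here is a minimal, framework-agnostic sketch of the few-shot / Chain-of-Thought prompting mentioned above; the examples and the call_llm client are hypothetical placeholders rather than any specific vendor API.

```python
# Minimal sketch of few-shot + Chain-of-Thought prompt assembly (framework-agnostic).
# `call_llm` is a hypothetical placeholder for whichever LLM client the team uses.
FEW_SHOT_EXAMPLES = [
    {
        "question": "A customer ordered 3 items at 250 INR each. What is the total?",
        "reasoning": "3 items x 250 INR = 750 INR.",
        "answer": "750 INR",
    },
]

def build_cot_prompt(question: str) -> str:
    """Assemble a few-shot prompt that asks the model to reason step by step."""
    parts = ["Answer the question. Think step by step before giving the final answer.\n"]
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(f"Q: {ex['question']}\nReasoning: {ex['reasoning']}\nA: {ex['answer']}\n")
    parts.append(f"Q: {question}\nReasoning:")
    return "\n".join(parts)

# Usage (hypothetical client call):
# response = call_llm(build_cot_prompt("A cart has 4 items at 120 INR each. Total?"))
```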


Key Responsibilities:

  • Build and deploy Agent AI applications using LLM frameworks.
  • Apply advanced prompt engineering (Zero-Shot, Few-Shot, CoT).
  • Integrate Retrieval-Augmented Generation (RAG).
  • Develop scalable solutions in Python using NumPy, Pandas, TensorFlow/PyTorch.
  • Collaborate with teams to deliver business-aligned Gen AI solutions.


Must-Have Skills:

  • Experience with LangChain, LangGraph, or similar (priority given).
  • Strong understanding of LLMs, RAG, and prompt engineering.
  • Proficiency in Python and relevant ML libraries.


Nice-to-Have:

  • Wrapper API development for LLMs.
  • REST API integration within Agentic workflows.


Qualifications:

  • Bachelor’s/Master’s in CS, Data Science, AI, or related.
  • 4–7 years in AI/ML/Data Science, with 1–2 years in Gen AI/LLMs.


Read more
Edstellar.com

at Edstellar.com

2 candid answers
partha Sarathy
Posted by partha Sarathy
Bengaluru (Bangalore)
0 - 0 yrs
₹3L - ₹3L / yr
HTML/CSS
JavaScript
Python
Git
Version Control
+3 more

Greetings from Edstellar

We are looking for a Vibe Coder at entry level.


Position Overview

We're seeking passionate fresh graduates who are natural Vibe Coders - developers who code with intuition, creativity, and genuine enthusiasm for building amazing applications. Perfect for recent grads who bring fresh energy and innovative thinking to development.


Key Responsibilities

Build dynamic web and mobile applications with creative flair

Code with passion and embrace experimental approaches

Learn and implement emerging technologies rapidly

Collaborate in our innovation-friendly environment

Prototype ideas and iterate with speed and creativity

Bring fresh perspectives to development challenges


Required Qualifications

Education: Bachelor's in Computer Science/IT or related field

Experience: Fresh graduate (0-1 years)


Technical Skills:

Solid programming fundamentals (any language)

Basic web development (HTML, CSS, JavaScript)

Understanding of application development concepts

Familiarity with Git/version control

Creative problem-solving mindset


Preferred:

Good understanding of Python, JavaScript frameworks, or a modern tech stack

AI tool familiarity

Mobile development interest

Open source contributions


Vibe Coder DNA

Passionate about coding and building innovative apps

Thrives with creative freedom and flexible approaches

Loves experimenting with new technologies

Values innovation and thinking outside the box

Natural curiosity and eagerness to learn

Collaborative spirit with independent drive

Resilient and adaptable to change



Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Anywhere in India, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

·        6-7 years of IT development experience, with a minimum of 3 years of hands-on experience in Snowflake

·        Strong experience in building/designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Good experience with Snowflake RBAC and data security

·        Strong experience with Snowflake features, including newly released features

·        Good experience in Python/PySpark

·        Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Experience/knowledge of orchestration and scheduling tools such as Airflow

·        Good understanding of ETL or ELT processes and ETL tools

Read more
Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

·        5+ years of IT development experience, with a minimum of 3 years of hands-on experience in Snowflake

·        Strong experience in building/designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Good experience with Snowflake RBAC and data security

·        Strong experience with Snowflake features, including newly released features

·        Good experience in Python/PySpark

·        Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Experience/knowledge of orchestration and scheduling tools such as Airflow

·        Good understanding of ETL or ELT processes and ETL tools

Read more
IndArka Energy Pvt Ltd

at IndArka Energy Pvt Ltd

3 recruiters
Mita Hemant
Posted by Mita Hemant
Bengaluru (Bangalore)
3 - 4 yrs
₹20L - ₹25L / yr
Python
Django
Data Structures
Algorithms

About us

Arka Energy is focused on changing the paradigm in energy. Arka creates innovative renewable energy solutions for residential customers. With its custom product design and an innovative approach to marketing the product solution, Arka aims to be a leading provider of energy solutions in the residential solar segment. Arka designs and develops end-to-end renewable energy solutions with teams in Bangalore and in the Bay Area.

The product is a 3D simulation software used to replicate rooftops and commercial sites, place solar panels, and estimate the solar energy generated.
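For a rough sense of the domain, here is a back-of-the-envelope sketch of a solar yield estimate; the formula, constants, and numbers are illustrative assumptions only, not Arka's actual simulation model, which works from a full 3D rooftop scene.

```python
# Back-of-the-envelope solar yield estimate (illustrative assumptions only).
def estimate_annual_energy_kwh(
    panel_area_m2: float,
    panel_efficiency: float = 0.20,                # assumed module efficiency
    irradiance_kwh_per_m2_year: float = 1700.0,    # assumed annual irradiance for the site
    performance_ratio: float = 0.75,               # assumed losses (shading, temperature, wiring)
) -> float:
    return panel_area_m2 * panel_efficiency * irradiance_kwh_per_m2_year * performance_ratio

print(round(estimate_annual_energy_kwh(panel_area_m2=20.0)))  # ~5100 kWh/year for 20 m2
```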

What are we looking for?

·        As a backend developer you will be responsible for developing solutions that will enable Arka solutions to be easily adopted by customers.

·        Attention to detail and willingness to learn is a big part of this position.

·        Commitment to problem solving, and innovative design approaches are important.

Role and responsibilities

●       Develop cloud-based Python Django software products

●       Working closely with UX and Front-end Developers

●       Participating in architectural, design, and product discussions
●       Designing and creating RESTful APIs for internal and partner consumption

●       Working in an agile environment with an excellent team of engineers

●       Own/maintain code everything from development to fixing bugs/issues.

●       Deliver clean, reusable high-quality code

●       Facilitate problem diagnosis and resolution for issues reported by Customers

●       Deliver to schedule and timelines based on an Agile/Scrum-based approach

●       Develop new features and ideas to make the product better and more user-centric.

●       Must be able to independently write code and test major features, as well as work jointly with other team members to deliver complex changes

●       Create algorithms from scratch and implement them in the software.

●       Code Review, End to End Unit Testing.

●       Guiding and monitoring Junior Engineers.

SKILL REQUIREMENTS

●       Solid database skills in a relational database (e.g. PostgreSQL, MySQL)
●       Knowledge of how to build and use RESTful APIs

●        Strong knowledge of version control (e.g. Git, SVN)

●        Experience deploying Python applications into production

●        Azure or Google cloud infrastructure knowledge is a plus

●       Strong drive to learn new technologies

●       Ability to learn new technologies quickly

●       Continuous look-out for new and creative solutions to implement new features or improve old ones

●       Data Structures, Algorithms, Django and Python

 

 

 

Good To have

·        Knowledge on GenAI Applications.

 

 

Key Benefits

·        Competitive development environment

·        Engagement into full scale systems development

·        Competitive Salary

·        Flexible working environment

·        Equity in an early-stage start-up

·        Patent Filing Bonuses

·        Health Insurance for Employee + Family

 

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
Data Science
Python
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques. 

‎ The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in NLP and chatbots domains is preferred.

We acknowledge the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data-, ML- nor software-engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other DS and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and no-SQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms and techniques in machine learning and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…

· Certifications or courses in data science.

Education:

· Master’s/Bachelors/PhD Degree in Computer Science, Engineering, Data Science, or a related field. 

Process Skills:

· Understanding of  Agile and Scrum  methodologies.  

· Ability to follow SDLC processes and contribute to technical documentation.  

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Bhavya M
Posted by Bhavya M
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Chef
Python

Key Responsibilities:

· Lead the design and implementation of scalable infrastructure using IaC principles.

· Develop and manage configuration management tools primarily with Chef.

· Write and maintain automation scripts in Python to streamline infrastructure tasks.

· Build, manage, and version infrastructure using Terraform.

· Collaborate with cloud architects and DevOps teams to ensure highly available, secure, and scalable systems.

· Provide guidance and mentorship to junior engineers.

· Monitor infrastructure performance and provide optimization recommendations.

· Ensure compliance with best practices for security, governance, and automation.

· Maintain and improve CI/CD pipelines with infrastructure integration.

· Support incident management, troubleshooting, and root cause analysis for infrastructure issues.


Required Skills & Experience:

· Strong hands-on experience in:

o Chef (Cookbooks, Recipes, Automation)

o Python (Scripting, automation tasks, REST APIs)

o Terraform (Modules, state management, deployments)

· Experience in AWS services (EC2, VPC, IAM, S3, etc.)

· Familiarity with Windows administration and automation.

· Solid understanding of CI/CD processes, infrastructure lifecycle, and Git-based workflow

Read more
A 5K-headcount IT company in digital transformation services

Agency job
via B2TechServices by Suma Raju
Bengaluru (Bangalore), Chennai, Hyderabad
4 - 6 yrs
₹14L - ₹18L / yr
Python
Automation

Immediate Hiring: L2/L3 Network Protocol Test Engineers (Python Automation)

📍 Locations: Bangalore | Chennai | Hyderabad

🧑‍💻 Experience: 4+ Years


👨‍💻 Open Position:

L2/L3 Network Protocol Test Engineer

(Strong Python automation skills required)

✅ Requirements:

In-depth knowledge of L2/L3 protocols: Ethernet, VLAN, xSTP, OSPF, BGP, LACP

Hands-on experience with Python scripting for test automation

Experience with tools like IXIA, Spirent, or similar traffic generators

Strong skills in test planning, execution, and bug tracking

Excellent communication and team collaboration skills

🌟 Nice to Have:

Exposure to MPLS, EVPN, or other advanced networking protocols


Read more
Moative

at Moative

3 candid answers
Eman Khan
Posted by Eman Khan
Chennai, Bengaluru (Bangalore)
1 - 6 yrs
₹15L - ₹30L / yr
MLOps
MLflow
Kubeflow
Windows Azure
Machine Learning (ML)
+4 more

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Role

We seek experienced ML/AI professionals with strong backgrounds in computer science, software engineering, or related fields to join our Azure-focused MLOps team. If you're passionate about deploying complex machine learning models in real-world settings, bridging the gap between research and production, and working on high-impact projects, this role is for you.


Work you’ll do

As an operations engineer, you'll oversee the entire ML lifecycle on Azure, spanning initial proofs-of-concept to large-scale production deployments. You'll build and maintain automated training, validation, and deployment pipelines using Azure DevOps, Azure ML, and related services, ensuring models are continuously monitored, optimized for performance, and cost-effective. By integrating MLOps practices such as MLflow and CI/CD, you'll drive rapid iteration and experimentation. In close collaboration with senior ML engineers, data scientists, and domain experts, you'll deliver robust, production-grade ML solutions that directly impact business outcomes. 
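For illustration, here is a minimal sketch of the MLflow-style experiment tracking mentioned above; the experiment name, model, and metric are placeholders and the dataset is a toy example.

```python
# Minimal sketch of MLflow experiment tracking (names and values are placeholders).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-azure-mlops")  # hypothetical experiment name

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    alpha = 0.5
    model = Ridge(alpha=alpha).fit(X_train, y_train)
    mlflow.log_param("alpha", alpha)
    mlflow.log_metric("r2", r2_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for later deployment
```

In a CI/CD setup, a run like this would be triggered from the pipeline so every retraining is tracked and reproducible.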


Responsibilities

  • ML-focused DevOps: Set up robust CI/CD pipelines with a strong emphasis on model versioning, automated testing, and advanced deployment strategies on Azure.
  • Monitoring & Maintenance: Track and optimize the performance of deployed models through live metrics, alerts, and iterative improvements.
  • Automation: Eliminate repetitive tasks around data preparation, model retraining, and inference by leveraging scripting and infrastructure as code (e.g., Terraform, ARM templates).
  • Security & Reliability: Implement best practices for securing ML workflows on Azure, including identity/access management, container security, and data encryption.
  • Collaboration: Work closely with the data science teams to ensure model performance is within agreed SLAs, both for training and inference.


Skills & Requirements

  • 2+ years of hands-on programming experience with Python (PySpark or Scala optional).
  • Solid knowledge of Azure cloud services (Azure ML, Azure DevOps, ACI/AKS).
  • Practical experience with DevOps concepts: CI/CD, containerization (Docker, Kubernetes), infrastructure as code (Terraform, ARM templates).
  • Fundamental understanding of MLOps: MLflow or similar frameworks for tracking and versioning.
  • Familiarity with machine learning frameworks (TensorFlow, PyTorch, XGBoost) and how to operationalize them in production.
  • Broad understanding of data structures and data engineering.


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term, while acting with urgency. Our ethos is rooted in innovation, efficiency and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless.


Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn't done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps in on purpose, unless we constantly question it. We are deliberate about committing to rituals that take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don't need a person with the highest title to say that out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work is of interest to you, please apply here. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹16L / yr (Varies)
Computer Vision
Machine Learning (ML)
Natural Language Processing (NLP)
Python
Generative AI
+4 more

Role Overview

We are seeking a passionate and skilled Machine Learning Engineer to join our team. The ideal candidate will have a strong background in machine learning, data science, and software engineering. As a Machine Learning Engineer, you will work closely with our clients and internal teams to develop, implement, and maintain machine learning models that solve real-world problems.


Must Have Skills

• 2+ years of experience into Computer vision and NLP projects.

• 2+ years of experience in machine learning and Gen AI, data science, or a related field.

• Strong experience in python programming

• Understanding of data structures, data modeling and software architecture

• Deep knowledge of math, probability, statistics and algorithms

• Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)

• Excellent communication skills

• Ability to work in a team

• Outstanding analytical and problem-solving skills

• BSc in Computer Science, Mathematics or similar field; Master’s degree is a plus


Role and Responsibilities

• Study and transform data science prototypes
• Design machine learning systems

• Research and implement appropriate ML algorithms and tools

• Develop machine learning applications according to requirements

• Select appropriate datasets and data representation methods

• Run machine learning tests and experiments

• Perform statistical analysis and fine-tuning using test results

• Train and retrain systems when necessary

• Extend existing ML libraries and frameworks

• Keep abreast of developments in the field

Read more
Potentiam
karishma raj
Posted by karishma raj
Bengaluru (Bangalore)
6 - 12 yrs
₹22L - ₹30L / yr
Python
Django

About Potentiam

Potentiam helps SME companies build world-class offshore teams. Our model is our locations and your dedicated staff under your control. Potentiam has offices in Iasi (Romania), Bangalore, and Cape Town, home to large, liquid pools of offshore talent working for international companies. Potentiam's management team has over 15 years' experience building offshore teams, with specialist functional expertise to support the transition offshore in technology, finance, operations, engineering, digital marketing, and analytics. For decades, corporations' scale has enabled them to benefit from the cost and skills advantages of offshore operations. Now SME companies can enjoy a similar benefit through Potentiam without any upfront investment.


Location : Bangalore ( Hybrid)


Experience - 6+ Years



Professional Experience:

  • Experience using a Python backend web framework (like Django, Flask or FastAPI)
  • In particular, experience building performant and reliable APIs and integrations
  • Competency using SQL and ORMs
  • Some experience with frontend web development using a JavaScript framework (such as Vue.js or React) would be a bonus
  • Understanding of some of the following: Django Rest Framework, PostgreSQL, Celery, Docker, nginx, AWS

Benefits and Perks

  • Health Insurance
  • Referral Bonus
  • Performance Bonus
  • Flexible Working options


Job Types: Full-time, Permanent


Read more
Potentiam
Dipanjan Das
Posted by Dipanjan Das
Bengaluru (Bangalore)
5 - 10 yrs
₹25L - ₹35L / yr
Python
Machine learning models
NumPy
Docker

● Proven experience in training, evaluating and deploying machine learning models

● Solid understanding of data science and machine learning concepts

● Experience with machine learning / data engineering tech in Python (such as NumPy, PyTorch, pandas/polars, Airflow, etc.)

● Experience developing data products using large language models, prompt engineering, and model evaluation.

● Experience with web services and programming (such as Python, docker, databases etc.)  

● Understanding of some of the following: FastAPI, PostgreSQL, Celery, Docker, AWS, Modal, git, continuous integration. 

Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹12L / yr (Varies)
Python
Django
FastAPI
SQL
NOSQL Databases
+3 more

About Role

We are seeking a skilled Backend Engineer with 2+ years of experience to join our dynamic team, focusing on building scalable web applications using Python frameworks (Django/FastAPI) and cloud technologies. You'll be instrumental in developing and maintaining our cloud-native backend services.
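For illustration, here is a minimal sketch of a containerizable FastAPI service of the kind described; the resource name and fields are hypothetical, and the in-memory store stands in for a real database layer.

```python
# Minimal FastAPI service sketch (resource name and fields are hypothetical).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    id: int
    item: str
    quantity: int = 1

_ORDERS: dict[int, Order] = {}   # stand-in for a real database layer

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    _ORDERS[order.id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="Order not found")
    return _ORDERS[order_id]

# Run locally: uvicorn main:app --reload  (then containerize with a standard Dockerfile)
```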


Responsibilities:

  1. Design and develop scalable backend services using Django and FastAPI
  2. Create and maintain RESTful APIs
  3. Implement efficient database schemas and optimize queries
  4. Implement containerisation using Docker and container orchestration
  5. Design and implement cloud-native solutions using microservices architecture
  6. Participate in technical design discussions, code reviews and maintain coding standards
  7. Document technical specifications and APIs
  8. Collaborate with cross-functional teams to gather requirements, prioritise tasks, and contribute to project completion.

Requirements:

  1. Experience with Django and/or FastAPI (2+ years)
  2. Proficiency in SQL and ORM frameworks
  3. Docker containerisation and orchestration
  4. Proficiency in shell scripting (Bash/PowerShell)
  5. Understanding of microservices architecture
  6. Experience building serverless backends
  7. Knowledge of deployment and debugging on cloud platforms (AWS/Azure)
Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
Mirorin

at Mirorin

2 candid answers
Indrani Dutta
Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 8 yrs
₹6L - ₹14L / yr
MongoDB
Django
WebSocket
Redux/Flux
SQL
+8 more

Role Overview

·        We are seeking a passionate and experienced Full Stack Developer skilled in MERN stack and Python (Django/Flask) to build and scale high-impact features across our web and mobile platforms. You will collaborate with cross-functional teams to deliver seamless user experiences and robust backend systems.

 

Key Responsibilities

·        Design, develop, and maintain scalable web applications using MySQL/Postgres, MongoDB, Express.js, React.js, and Node.js

·        Build and manage RESTful APIs and microservices using Python (Django/Flask/FastAPI)

·        Integrate with third-party platforms like OpenAI, WhatsApp APIs (Whapi), Interakt, and Zoho

·        Optimize performance across the frontend and backend

·        Collaborate with product managers, designers, and other developers to deliver high-quality features

·        Ensure security, scalability, and maintainability of code

·        Write clean, reusable, and well-documented code

·        Contribute to DevOps, CI/CD, and server deployment workflows (AWS/Lightsail)

·        Participate in code reviews and mentor junior developers if needed

 

Required Skills

·        Strong experience with MERN Stack: MongoDB, Express.js, React.js, Node.js

·        Proficiency in Python and web frameworks like Django, Flask, or FastAPI

·        Experience working with REST APIs, JWT/Auth, and WebSockets

·        Good understanding of frontend design systems, state management (Redux/Context), and responsive UI

·        Familiarity with database design and queries (MongoDB, PostgreSQL/MySQL)

·        Experience with Git, Docker, and deployment pipelines

·        Comfortable working in Linux-based environments (e.g., Ubuntu on AWS)

 

Bonus Skills

·        Experience with AI integrations (e.g., OpenAI, LangChain)

·        Familiarity with WooCommerce, WordPress APIs

·        Experience in chatbot development or WhatsApp API integration

 

Who You Are

·        You are a problem-solver with a product-first mindset

·        You care about user experience and performance

·        You enjoy working in a fast-paced, collaborative environment

·        You have a growth mindset and are open to learning new technologies

 

Why Join Us?

·        Work at the intersection of healthcare, community, and technology

·        Directly impact the lives of women across India and beyond

·        Flexible work environment and collaborative team

·        Opportunity to grow with a purpose-driven startup

Read more
Mirorin

at Mirorin

2 candid answers
Indrani Dutta
Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 8 yrs
₹6L - ₹15L / yr
SQL
Python
Data Analytics
Business Intelligence (BI)

Role Overview

We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.

You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.
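For illustration, here is a small sketch of the kind of engagement analysis this role involves; the column names and values are hypothetical.

```python
# Small sketch of a KPI-style summary with pandas (column names are hypothetical).
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 3],
    "channel": ["app", "whatsapp", "web", "app", "app", "web"],
    "event_date": pd.to_datetime(
        ["2024-05-01", "2024-05-03", "2024-05-02", "2024-05-01", "2024-05-08", "2024-05-09"]
    ),
})

# Weekly active users per channel, a typical engagement KPI.
weekly_active = (
    events.assign(week=events["event_date"].dt.to_period("W"))
          .groupby(["week", "channel"])["user_id"]
          .nunique()
          .rename("active_users")
          .reset_index()
)
print(weekly_active)
```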

 

Key Responsibilities

·        Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).

·        Build dashboards and reports to track KPIs, user behaviour, and marketing performance.

·        Collaborate with product, marketing, and customer teams to uncover actionable insights.

·        Support experiments, A/B testing, and cohort analysis to drive growth and retention.

·        Assist in documentation and communication of findings to technical and non-technical teams.

·        Work with the data team to enhance personalization and AI features (optional).

 

Required Qualifications

·        Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.

·        2 – 4 years of experience in data analysis or business intelligence.

·        Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).

·        Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.)

·        Ability to translate complex data into simple visual stories and clear recommendations.

·        Strong attention to detail and a mindset for experimentation.

 

Preferred (Not Mandatory)

·        Exposure to GenAI, LLMs (e.g., OpenAI, HuggingFace), or NLP concepts.

·        Experience working with healthcare, wellness, or e-commerce datasets.

·        Familiarity with REST APIs, JSON structures, or chatbot systems.

·        Interest in building tools that impact women’s health and wellness. 


Why Join Us?

·        Be part of a high-growth startup tackling a real need in women’s healthcare.

·        Work with a passionate, purpose-driven team.

·        Opportunity to grow into GenAI/ML-focused roles as we scale.

·        Competitive salary and career progression

 

 

Best Regards,

Indrani Dutta

MIROR THERAPEUTICS PRIVATE LIMITED

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Bengaluru (Bangalore)
8 - 12 yrs
Best in industry
Python
Terraform
Chef

Job Summary:

The Lead IaC Engineer will design, implement, automate, and maintain infrastructure across on-premises and cloud environments. This role requires strong hands-on expertise in Chef, Python, and Terraform, along with some AWS and Windows administration knowledge.
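For illustration, here is a small sketch of the kind of Python automation that complements Chef and Terraform in this role; it is read-only, and the "Owner" tag convention and region are assumptions.

```python
# Small sketch of a Python infrastructure audit helper (complements Chef/Terraform runs).
# Read-only: lists running EC2 instances missing an "Owner" tag (tag key is an assumption).
import boto3

def instances_missing_owner_tag(region: str = "ap-south-1") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    ):
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                if "Owner" not in tags:
                    missing.append(instance["InstanceId"])
    return missing

if __name__ == "__main__":
    print(instances_missing_owner_tag())
```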


8-12 years of experience

Primary Skills – Chef, Python, and Terraform

Secondary – AWS & Windows admin (Cloud is not mandatory)

Read more
Trantor

at Trantor

1 recruiter
Nikita Sinha
Posted by Nikita Sinha
Remote, Bengaluru (Bangalore)
6 - 10 yrs
Upto ₹22L / yr (Varies)
Python
SQL
CI/CD

We are looking for an experienced and detail-oriented Senior Performance Testing Engineer to join our QA team. The ideal candidate will be responsible for designing, developing, and executing scalable and reliable performance testing strategies. You will lead performance engineering initiatives using tools like Locust, Python, Docker, Kubernetes, and cloud-native environments (AWS), ensuring our systems meet performance SLAs under real-world usage patterns.
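For illustration, here is a minimal Locust scenario of the kind this role builds and maintains; the endpoints, payloads, and task weights are placeholders.

```python
# Minimal Locust scenario sketch (endpoints, payloads, and task weights are placeholders).
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 3)   # think time between requests, in seconds

    @task(3)                    # weighted: browsing runs ~3x as often as cart updates
    def browse_catalog(self):
        self.client.get("/api/products", name="GET /api/products")

    @task(1)
    def add_to_cart(self):
        self.client.post("/api/cart", json={"sku": "demo-sku", "qty": 1}, name="POST /api/cart")

# Typical runs:
#   locust -f locustfile.py --host https://staging.example.com                  # interactive web UI
#   locust -f locustfile.py --host https://staging.example.com --headless -u 200 -r 20 -t 10m   # CI-friendly
```

The headless form is what usually gets wired into a CI/CD stage, with users (-u), spawn rate (-r), and duration (-t) tuned to the workload model.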


Key Responsibilities

  • Develop, enhance, and maintain Locust performance scripts using Python
  • Design realistic performance scenarios simulating real-world traffic and usage patterns
  • Parameterize and modularize scripts for robustness and reusability
  • Execute performance tests in containerized environments using Docker and Kubernetes
  • Manage performance test execution on Kubernetes clusters
  • Integrate performance tests into CI/CD pipelines in collaboration with DevOps and Development teams
  • Analyze performance test results, including throughput, latency, response time, and error rates
  • Identify performance bottlenecks, conduct root cause analysis, and suggest optimizations
  • Work with AWS (or other cloud platforms) to deploy, scale, and monitor tests in cloud-native environments
  • Write and optimize complex SQL queries, stored procedures, and perform DB performance testing
  • Work with SQL Server extensively; familiarity with Postgres is a plus
  • Develop and maintain performance testing strategies and test plans
  • Define and track KPIs, SLAs, workload models, and success criteria
  • Guide the team on best practices and promote a performance engineering mindset

Must-Have Qualifications

  • Proven hands-on experience with Locust and Python for performance testing
  • Working knowledge of microservices architecture
  • Hands-on with Kubernetes and Docker, especially in the context of running Locust at scale
  • Experience integrating performance tests in CI/CD pipelines
  • Strong experience with AWS or similar cloud platforms for deploying and scaling tests
  • Solid understanding of SQL Server, including tuning stored procedures and query optimization
  • Strong experience in performance test planning, execution, and analysis

Good-to-Have Skills

  • Exposure to Postgres DB
  • Familiarity with observability tools like Prometheus, Grafana, CloudWatch, and Datadog
  • Basic knowledge of APM (Application Performance Monitoring) tools
Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹10L - ₹24L / yr
Python
FastAPI
Flask
API management
RESTful APIs
+8 more

Job Title : Python Developer – API Integration & AWS Deployment

Experience : 5+ Years

Location : Bangalore

Work Mode : Onsite


Job Overview :

We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.

The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.
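For illustration, here is a minimal sketch of exposing a simulation run through a REST API; run_power_flow_case is a hypothetical wrapper around the PSS®E psspy workflow, not actual psspy calls, and the solver name is an assumption.

```python
# Minimal sketch of exposing a PSS(R)E automation step through a REST API.
# `run_power_flow_case` is a hypothetical wrapper around psspy; the real implementation
# would call the PSS(R)E Python API (case loading, solution, result extraction).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="psse-automation-api")

class CaseRequest(BaseModel):
    case_path: str            # e.g. an S3/EFS path to a saved case file
    solver: str = "fnsl"      # assumed default solution method name

def run_power_flow_case(case_path: str, solver: str) -> dict:
    """Placeholder for the psspy-driven workflow (load case, solve, extract results)."""
    raise NotImplementedError

@app.post("/simulations/power-flow")
def power_flow(req: CaseRequest) -> dict:
    try:
        return run_power_flow_case(req.case_path, req.solver)
    except NotImplementedError:
        raise HTTPException(status_code=501, detail="psspy integration not wired up in this sketch")
```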


Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.


Key Responsibilities :

Python Development & API Integration :

  • Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
  • Automate simulations and workflows using the PSS®E Python API (psspy).
  • Implement robust bulk case processing, result extraction, and automated reporting systems.


AWS Cloud Deployment :

  • Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
  • Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
  • Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.


Required Skills :

  • 5+ Years of professional experience in Python development.
  • Hands-on experience with RESTful API development (FastAPI/Flask).
  • Solid experience working with PSS®E and its psspy Python API.
  • Strong understanding of AWS services, deployment, and best practices.
  • Proficiency in automation, scripting, and report generation.
  • Knowledge of cloud security and monitoring tools like IAM and CloudWatch.

Good to Have :

  • Experience in power system simulation and electrical engineering concepts.
  • Familiarity with CI/CD tools for AWS deployments.
Read more
I-Stem

at I-Stem

2 candid answers
Sahil Garg
Posted by Sahil Garg
Bengaluru (Bangalore)
2 - 4 yrs
₹20L - ₹25L / yr
Python
PyTorch
TensorFlow
Docker
Kubernetes
+2 more

You will:

  • Collaborate with the I-Stem Voice AI team and CEO to design, build and ship new agent capabilities
  • Develop, test and refine end-to-end voice agent models (ASR, NLU, dialog management, TTS)
  • Stress-test agents in noisy, real-world scenarios and iterate for improved robustness and low latency
  • Research and prototype cutting-edge techniques (e.g. robust speech recognition, adaptive language understanding)
  • Partner with backend and frontend engineers to seamlessly integrate AI components into live voice products
  • Monitor agent performance in production, analyze failure cases, and drive continuous improvement
  • Occasionally demo our Voice AI solutions at industry events and user forums


You are:

  • An AI/Software Engineer with hands-on experience in speech-centric ML (ASR, NLU or TTS)
  • Skilled in building and tuning transformer-based speech models and handling real-time audio pipelines
  • Obsessed with reliability: you design experiments to push agents to their limits and root-cause every error
  • A clear thinker who deconstructs complex voice interactions from first principles
  • Passionate about making voice technology inclusive and accessible for diverse users
  • Comfortable moving fast in a small team, yet dogged about code quality, testing and reproducibility


Read more
Appiness Interactive Pvt. Ltd.
S Suriya Kumar
Posted by S Suriya Kumar
Bengaluru (Bangalore)
4 - 8 yrs
₹4L - ₹12L / yr
Python
Django
MySQL

Company Description

Appiness Interactive Pvt. Ltd. is a Bangalore-based product development and UX firm that specializes in digital services for startups to Fortune-500s. We work closely with our clients to create a comprehensive soul for their brand in the online world, engaged through multiple platforms of digital media. Our team is young, passionate, and aggressive, not afraid to think out of the box or tread the un-trodden path in order to deliver the best results for our clients.

We pride ourselves on Practical Creativity, where the idea is only as good as the returns it fetches for our clients.

We are looking for an experienced Backend Developer with a strong foundation in Python, Django, and MySQL to join our development team. The ideal candidate should have at least 4 years of hands-on experience building scalable, secure, and high-performing web applications and APIs. You will play a critical role in developing server-side logic, managing database operations, and ensuring optimal application performance.
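For illustration, here is a minimal Django sketch of the kind of model and endpoint this role involves; the model and field names are hypothetical.

```python
# Minimal Django sketch (model and field names are hypothetical).
# models.py (MySQL is configured via ENGINE 'django.db.backends.mysql' in settings.py)
from django.db import models
from django.http import JsonResponse

class Client(models.Model):
    name = models.CharField(max_length=120)
    email = models.EmailField(unique=True)
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.name

# views.py: a small JSON list endpoint with no extra dependencies
def client_list(request):
    data = list(Client.objects.values("id", "name", "email"))
    return JsonResponse({"clients": data})
```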


Key Responsibilities:

● Design, develop, test, and maintain robust backend systems using Python and Django.

● Build RESTful APIs and integrate them with front-end components or third-party systems.

● Design and optimize relational database schemas in MySQL.

● Write clean, maintainable, and efficient code following best practices.

● Optimize application performance and troubleshoot production issues.

● Ensure the security and data protection of applications.

● Collaborate with front-end developers, QA, DevOps, and product teams.

● Participate in code reviews and mentor junior developers (if applicable).


Required Skills:

● Strong programming skills in Python, with in-depth knowledge of the Django framework.

● Experience in designing, maintaining, and querying MySQL databases.

● Understanding of MVC design patterns and RESTful service architecture.

● Familiarity with Git version control.

● Knowledge of software development best practices, including unit testing and CI/CD.


Educational Qualifications:

Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).


Benefits

● Competitive salary and performance bonuses.

● Health insurance

● Opportunities for professional development and career growth.

Read more
hirezyai
Aardra Suresh
Posted by Aardra Suresh
Bengaluru (Bangalore), Mumbai
7 - 14 yrs
₹15L - ₹30L / yr
Python
AWS Lambda
Docker
API
S3
+4 more

We are seeking a highly skilled Python Backend Developer with strong experience in building microservices-based architectures and cloud-native serverless solutions on AWS. The ideal candidate will be responsible for designing, developing, and maintaining scalable backend systems, ensuring high performance and responsiveness to requests from front-end applications and third-party systems.
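For illustration, here is a minimal sketch of a serverless, event-driven handler in the architecture described; the bucket name, environment variable, and message shape are placeholders.

```python
# Minimal AWS Lambda handler sketch for an event-driven step
# (bucket name, env var, and message shape are placeholders).
import json
import os

import boto3

s3 = boto3.client("s3")
RESULTS_BUCKET = os.environ.get("RESULTS_BUCKET", "example-results-bucket")  # placeholder

def lambda_handler(event, context):
    """Consume SQS records, apply a trivial transformation, and persist results to S3."""
    processed = 0
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        result = {"order_id": payload.get("order_id"), "status": "processed"}
        s3.put_object(
            Bucket=RESULTS_BUCKET,
            Key=f"orders/{result['order_id']}.json",
            Body=json.dumps(result).encode("utf-8"),
        )
        processed += 1
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```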

 

Key Responsibilities:

  • Design and develop robust backend services and RESTful APIs using Python (FastAPI, Flask, or Django)
  • Build and deploy microservices that are scalable, loosely coupled, and independently deployable
  • Develop and manage serverless applications using AWS Lambda, API Gateway, DynamoDB, S3, SNS, SQS, and Step Functions
  • Implement event-driven architectures and data processing pipelines
  • Collaborate with front-end developers, DevOps, and product teams to deliver high-quality software
  • Ensure code quality through unit testing, integration testing, and code reviews
  • Automate deployments using CI/CD pipelines and Infrastructure as Code (IaC) tools like CloudFormation or Terraform
  • Monitor, debug, and optimize backend systems for performance and scalability

 

Required Skills & Experience:

  • 7+ years of backend development experience using Python
  • Strong experience in designing and implementing microservices
  • Hands-on experience with AWS Serverless services: Lambda, API Gateway, S3, DynamoDB, SQS, SNS, etc.
  • Proficient in RESTful API design, JSON, and OpenAPI/Swagger specifications
  • Experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
  • Knowledge of CI/CD tools (e.g., GitHub Actions, Jenkins, CodePipeline)
  • Familiarity with Docker and containerized deployments
  • Strong understanding of software design patterns, clean code practices, and Agile methodologies

 

Nice to Have:

  • Experience with GraphQL or gRPC
  • Exposure to monitoring/logging tools (e.g., CloudWatch, ELK, Prometheus)
  • Knowledge of security best practices in API and cloud development
  • Familiarity with data streaming using Kafka or Kinesis


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sonali RajeshKumar
Posted by Sonali RajeshKumar
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Databases
SQL
IBM DB2
Python

Job Description: 

Years of Experience:- 5-8 Years

Location: Bangalore


Job Role:- Database Developer


Primary Skill - Database, SQL


Secondary skill - DB2 and Python


Skills:


Main Pointers for Database Developer role.


*Should have strong working experience with databases such as DB2 (good to have), SQL, or Oracle PL/SQL.

*Should have working experience with performance tuning

Read more
Indee

at Indee

2 candid answers
1 recruiter
Nikita Sinha
Posted by Nikita Sinha
Remote, Bengaluru (Bangalore)
5yrs+
Upto ₹22L / yr (Varies)
Selenium
Python
Manual testing
Cypress
Test Automation (QA)
+2 more

Must-Have Skills & Qualifications:

  • Bachelor's degree in Engineering (Computer Science, IT, or related field)
  • 5–6 years of experience in manual testing of web and mobile applications
  • Working knowledge of test automation tools: Selenium
  • Experience with API testing using tools like Postman or equivalent
  • Experience with BDD
  • Strong understanding of test planning, test case design, and defect tracking processes
  • Experience leading QA for projects and production releases
  • Familiarity with Agile/Scrum methodologies
  • Effective collaboration skills – able to work with cross-functional teams and contribute to automation efforts as needed

Good-to-Have Skills:

  • Familiarity with CI/CD pipelines and version control tools (Git, Jenkins)
  • Exposure to performance or security testing
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Hyderabad
4 - 8 yrs
₹10L - ₹24L / yr
Python
Data engineering
Amazon Web Services (AWS)
RESTful APIs
Microservices
+9 more

Job Title : Python Data Engineer

Experience : 4+ Years

Location : Bangalore / Hyderabad (On-site)


Job Summary :

We are seeking a skilled Python Data Engineer to work on cloud-native data platforms and backend services.

The role involves building scalable APIs, working with diverse data systems, and deploying containerized services using modern cloud infrastructure.


Mandatory Skills : Python, AWS, RESTful APIs, Microservices, SQL/PostgreSQL/NoSQL, Docker, Kubernetes, CI/CD (Jenkins/GitLab CI/AWS CodePipeline)


Key Responsibilities :

  • Design, develop, and maintain backend systems using Python.
  • Build and manage RESTful APIs and microservices architectures.
  • Work extensively with AWS cloud services for deployment and data storage.
  • Implement and manage SQL, PostgreSQL, and NoSQL databases.
  • Containerize applications using Docker and orchestrate with Kubernetes.
  • Set up and maintain CI/CD pipelines using Jenkins, GitLab CI, or AWS CodePipeline.
  • Collaborate with teams to ensure scalable and reliable software delivery.
  • Troubleshoot and optimize application performance.


Must-Have Skills :

  • 4+ years of hands-on experience in Python backend development.
  • Strong experience with AWS cloud infrastructure.
  • Proficiency in building microservices and APIs.
  • Good knowledge of relational and NoSQL databases.
  • Experience with Docker and Kubernetes.
  • Familiarity with CI/CD tools and DevOps processes.
  • Strong problem-solving and collaboration skills.
Read more
A leader in telecom, fintech, AI-led marketing automation.

Agency job
via Infinium Associate by Toshi Srivastava
Bengaluru (Bangalore)
9 - 15 yrs
₹25L - ₹35L / yr
MERN Stack
Python
MongoDB
Spark
Hadoop
+7 more

We are looking for a talented MERN Developer with expertise in MongoDB/MySQL, Kubernetes, Python, ETL, Hadoop, and Spark. The ideal candidate will design, develop, and optimize scalable applications while ensuring efficient source code management and implementing Non-Functional Requirements (NFRs).


Key Responsibilities:

  • Develop and maintain robust applications using MERN Stack (MongoDB, Express.js, React.js, Node.js).
  • Design efficient database architectures (MongoDB/MySQL) for scalable data handling.
  • Implement and manage Kubernetes-based deployment strategies for containerized applications.
  • Ensure compliance with Non-Functional Requirements (NFRs), including source code management, development tools, and security best practices.
  • Develop and integrate Python-based functionalities for data processing and automation.
  • Work with ETL pipelines for smooth data transformations.
  • Leverage Hadoop and Spark for processing and optimizing large-scale data operations.
  • Collaborate with solution architects, DevOps teams, and data engineers to enhance system performance.
  • Conduct code reviews, troubleshooting, and performance optimization to ensure seamless application functionality.


Required Skills & Qualifications:

  • Proficiency in MERN Stack (MongoDB, Express.js, React.js, Node.js).
  • Strong understanding of database technologies (MongoDB/MySQL).
  • Experience working with Kubernetes for container orchestration.
  • Hands-on knowledge of Non-Functional Requirements (NFRs) in application development.
  • Expertise in Python, ETL pipelines, and big data technologies (Hadoop, Spark).
  • Strong problem-solving and debugging skills.
  • Knowledge of microservices architecture and cloud computing frameworks.

Preferred Qualifications:

  • Certifications in cloud computing, Kubernetes, or database management.
  • Experience in DevOps, CI/CD automation, and infrastructure management.
  • Understanding of security best practices in application development.


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore)
4 - 8 yrs
₹7L - ₹24L / yr
Python
NumPy
pandas
Machine Learning (ML)

·        Develop and maintain scalable back-end applications using Python frameworks such as Flask/Django/FastAPI.

·        Design, build, and optimize data pipelines for ETL processes using tools like PySpark, Airflow, and other similar technologies (see the sketch after this list).

·        Work with relational and NoSQL databases to manage and process large datasets efficiently.

·        Collaborate with data scientists to clean, transform, and prepare data for analytics and machine learning models.

·        Work in a dynamic environment, at the intersection of software development and data engineering.
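As referenced above, here is a compact sketch of a PySpark ETL step; the paths, dataset, and column names are placeholders.

```python
# Compact PySpark ETL sketch (paths and column names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3a://example-bucket/raw/orders/")  # placeholder path

cleaned = (
    raw.dropna(subset=["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
)

daily_revenue = cleaned.groupBy("order_date").agg(F.sum("amount").alias("revenue"))

daily_revenue.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_revenue/")
spark.stop()
```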

Read more
NeoGenCode Technologies Pvt Ltd
Shivank Bhardwaj
Posted by Shivank Bhardwaj
Bengaluru (Bangalore)
6 - 9 yrs
₹15L - ₹30L / yr
NodeJS (Node.js)
Relational Database (RDBMS)
React.js
Angular (2+)
SQL
+8 more

Role overview


  • Overall 5 to 7 years of experience. Node.js experience is a must.
  • At least 3+ years of experience, or a couple of large-scale products delivered on microservices.
  • Strong design skills on microservices and AWS platform infrastructure.
  • Excellent programming skills in Python, Node.js and Java.
  • Hands-on development of REST APIs.
  • Good understanding of nuances of distributed systems, scalability, and availability.


What would you do here


  • To Work as a Backend Developer in developing Cloud Web Applications
  • To be part of the team working on various types of web applications related to Mortgage Finance.
  • Experience in solving the real-world problem of implementing, designing, and helping develop a new enterprise-class product from the ground up.
  • You have expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases), along with Docker and containerized solutions/applications.
  • Experienced in relational and NoSQL databases and scalable design.
  • Experience in solving challenging problems by developing elegant, maintainable code.
  • Delivered rapid iterations of software based on user feedback and metrics.
  • Help the team make key decisions on our product and technology direction.
  • You actively contribute to the adoption of frameworks, standards, and new technologies.
Read more
PGAGI
Javeriya Shaik
Posted by Javeriya Shaik
Remote, Bengaluru (Bangalore)
2 - 3 yrs
₹6L - ₹7L / yr
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
Retrieval Augmented Generation (RAG)
Python
Natural Language Processing (NLP)
+1 more

Key Responsibilities

  • Experience working with Python, LLMs, deep learning, NLP, etc.
  • Utilize GitHub for version control, including pushing and pulling code updates.
  • Work with Hugging Face and OpenAI platforms for deploying models and exploring open-source AI models.
  • Engage in prompt engineering and the fine-tuning process of AI models.

Requirements

  • Proficiency in Python programming.
  • Experience with GitHub and version control workflows.
  • Familiarity with AI platforms such as Hugging Face and OpenAI.
  • Understanding of prompt engineering and model fine-tuning.
  • Excellent problem-solving abilities and a keen interest in AI technology.
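To make the Hugging Face requirement above concrete, here is a minimal inference sketch; the model name and sample inputs are illustrative, and running it assumes the `transformers` package plus a backend such as PyTorch.

```python
# Sketch of pulling an open-source model from Hugging Face and running inference.
# Model name and inputs are illustrative assumptions.
from transformers import pipeline

# Downloads a small sentiment model on first run.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

samples = [
    "The onboarding flow was effortless.",
    "The response time is far too slow.",
]

for text, result in zip(samples, classifier(samples)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {text}")
```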


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
skill iconPython
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling B2B Integrator and integration development, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹15L / yr
skill iconPython
Generative AI
Langchain
Streamlit
Large Language Models (LLM)
+4 more

Role : Gen AI Developer / Engineer

Location : Bangalore

Experience Required : 6+ Years

Work Mode : Hybrid (2–3 days from office per week)

Contract Duration : 12 Months

Must-Have Skills :

  • Python, Gen AI, Langchain, Streamlit, LLMs.
  • Strong experience building AI/ML-based applications.
  • 2+ years of hands-on experience with LLM development.
  • Solid understanding of RAG (Retrieval-Augmented Generation), embeddings, and LLM training.
  • Proficiency in prompt engineering.
  • Hands-on experience with Azure services: Azure Search, App Services, API Management, Cosmos DB.
  • Familiarity with Azure cloud infrastructure.
  • Basic knowledge of front-end technologies like React.
  • Understanding of software engineering best practices including Git, testing, and CI/CD pipelines.
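The RAG and embeddings expectation above can be illustrated with a framework-free retrieval sketch; the hashing "embedding" below is only a stand-in for a real embedding model (for example an Azure OpenAI or sentence-transformers endpoint), and the documents and query are invented.

```python
# Minimal RAG retrieval sketch: embed documents, embed a query, pick top matches,
# and assemble a grounded prompt for the LLM.
import hashlib
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy bag-of-words hashing embedding; replace with a real embedding model."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

documents = [
    "Invoices are processed within two business days.",
    "Refunds require manager approval above 10,000 INR.",
    "The VPN must be enabled before accessing Cosmos DB.",
]
doc_vectors = np.stack([embed(d) for d in documents])

query = "How long does invoice processing take?"
scores = doc_vectors @ embed(query)                 # cosine similarity (unit-length vectors)
top_context = [documents[i] for i in scores.argsort()[::-1][:2]]

prompt = "Answer using only this context:\n" + "\n".join(top_context) + f"\n\nQuestion: {query}"
print(prompt)  # this prompt would then be sent to the LLM
```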
Read more
hirezyai
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹12L - ₹25L / yr
ArgoCD
skill iconKubernetes
skill iconDocker
helm
Terraform
+9 more

Job Summary:

We are seeking a skilled DevOps Engineer to design, implement, and manage CI/CD pipelines, containerized environments, and infrastructure automation. The ideal candidate should have hands-on experience with ArgoCD, Kubernetes, and Docker, along with a deep understanding of cloud platforms and deployment strategies.

Key Responsibilities:

  • CI/CD Implementation: Develop, maintain, and optimize CI/CD pipelines using ArgoCD, GitOps, and other automation tools.
  • Container Orchestration: Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
  • Infrastructure as Code (IaC): Automate infrastructure provisioning with Terraform, Helm, or Ansible.
  • Monitoring & Logging: Implement and maintain observability tools like Prometheus, Grafana, ELK, or Loki.
  • Security & Compliance: Ensure best security practices in containerized and cloud-native environments.
  • Cloud & Automation: Manage cloud infrastructure on AWS, Azure, or GCP with automated deployments.
  • Collaboration: Work closely with development teams to optimize deployments and performance.
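As one small slice of the CI/CD work described above, a pipeline might run a post-deploy gate after ArgoCD syncs an application; the sketch below assumes kubectl is installed and pointed at the right cluster, and the deployment and namespace names are placeholders.

```python
# Post-deploy gate sketch: wait for a Kubernetes rollout and fail the pipeline if it
# never becomes healthy, so a rollback can be triggered.
import subprocess
import sys

DEPLOYMENT = "payments-api"   # placeholder deployment name
NAMESPACE = "production"      # placeholder namespace

def wait_for_rollout(deployment: str, namespace: str, timeout: str = "120s") -> bool:
    result = subprocess.run(
        ["kubectl", "rollout", "status", f"deployment/{deployment}",
         "-n", namespace, f"--timeout={timeout}"],
        capture_output=True, text=True,
    )
    print(result.stdout or result.stderr)
    return result.returncode == 0

if __name__ == "__main__":
    if not wait_for_rollout(DEPLOYMENT, NAMESPACE):
        sys.exit(1)  # non-zero exit marks the pipeline step as failed
```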

Required Skills & Qualifications:

  • Experience: 5+ years in DevOps, Site Reliability Engineering (SRE), or Infrastructure Engineering.
  • Tools & Tech: Strong knowledge of ArgoCD, Kubernetes, Docker, Helm, Terraform, and CI/CD pipelines.
  • Cloud Platforms: Experience with AWS, GCP, or Azure.
  • Programming & Scripting: Proficiency in Python, Bash, or Go.
  • Version Control: Hands-on with Git and GitOps workflows.
  • Networking & Security: Knowledge of ingress controllers, service mesh (Istio/Linkerd), and container security best practices.

Nice to Have:

  • Experience with Kubernetes Operators, Kustomize, or FluxCD.
  • Exposure to serverless architectures and multi-cloud deployments.
  • Certifications in CKA, AWS DevOps, or similar.


Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills
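One piece of the pipeline work described above is orchestrating Glue jobs programmatically; here is a small boto3 sketch, where the job name, region, and polling cadence are illustrative assumptions.

```python
# Sketch of triggering and monitoring an AWS Glue job with boto3.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")
JOB_NAME = "daily-orders-etl"  # placeholder Glue job name

def run_glue_job(job_name: str) -> str:
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state
        time.sleep(30)  # poll every 30 seconds

if __name__ == "__main__":
    final_state = run_glue_job(JOB_NAME)
    print(f"{JOB_NAME} finished with state {final_state}")
```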


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Pune, Kolkata
4 - 6 yrs
₹7L - ₹24L / yr
skill iconPython
skill iconAmazon Web Services (AWS)
NumPy
pandas

Key Technical Skillsets-

  • Design, develop, and maintain scalable applications using AWS services, Python, and Boto3.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement best practices for cloud architecture and application development.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot and resolve issues in development, test, and production environments.
  • Write clean, maintainable, and efficient code.
  • Participate in code reviews and contribute to team knowledge sharing.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sruthy VS
Posted by Sruthy VS
Bengaluru (Bangalore), Mumbai
4 - 8 yrs
Best in industry
Snow flake schema
ETL
SQL
skill iconPython
  • Strong experience as a database developer on the Snowflake cloud data platform.
  • Knowledge of Spark and Databricks is desirable.
  • Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
  • Familiarity with technologies relevant to data lakes, such as Snowflake.
  • Strong ETL and database design/modelling skills.
  • Experience creating data pipelines (see the connector sketch after this list).
  • Strong SQL skills, debugging knowledge, and performance-tuning experience.
  • Experience with Databricks/Azure is good to have.
  • Experience working with global teams and global application environments.
  • Strong understanding of SDLC methodologies, with a track record of high-quality deliverables and data quality, including detailed technical design documentation.
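To make the Snowflake/ETL expectation above concrete, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage, and table names are placeholders, not details of this role.

```python
# Sketch of loading staged files and building a cleaned table in Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load staged CSV files into a raw table, then build an aggregate on top.
    cur.execute(
        "COPY INTO raw_orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    cur.execute("""
        CREATE OR REPLACE TABLE daily_orders AS
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM raw_orders
        GROUP BY order_date, region
    """)
    print(cur.execute("SELECT COUNT(*) FROM daily_orders").fetchone())
finally:
    conn.close()
```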

 

 

 

Read more
HeyCoach
DeepanRaj R
Posted by DeepanRaj R
Bengaluru (Bangalore)
4 - 12 yrs
₹0.1L - ₹0.1L / yr
skill iconPython
skill iconNodeJS (Node.js)
skill iconReact.js
Data Structures
Natural Language Processing (NLP)
+5 more


Tech Lead (Full Stack) – Nexa (Conversational Voice AI Platform)

Location: Bangalore · Type: Full-time

Experience: 4+ years (preferably in early-stage startups)

Tech Stack: Python (core), Node.js, React.js

 

 

About Nexa

Nexa is a new venture by the founders of HeyCoach, Pratik Kapasi and Aditya Kamat, on a mission to build the most intuitive voice-first AI platform. We’re rethinking how humans interact with machines using natural, intelligent, and fast conversational interfaces.

We're looking for a Tech Lead to join us at the ground level. This is a high-ownership, high-speed role for builders who want to move fast and go deep.

 

What You’ll Do

●     Design, build, and scale backend and full-stack systems for our voice AI engine

●     Work primarily with Python (core logic, pipelines, model integration), and support full-stack features using Node.js and React.js

●     Lead projects end-to-end—from whiteboard to production deployment

●     Optimize systems for performance, scale, and real-time processing

●     Collaborate with founders, ML engineers, and designers to rapidly prototype and ship features

 ●     Set engineering best practices, own code quality, and mentor junior team members as we grow

 

✅ Must-Have Skills

●     4+ years of experience in Python, building scalable production systems

●     Has led projects independently, from design through deployment

●     Excellent at executing fast without compromising quality

●     Strong foundation in system design, data structures and algorithms

●     Hands-on experience with Node.js and React.js in a production setup

●     Deep understanding of backend architecture—APIs, microservices, data flows

●     Proven success working in early-stage startups, especially during 0→1 scaling phases

●     Ability to debug and optimize across the full stack

●     High autonomy—can break down big problems, prioritize, and deliver without hand-holding

  

🚀 What We Value

●     Speed > Perfection: We move fast, ship early, and iterate

●     Ownership mindset: You act like a founder, even if you're not one

●     Technical depth: You’ve built things from scratch and understand what’s under the hood

●     Product intuition: You don’t just write code—you ask if it solves the user’s problem

●     Startup muscle: You’re scrappy, resourceful, and don’t need layers of process

●     Bias for action: You unblock yourself and others. You push code and push thinking

●     Humility and curiosity: You challenge ideas, accept better ones, and never stop learning

 

💡 Nice-to-Have

●     Experience with NLP, speech interfaces, or audio processing

●     Familiarity with cloud platforms (GCP/AWS), CI/CD, Docker, Kubernetes

●     Contributions to open-source or technical blogs

●     Prior experience integrating ML models into production systems

 

Why Join Nexa?

●     Work directly with founders on a product that pushes boundaries in voice AI

●     Be part of the core team shaping product and tech from day one

●     High-trust environment focused on output and impact, not hours

●     Flexible work style and a flat, fast culture

Read more
Nutanix

at Nutanix

2 recruiters
Namrata Panda
Posted by Namrata Panda
Bengaluru (Bangalore)
9 - 17 yrs
₹10L - ₹40L / yr
Test Automation (QA)
skill iconPython
DS
Algorithms

Hungry, Humble, Honest, with Heart.

 

The Opportunity

We are looking for a Senior Member of Technical Staff QA with extensive experience in automation development, strategy, and testing skills to be part of our dynamic team.

 

About the Team

At Identity & Access Management, we are trying to build the next-generation IAM to help enterprises & internal clients.

 

Your Role

As the gatekeeper of our product quality, you would be required to ensure that product releases adhere to the highest quality norms. You will design and develop a testing framework for our products and be part of a ruthless quality team. You will develop test tools, test cases, maintain test beds, and provide metrics and test/quality status.



What You Will Bring

  • Be a self-starter who can flourish in a fast-paced technology company
  • Be the owner of our automation and delivery systems
  • Actively participate in coding and code reviews
  • Have expertise with automation and build tools such as Selenium, Jenkins, REST APIs, and SDKs (see the API test sketch after this list)
  • Strong systems background (Linux systems, with exposure to process, memory, and I/O management tools)
  • Strong coding skills (data structures, algorithms)
  • Enable our engineers to generate known-quality releases with every commit, discover defects early, and iterate fast
  • Good understanding of Kubernetes
  • Good experience in building and maintaining an automation framework
  • Excellent programming and scripting capabilities to develop code for automated tests using Python/C++/Perl/Go/Java
  • Minimum 9–17 years of relevant experience
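As a flavour of the REST API automation named above, here is a small pytest + requests sketch; the base URL and endpoints are illustrative placeholders, not a Nutanix API.

```python
# Sketch of an automated REST API check using pytest and requests.
import pytest
import requests

BASE_URL = "https://api.example.test"  # placeholder service under test

@pytest.fixture(scope="session")
def session():
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    return s

def test_health_endpoint_returns_ok(session):
    response = session.get(f"{BASE_URL}/v1/health", timeout=10)
    assert response.status_code == 200
    assert response.json().get("status") == "ok"

def test_unknown_resource_returns_404(session):
    response = session.get(f"{BASE_URL}/v1/does-not-exist", timeout=10)
    assert response.status_code == 404
```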


 

Work Arrangement

Hybrid: This role operates in a hybrid capacity, blending the benefits of remote work with the advantages of in-person collaboration. For most roles, that will mean coming into an office a minimum of 2 - 3 days per week, however certain roles and/or teams may require more frequent in-office presence. Additional team-specific guidance and norms will be provided by your manager.

 

Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Hyderabad, Bengaluru (Bangalore)
3 - 8 yrs
₹5L - ₹20L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
AI
ML
skill iconPython

Desired Competencies (Technical/Behavioral Competency)

Must-Have

  1. Experience in working with various ML libraries and packages like scikit-learn, NumPy, Pandas, TensorFlow, Matplotlib, Caffe, etc.
  2. Deep learning frameworks: PyTorch, spaCy, Keras
  3. Deep learning architectures: LSTM, CNN, self-attention, and Transformers
  4. Experience in working with image processing and computer vision is a must
  5. Designing data science applications: Large Language Models (LLMs), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, and prompt engineering
  6. Programming experience in Python
  7. Strong written and verbal communication skills
  8. Excellent interpersonal and collaboration skills.

Role descriptions / Expectations from the Role

Design and implement scalable and efficient data architectures to support generative AI workflows.

Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.

Apply prompt engineering techniques as required by the use case.

Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.

Lead junior data engineers on tasks such as data pipeline design, dataset creation, and deployment, applying data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case (a small scikit-learn baseline sketch follows below).
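For the classical machine learning and NLP portion of this role, a typical baseline looks like the sketch below (TF-IDF plus logistic regression); the tiny inline dataset is invented purely for illustration.

```python
# Sketch of a classical NLP baseline: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "transaction failed and support never replied",
    "payment declined twice, very frustrating",
    "quick resolution, great support experience",
    "smooth checkout and fast delivery",
]
labels = ["negative", "negative", "positive", "positive"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["the app crashed during payment"]))  # expected: ['negative']
```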


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
databricks
SQL
skill iconPython

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • BigQuery
  • Experience with performance tuning and data governance.


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Hyderabad
3 - 10 yrs
₹6L - ₹25L / yr
Gen AI
NLP
skill iconPython
TensorFlow
skill iconMachine Learning (ML)
+4 more

Desired Competencies (Technical/Behavioral Competency)

Must-Have

  1. Hands-on knowledge of machine learning, deep learning, TensorFlow, Python, and NLP.
  2. Stay up to date on the latest AI developments relevant to the business domain.
  3. Conduct research and development processes for AI strategies.
  4. Experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAEs, and GANs.
  5. Experience with transformer models such as BERT, GPT, and RoBERTa, and a solid understanding of their underlying principles is a plus.

Good-to-Have

  1. Have knowledge of software development methodologies, such as Agile or Scrum
  2. Have strong communication skills, with the ability to effectively convey complex technical concepts to a diverse audience.
  3. Have experience with natural language processing (NLP) techniques and tools, such as SpaCy, NLTK, or Hugging Face
  4. Ensure the quality of code and applications through testing, peer review, and code analysis.
  5. Root-cause analysis and bug fixing.
  6. Familiarity with version control systems, preferably Git.
  7. Experience with building or maintaining cloud-native applications.
  8. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Hyderabad, Bengaluru (Bangalore)
3 - 8 yrs
₹7L - ₹24L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconPython
Natural Language Processing (NLP)
NumPy
+1 more

Design and implement scalable and efficient data architectures to support generative AI workflows.

Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.

Apply prompt engineering techniques as required by the use case.

Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.

Lead junior data engineers on tasks such as data pipeline design, dataset creation, and deployment, applying data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.

Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, PAn india
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
skill iconJava
skill iconPython
skill iconJavascript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security.

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium, Appium, or Cucumber.

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.

Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Chennai, Kochi (Cochin)
6 - 9 yrs
₹7L - ₹15L / yr
skill iconAmazon Web Services (AWS)
sagemaker
skill iconMachine Learning (ML)
skill iconDocker
skill iconPython
  • Design, develop, and maintain data pipelines and ETL workflows on AWS platform
  • Work with AWS services like S3, Glue, Lambda, Redshift, EMR, and Athena for data ingestion, transformation, and analytics
  • Collaborate with Data Scientists, Analysts, and Business teams to understand data requirements
  • Optimize data workflows for performance, scalability, and reliability
  • Troubleshoot data issues, monitor jobs, and ensure data quality and integrity
  • Write efficient SQL queries and automate data processing tasks
  • Implement data security and compliance best practices
  • Maintain technical documentation and data pipeline monitoring dashboards
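As an example of the Athena-based analytics mentioned above, here is a short boto3 sketch; the database, table, and output bucket are illustrative placeholders.

```python
# Sketch of running an Athena query over S3 data with boto3 and polling for completion.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_query(sql: str, database: str, output: str) -> list:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

rows = run_query(
    "SELECT region, count(*) AS orders FROM daily_orders GROUP BY region",
    database="analytics",
    output="s3://query-results-bucket/athena/",
)
print(rows)
```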
Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
skill iconAmazon Web Services (AWS)
Amazon Redshift
AWS Glue
skill iconPython
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Alpha

at Alpha

2 candid answers
Yash Makhecha
Posted by Yash Makhecha
Remote, Bengaluru (Bangalore)
1 - 6 yrs
₹4L - ₹12L / yr
skill iconPython
skill iconNodeJS (Node.js)
skill iconReact.js
TypeScript
skill iconDocker
+10 more

Full Stack Engineer

Location: Remote (India preferred) · Type: Full-time · Comp: Competitive salary + early-stage stock



About Alpha

Alpha is building the simplest way for anyone to create AI agents that actually get work done. Our platform turns messy prompt chaining, data schemas, and multi-tool logic into a clean, no-code experience. We’re backed, funded, and racing toward our v1 launch. Join us on the ground floor and shape the architecture, the product, and the culture.



The Role

We’re hiring two versatile full-stack engineers. One will lean infra/back-end, the other front-end/LLM integration, but both will ship vertical slices end-to-end.


You will:

  • Design and build the agent-execution runtime (LLMs, tools, schemas).
  • Stand up secure VPC deployments with Docker, Terraform, and AWS or GCP.
  • Build REST/GraphQL APIs, queues, Postgres/Redis layers, and observability.
  • Create a React/Next.js visual workflow editor with drag-and-drop blocks.
  • Build the Prompt Composer UI, live testing mode, and cost dashboard.
  • Integrate native tools: search, browser, CRM, payments, messaging, and more.
  • Ship fast—design, code, test, launch—and own quality (no separate QA team).
  • Talk to early users and fold feedback into weekly releases.
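To give a toy illustration of the "agent-execution runtime (LLMs, tools, schemas)" idea above: a registry of typed tools plus a dispatcher that executes a planned step. In the real platform the plan would come from an LLM; here it is hard-coded, and the tools are stubs.

```python
# Toy agent runtime sketch: tool registry + step dispatcher.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[dict], dict]

def search_tool(args: dict) -> dict:
    return {"results": [f"stub result for '{args['query']}'"]}

def calculator_tool(args: dict) -> dict:
    return {"value": args["a"] + args["b"]}

REGISTRY: Dict[str, Tool] = {
    "search": Tool("search", "Search the web", search_tool),
    "add": Tool("add", "Add two numbers", calculator_tool),
}

def execute_step(step: dict) -> dict:
    tool = REGISTRY[step["tool"]]          # step schema: {"tool": name, "args": {...}}
    return tool.run(step["args"])

plan = [
    {"tool": "search", "args": {"query": "latest GST rules"}},
    {"tool": "add", "args": {"a": 2, "b": 3}},
]
for step in plan:
    print(step["tool"], "->", execute_step(step))
```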



What We’re Looking For


  • 3–6 years building production web apps at startup pace.
  • Strong TypeScript + Node.js or Python.
  • Solid React/Next.js and modern state management.
  • Comfort with AWS or GCP, Docker, and CI/CD.
  • Bias for ownership from design to deploy.


Nice but not required: Terraform or CDK, IAM/VPC networking, vector DBs or RAG pipelines, LLM API experience, React-Flow or other canvas libs, GraphQL or event streaming, prior dev-platform work.


We don’t expect every box ticked—show us you learn fast and ship.



What You’ll Get


• Meaningful equity at the earliest stage.

• A green-field codebase you can architect the right way.

• Direct access to the founder—instant decisions, no red tape.

• Real customers from day one; your code goes live, not to backlog.

• Stipend for hardware, LLM credits, and professional growth.



Come build the future of work—where AI agents handle the busywork and people do the thinking.

Read more
Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹30L / yr
cicd
skill iconC
skill iconPython
skill iconJenkins
skill iconGitHub
+4 more

Role Summary :

We are seeking a skilled and detail-oriented SRE Release Engineer to lead and streamline the CI/CD pipeline for our C and Python codebase. You will be responsible for coordinating, automating, and validating biweekly production releases, ensuring operational stability, high deployment velocity, and system reliability.


Key Responsibilities :

● Own the release process: Plan, coordinate, and execute biweekly software releases across multiple services.

● Automate release pipelines: Build and maintain CI/CD workflows using tools such as GitHub Actions, Jenkins, or GitLab CI.

● Version control: Manage and enforce Git best practices, branching strategies (e.g., Git Flow), tagging, and release versioning.

● Integrate testing frameworks: Ensure automated test coverage (unit, integration, regression) is enforced pre-release.

● Release validation: Develop pre-release verification tools/scripts to validate build integrity and backward compatibility.

● Deployment strategy: Implement and refine blue/green, rolling, or canary deployments in staging and production environments.

● Incident readiness: Partner with SREs to ensure rollback strategies, monitoring, and alerting are release-aware.

● Collaboration: Work closely with developers, QA, and product teams to align on release timelines and feature readiness. 
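As a flavour of the release tooling this role owns, below is a small, hypothetical pre-release gate script; the tag format and test command are assumptions, not this team's actual process.

```python
# Pre-release helper sketch: verify a clean working tree, run tests, then create and
# push an annotated release tag.
import subprocess
import sys

def run(cmd: list[str]) -> subprocess.CompletedProcess:
    return subprocess.run(cmd, capture_output=True, text=True)

def release(version: str) -> None:
    if run(["git", "status", "--porcelain"]).stdout.strip():
        sys.exit("Working tree is dirty; commit or stash before releasing.")
    if run(["python", "-m", "pytest", "-q"]).returncode != 0:
        sys.exit("Tests failed; aborting release.")
    tag = f"v{version}"
    if run(["git", "tag", "-a", tag, "-m", f"Release {tag}"]).returncode != 0:
        sys.exit(f"Could not create tag {tag} (does it already exist?).")
    run(["git", "push", "origin", tag])
    print(f"Tagged and pushed {tag}")

if __name__ == "__main__":
    release(sys.argv[1] if len(sys.argv) > 1 else "0.0.0")
```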


Required Qualifications

● Bachelor’s degree in Computer Science, Engineering, or a related field.

● 3+ years in SRE, DevOps, or release engineering roles.

● Proficiency in CI/CD tooling (e.g., GitHub Actions, Jenkins, GitLab).

● Experience automating deployments for C and Python applications.

● Strong understanding of Git version control, merge/rebase strategies, tagging, and submodules (if used).

● Familiarity with containerization (Docker) and deployment orchestration (e.g., Kubernetes, Ansible, or Terraform).

● Solid scripting experience (Python, Bash, or similar).

● Understanding of observability, monitoring, and incident response tooling (e.g., Prometheus, Grafana, ELK, Sentry).


Preferred Skills

● Experience with release coordination in data networking environments.

● Familiarity with build tools like Make, CMake, or Bazel.

● Exposure to artifact management systems (e.g., Artifactory, Nexus).

● Experience deploying to Linux production systems with service uptime guarantees.  

Read more
Draup
suchanya p
Posted by suchanya p
Bengaluru (Bangalore)
1 - 4 yrs
₹10L - ₹25L / yr
skill iconPython
skill iconJava
SQL

We are looking for:

• 2+ years of expertise in software development with one or more of the general programming languages (e.g., Python, Java, C/C++, Go). Experience in Python and Django is recommended.

• Deep understanding of how to build an application with optimized RESTful APIs.

• Knowledge of a web framework like Django (or similar) with an ORM, or experience with multi-tier, multi-DB, data-heavy web application development, will help your profile stand out.

• Knowledge of Gen AI tools and technologies is a plus.

• Sound knowledge of SQL queries and databases like PostgreSQL (must) or MySQL. Working knowledge of NoSQL DBs (Elasticsearch, Mongo, Redis, etc.) is a plus.

• Knowledge of graph DB like Neo4j or AWS Neptune adds extra credits to your profile.

• Knowledge of queue-based messaging frameworks like Celery, RQ, or Kafka, and an understanding of distributed systems, will be advantageous (see the Celery sketch after this list).

• Understands a programming language's limitations and how to exploit its behavior to the fullest potential.

• Understanding of accessibility and security compliance.

• Ability to communicate complex technical concepts to both technical and non-technical audiences with ease.

• Diversity in skills like version control tools, CI/CD, cloud basics, good debugging skills, and test-driven development will help your profile stand out. 
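For the queue-based background work mentioned above, here is a minimal Celery sketch with a Redis broker; the broker URL and task body are illustrative assumptions.

```python
# Sketch of a Celery background task moved off the request/response cycle.
from celery import Celery

app = Celery("jobs", broker="redis://localhost:6379/0", backend="redis://localhost:6379/1")

@app.task(bind=True, max_retries=3)
def enrich_company_profile(self, company_id: int) -> dict:
    """Long-running enrichment executed by a worker process."""
    try:
        # ... fetch, transform, and persist data for `company_id` here ...
        return {"company_id": company_id, "status": "enriched"}
    except Exception as exc:
        raise self.retry(exc=exc, countdown=30)

# From a Django/DRF view this would be queued with:
#   enrich_company_profile.delay(company_id=42)
```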

Read more
Draup
Bengaluru (Bangalore)
3 - 4 yrs
₹5L - ₹15L / yr
skill iconPython
skill iconJava
API
SQL

Job Description

• Role: Quality Assurance Engineer – Automation (3–4 yrs)

• Location: Bengaluru

• Type: Full-time

Why this role? Join a fast-moving team that’s pushing test automation into the AI era. You’ll own end-to-end quality for web, mobile and API layers, combining Playwright (or similar) with next-gen, AI-driven test platforms to deliver smarter, faster releases.

What you’ll do

• Build & maintain automation with Playwright, Selenium, Cypress or equivalent

• Super-charge coverage using AI-powered tools

• Create, run and optimize manual, API (Postman/Rest Assured) and database (SQL) tests

• Triage results, file defects in Jira, and drive them to closure

What you bring

• 3–4 years’ hands-on automation experience

• Strong with Playwright (preferred) or Selenium/Cypress and one scripting language (JS/TS, Python or Java)

• Familiarity with AI-based testing platforms

• Solid API testing & SQL skills; sound grasp of STLC and defect management

• Clear communicator with sharp analytical instincts

• Nice to have: BDD (Cucumber/SpecFlow), performance testing (JMeter/LoadRunner), TestRail/Zephyr, ML model validation

Qualifications

• Bachelor’s in Computer Science, Engineering or related field

What’s in it for you?

• Hands-on exposure to cutting-edge AI test automation

• Ownership and room to innovate in a collaborative, high-impact environment

• Competitive pay, flexible policies and a fun team
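For candidates weighing the Playwright requirement, here is a small Playwright (Python) sketch; the URL and selectors are illustrative placeholders, and it assumes `pip install playwright` followed by `playwright install` for browser binaries.

```python
# Sketch of a headless Playwright UI check: load a login page, fill the form, assert a heading.
from playwright.sync_api import sync_playwright, expect

def test_login_page_loads() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com/login")        # placeholder URL
        page.fill("#email", "qa@example.com")         # placeholder selectors
        page.fill("#password", "not-a-real-password")
        page.click("button[type=submit]")
        expect(page.locator("h1")).to_be_visible()
        browser.close()

if __name__ == "__main__":
    test_login_page_loads()
```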

Read more