Python Jobs in Bangalore (Bengaluru)


Apply to 50+ Python Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

HeyCoach
Posted by DeepanRaj R
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹15L / yr
Python
Python core
backend
Machine Learning (ML)
API
+3 more

SDE II – Backend (Voice AI Platform) | Nexa

Location: HSR Layout, Bangalore - WFO

Type: Full-time

Experience: 2+ years (preferably in early-stage startups)

Tech Stack: Python (core)


About Nexa

Nexa is a new venture by the founders of HeyCoach—Pratik Kapasi and Aditya Kamat—on a mission to build the most intuitive voice-first AI platform.

We're rethinking how humans interact with machines through natural, intelligent, and fast conversational interfaces.

We're looking for a Software Development Engineer II to join us at the ground level. This is a high-ownership, high-velocity role for builders who want to move fast and go deep.


What You’ll Do

  • Design, build, and scale backend systems for our voice AI engine
  • Work primarily with Python for core logic, pipelines, and model integration
  • Lead projects end-to-end—from whiteboard to production deployment
  • Optimize systems for performance, scale, and real-time processing
  • Collaborate closely with founders, ML engineers, and designers to rapidly prototype and ship features
  • Set engineering best practices, own code quality, and mentor junior engineers as the team grows
  • Optionally contribute to full-stack development using Node.js and React.js


Must-Have Skills


  • 2+ years of experience with Python, building scalable production systems
  • Proven experience independently leading projects from design through deployment
  • Strong foundation in system design, algorithms, and data structures
  • Deep understanding of backend architecture—APIs, microservices, data flows
  • Proven ability to debug and optimize complex systems
  • High autonomy—can break down big problems, prioritize, and deliver without hand-holding
  • Prior success in early-stage startup environments, especially during 0→1 phases


Nice-to-Have

  • Exposure to Node.js and React.js (Preferred but not mandatory)
  • Experience with NLP, speech interfaces, or audio processing
  • Familiarity with cloud platforms (GCP/AWS), CI/CD, Docker, Kubernetes
  • Contributions to open-source projects or technical writing
  • Prior experience integrating ML models into production systems


What We Value

  • Speed > Perfection: We ship early and iterate fast
  • Ownership mindset: You think and act like a founder
  • Technical depth: You’ve built things from scratch and understand how they work
  • Product intuition: You care about solving user problems, not just writing code
  • Startup muscle: You’re scrappy, resourceful, and thrive without process overhead
  • Bias for action: You unblock yourself and others—fast
  • Humility and curiosity: You challenge ideas, accept better ones, and never stop learning


Why Join Nexa?


  • Work directly with the founders on groundbreaking voice AI products
  • Be part of the core team shaping the product and technology from day one
  • High-trust environment focused on impact—not hours
  • Flexible work style and a fast, flat, and collaborative culture


Wissen Technology
Posted by Poornima Varadarajan
Bengaluru (Bangalore), Mumbai
5 - 7 yrs
Best in industry
API
Java
Banking
Python
API QA

  • Design, develop and maintain robust test automation frameworks for financial applications
  • Create detailed test plans, test cases, and test scripts based on business requirements and user stories
  • Execute functional, regression, integration, and API testing with a focus on financial data integrity
  • Validate complex financial calculations, transaction processing, and reporting functionalities
  • Collaborate with Business Analysts and development teams to understand requirements and ensure complete test coverage
  • Implement automated testing solutions within CI/CD pipelines for continuous delivery
  • Perform data validation testing against financial databases and data warehouses
  • Identify, document, and track defects through resolution using defect management tools
  • Verify compliance with financial regulations and industry standards

KJBN labs
Posted by sakthi ganesh
Bengaluru (Bangalore)
3 - 6 yrs
₹6L - ₹11L / yr
Python
PostgreSQL
MySQL
Django
Amazon Web Services (AWS)
+3 more

Senior Software Engineer - Backend


A Senior Software Backend Engineer is responsible for designing, building, and maintaining the server-side logic and infrastructure of web applications or software systems. They typically work closely with frontend engineers, DevOps teams, and other stakeholders to ensure that the back-end services perform optimally and meet business requirements. Below is an outline of a typical Senior Backend Developer job profile:


Key Responsibilities:

1. System Architecture & Design:

- Design scalable, high-performance backend services and APIs.

- Participate in the planning, design, and development of new features.

- Ensure that systems are designed with fault tolerance, security, and scalability in mind.

2. Development & Implementation:

- Write clean, maintainable, and efficient code.

- Implement server-side logic, databases, and data storage solutions.

- Work with technologies like REST, GraphQL, and other backend communication methods.

- Design and optimize database schemas, queries, and indexes.

3. Performance Optimization:

- Diagnose and fix performance bottlenecks.

- Optimize backend processes and database queries for speed and efficiency.

- Implement caching strategies and load balancing.

4. Security:

- Ensure the security of the backend systems by implementing secure coding practices.

- Protect against common security threats such as SQL injection, cross-site scripting (XSS), and others.

5. Collaboration & Leadership:

- Collaborate with frontend teams, product managers, and DevOps engineers.

- Mentor junior developers and guide them in best practices.

- Participate in code reviews and ensure that the development team follows consistent coding standards.

6. Testing & Debugging:

- Develop and run unit, integration, and performance tests to ensure code quality.

- Troubleshoot, debug, and upgrade existing systems.

7. Monitoring & Maintenance:

- Monitor system performance and take preventive measures to ensure uptime and reliability.

- Maintain technical documentation for reference and reporting.

- Stay updated on emerging technologies and incorporate them into the backend tech stack.


Required Skills:

1. Programming Languages:

- Expertise in one or more backend programming languages: Python, Java, Go, or Rust.

2. Database Management:

- Strong understanding of both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Redis).

- Knowledge of data modeling, query optimization, and database scaling strategies.

3. API Design & Development:

- Proficiency in designing and implementing RESTful and GraphQL APIs.

- Experience with microservices architecture.

- Good understanding of containers

4. Cloud & DevOps:

- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.

- Understanding of DevOps principles, CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).

5. Version Control:

- Proficiency with Git and branching strategies.

6. Testing & Debugging Tools:

- Familiarity with testing frameworks, debugging tools, and performance profiling.

7. Soft Skills:

- Strong problem-solving skills.

- Excellent communication and teamwork abilities.

- Leadership and mentorship qualities.


Qualifications:

- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or related field.

- 5+ years of experience in backend development or software engineering.

- Proven experience with system design, architecture, and high-scale application development.


Preferred Qualifications:

- Experience with distributed systems, event-driven architectures, and asynchronous processing.

- Familiarity with message queues (e.g., RabbitMQ, Kafka) and caching layers (e.g., Redis, Memcached).

- Knowledge of infrastructure as code (IaC) tools like Terraform or Ansible.


Tools & Technologies:

- Languages: Python, Java, Golang, Rust.

- Databases: PostgreSQL, MySQL, MongoDB, Redis, Cassandra.

- Frameworks: Django, Flask, Spring Boot, Go Micro.

- Cloud Providers: AWS, Azure, Google Cloud.

- Containerization: Docker, Kubernetes.

- CI/CD: Jenkins, GitLab CI, CircleCI.

This job profile will vary depending on the company and industry, but the core principles of designing, developing, and maintaining back-end systems remain the same.

eazeebox
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2+ yrs
Up to ₹15L / yr (varies)
Python
React Native
SQL
NoSQL Databases
Amazon Web Services (AWS)

About Eazeebox

Eazeebox is India’s first specialized B2B platform for home electrical goods. We simplify supply chain logistics and empower electrical retailers through our one-stop digital platform — offering access to 100+ brands across 15+ categories, no MOQs, flexible credit options, and 4-hour delivery. We’re on a mission to bring technological inclusion to India's massive electrical retail industry.


Role Overview

We’re looking for a hands-on Full Stack Engineer who can build scalable backend systems using Python and mobile applications using React Native. You’ll work directly with the founder and a lean engineering team to architect and deliver core modules across our Quick Commerce stack – including retailer apps, driver apps, order management systems, and more.


What You’ll Do

  • Develop and maintain backend services using Python
  • Build and ship high-performance React Native apps for Android and iOS
  • Collaborate on API design, microservices, and systems integration
  • Ensure performance, reliability, and scalability across the stack
  • Contribute to decisions on re-engineering, tech stack, and infra setup
  • Work closely with the founder and product team to own end-to-end delivery
  • Participate in collaborative working sessions and pair programming when needed


What We’re Looking For

  • Strong proficiency in Python for backend development
  • Experience building mobile apps with React Native
  • Solid understanding of microservices architecture, API layers, and shared data models
  • Familiarity with AWS or equivalent cloud platforms
  • Exposure to Docker, Kubernetes, and CI/CD pipelines
  • Ability to thrive in a fast-paced, high-ownership environment


Good-to-Have (Bonus Points)

  • Experience working in Quick Commerce, logistics, or consumer apps
  • Knowledge of PIM (Product Information Management) systems
  • Understanding of key commerce algorithms (search, ranking, filtering, order management)
  • Ability to use AI-assisted coding tools to speed up development


Why Join Us

  • Build from scratch, not maintain legacy
  • Work directly with the founder and influence tech decisions
  • Shape meaningful digital infrastructure for a $35B+ industry
  • Backed by revenue – 3 years of market traction and growing fast
Peliqan
Posted by Bharath Kumar
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹12L / yr
Python
SQL
API


About the Role


We are looking for a Python Developer with expertise in data synchronization (ETL & Reverse ETL), automation workflows, AI functionality, and connectivity to work directly with a customer in Peliqan. In this role, you will be responsible for building seamless integrations, enabling AI-driven functionality, and ensuring data flows smoothly across various systems.

Key Responsibilities

  • Build and maintain data sync pipelines (ETL & Reverse ETL) to ensure seamless data transfer between platforms.
  • Develop automation workflows to streamline processes and improve operational efficiency.
  • Implement AI-driven functionality, including AI-powered analytics, automation, and decision-making capabilities.
  • Build and enhance connectivity between different data sources, APIs, and enterprise applications.
  • Work closely with the customer to understand their technical needs and design tailored solutions in Peliqan.
  • Optimize performance of data integrations and troubleshoot issues as they arise.
  • Ensure security and compliance in data handling and integrations.

Requirements

  • Strong experience in Python and related libraries for data processing & automation.
  • Expertise in ETL, Reverse ETL, and workflow automation tools.
  • Experience working with APIs, data connectors, and integrations across various platforms.
  • Familiarity with AI & machine learning concepts and their practical application in automation.
  • Hands-on experience with Peliqan or similar integration/data automation platforms is a plus.
  • Strong problem-solving skills and the ability to work directly with customers to define and implement solutions.
  • Excellent communication and collaboration skills.

Preferred Qualifications

  • Experience in SQL, NoSQL databases, and cloud platforms (AWS, GCP, Azure).
  • Knowledge of data governance, security best practices, and performance optimization.
  • Prior experience in customer-facing engineering roles.

If you’re a Python & Integration Engineer who loves working on cutting-edge AI, automation, and data connectivity projects, we’d love to hear from you!


Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Mumbai, Hyderabad, Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹6L - ₹25L / yr
Python
Django
NumPy
Flask
pandas
+1 more

Python Developer Job Description

A Python Developer is responsible for designing, developing, and deploying software applications using the Python programming language. Here's a brief overview:


Key Responsibilities

- Software Development: Develop high-quality software applications using Python.

- Problem-Solving: Solve complex problems using Python programming language.

- Code Maintenance: Maintain and update existing codebases to ensure they remain efficient and scalable.

- Collaboration: Collaborate with cross-functional teams to identify and prioritize project requirements.

- Testing and Debugging: Write unit tests and debug applications to ensure high-quality code.


Technical Skills

- Python: Strong understanding of Python programming language and its ecosystem.

- Programming Fundamentals: Knowledge of programming fundamentals, including data structures, algorithms, and object-oriented programming.

- Frameworks and Libraries: Familiarity with popular Python frameworks and libraries, such as Django, Flask, or Pandas.

- Database Management: Understanding of database management systems, including relational databases and NoSQL databases.

- Version Control: Knowledge of version control systems, including Git.


Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Chennai, Kochi (Cochin), Bengaluru (Bangalore), Kolkata, Thiruvananthapuram
4 - 8 yrs
₹5L - ₹20L / yr
Machine Learning (ML)
Python
MLOps

Machine Learning (ML) / MLOps Engineer Job Description

An ML/MLOps Engineer is responsible for designing, developing, and deploying machine learning models and pipelines. Here's a brief overview:


Key Responsibilities

- Model Development: Design and develop machine learning models using various algorithms and techniques.

- MLOps: Implement and manage machine learning pipelines, including data preparation, model training, and deployment.

- Model Deployment: Deploy machine learning models to production environments, ensuring scalability and reliability.

- Model Monitoring: Monitor model performance and retrain models as needed to maintain accuracy and relevance.

- Collaboration: Collaborate with cross-functional teams, including data scientists, product managers, and engineers.


Technical Skills

- Machine Learning: Strong understanding of machine learning concepts, including supervised and unsupervised learning, deep learning, and reinforcement learning.

- Programming: Proficiency in programming languages like Python, R, or Julia.

- ML Frameworks: Experience with machine learning frameworks like TensorFlow, PyTorch, or scikit-learn.

- MLOps Tools: Familiarity with MLOps tools like TensorFlow Extended (TFX), MLflow, or Kubeflow.

- Cloud Platforms: Experience with cloud platforms like AWS, Azure, or GCP.

Coimbatore, Bengaluru (Bangalore), Mumbai
1 - 4 yrs
₹3.4L - ₹5L / yr
Python
JavaScript
Java
HTML/CSS
Big Data
+2 more

The Assistant Professor in CSE will teach undergraduate and graduate courses, conduct independent and collaborative research, mentor students, and contribute to departmental and institutional service.

Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Pune, Chennai
0 - 1 yrs
₹10L - ₹20L / yr
Python
Object Oriented Programming (OOPs)
JavaScript
Java
Data Structures
+1 more


About NxtWave


NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.

Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.

Know more:

🌐 NxtWave | NIAT

About the Role

As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.


Key Responsibilities

  • Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
  • Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
  • Mentor students in academic, career, and project development goals.
  • Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
  • Drive research-led content development, and contribute to innovation in teaching methodologies.
  • Support capstone projects, hackathons, and collaborative research opportunities with industry.
  • Foster a high-performance learning environment in classes of 70–100 students.
  • Collaborate with cross-functional teams for continuous student development and program quality.
  • Actively participate in faculty training, peer reviews, and academic audits.


Eligibility & Requirements

  • Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
  • Strong academic and research orientation, preferably with publications or project contributions.
  • Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
  • A deep commitment to education, student success, and continuous improvement.

Must-Have Skills

  • Expertise in Python, Java, JavaScript, and advanced programming paradigms.
  • Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
  • Excellent communication, classroom delivery, and presentation skills.
  • Familiarity with academic content tools like Google Slides, Sheets, Docs.
  • Passion for educating, mentoring, and shaping future developers.

Good to Have

  • Industry experience or consulting background in software development or research-based roles.
  • Proficiency in version control systems (e.g., Git) and agile methodologies.
  • Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
  • A drive to innovate in teaching, curriculum design, and student engagement.

Why Join Us?

  • Be at the forefront of shaping India’s tech education revolution.
  • Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
  • Competitive compensation with strong growth potential.
  • Create impact at scale by mentoring hundreds of future-ready tech leaders.


Engineering and technology company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
12 - 15 yrs
₹20L - ₹25L / yr
DevOps
Android.mk
Gradle
CI/CD
Jenkins
+5 more

Job Overview

  • Required experience: 8 – 12 years in DevOps/system debugging.
  • Develop and maintain an automated infrastructure for continuous integration and deployment (CI/CD).
  • Experience creating automated CI/CD pipelines using tools like GitLab.
  • Demonstrated capability with CI/CD tools such as Jenkins, Git/Gerrit, and JFrog (Artifactory, Xray, Pipelines).
  • Strong development expertise in Python and Linux scripting languages.
  • Strong knowledge of UNIX and Linux.
  • Knowledge of the Android build system (Android.mk, Android.bp, Gradle).
  • Unit testing/integration testing and code-coverage tools.
  • Knowledge of deploying containers using containerization tools like Docker.
  • Excellent problem-solving and debugging skills; able to take ownership of the CI/CD configuration.
  • Eliminate variation by working with global engineering teams to define and implement common processes and configurations that work for all projects.
  • Maintain and update current scripts/tools to support evolving software.
  • Good team player; follows agile development methodologies and ASPICE practices as part of the software development lifecycle.
  • Good understanding of quality control and test automation in an Agile-based continuous integration environment.
Aeries Technology
Posted by Nikita Sinha
Bengaluru (Bangalore)
7 - 12 yrs
Up to ₹42L / yr (varies)
DevOps
Java
Python
Groovy
C#

This role is part of the Quickbase Center of Excellence, a global initiative operated in partnership with Aeries, and offers an exciting opportunity to work on cutting-edge DevOps technologies with strong collaboration across teams in the US, Bulgaria, and India.

Key Responsibilities

  • Build and manage CI/CD pipelines across environments
  • Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC)
  • Develop internal tools and scripts to boost developer productivity
  • Set up and maintain monitoring, alerting, and performance dashboards
  • Collaborate with cross-functional engineering teams to ensure infrastructure scalability and security
  • Contribute to the DevOps Community of Practice by sharing best practices and tools
  • Continuously evaluate and integrate new technologies and DevOps trends

Skills & Experience Required

  • Strong scripting experience: Bash, PowerShell, Python, or Groovy
  • Hands-on with containerization tools like Docker and Kubernetes
  • Proficiency in Infrastructure as Code: Terraform, CloudFormation, or Azure Templates
  • Experience with CI/CD tools such as Jenkins, TeamCity, GitHub Actions, or CircleCI
  • Exposure to Serverless computing (AWS Lambda or Google App Engine)
  • Cloud experience with AWS, GCP, or Azure
  • Solid understanding of networking concepts: DNS, DHCP, SSL, subnets
  • Experience with monitoring tools and alerting platforms
  • Basic understanding of security principles and best practices
  • Prior experience working directly with software engineering teams

Preferred Qualifications

  • Bachelor’s degree in Computer Science or related discipline
  • Strong communication skills (verbal & written)
  • Ability to work effectively in a distributed, high-performance team
  • Passion for DevOps best practices and a continuous learning mindset
  • Customer-obsessed and committed to improving engineering efficiency

Why Join Us?

  • Quickbase Center of Excellence: Purpose-built team delivering excellence from Bangalore
  • Fast-Growing Environment: Be part of a growing company with strong career advancement
  • Innovative Tech Stack: Exposure to cutting-edge tech in cloud, AI, and DevOps tooling
  • Inclusive Culture: ERGs and leadership development programs to support growth
  • Global Collaboration: Work closely with teams across the US, Bulgaria, and India

About Quickbase

Quickbase is a leading no-code platform that empowers organizations to create enterprise applications without writing code. Founded in 1999 and trusted by over 6,000 customers, Quickbase helps companies connect data, streamline workflows, and achieve real-time insights.

Learn more: https://www.quickbase.com

KJBN labs
Posted by sakthi ganesh
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Hadoop
Apache Kafka
Spark
Redshift
Python
+9 more

Senior Data Engineer Job Description

Overview

The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.

Key Responsibilities


  • Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
  • Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
  • Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
  • Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
  • Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
  • Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
  • Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
  • Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
  • Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.

Required Qualifications

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
  • Technical Skills:
    - Proficiency in programming languages such as Python, Java, or Scala.
    - Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
    - Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
    - Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
    - Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
    - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
  • Soft Skills:
    - Excellent problem-solving and analytical skills.
    - Strong communication and collaboration abilities.
    - Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
  • Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.

Preferred Qualifications

  • Experience with real-time data processing and streaming architectures.
  • Familiarity with machine learning pipelines and MLOps practices.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
  • Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.

Work Environment

  • Location: Hybrid/Remote/On-site (depending on company policy).
  • Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
  • Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Wissen Technology
Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the on-shore and the offshore team.
  • Work on multiple reports validation against the databases by running medium to complex SQL queries.
  • Good understanding of automation objects and integrations across various platforms/applications.
  • Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build and test cycle using CICD tools.
  • Comfortable working on Linux/Windows environment(s) and Hybrid infrastructure models hosted on Cloud platforms.
  • Establish processes and tools set to maintain automation scripts and generate regular test reports.
  • Conduct peer reviews to provide feedback and to make sure the test scripts are flawless.

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably Databricks, paired with Python experience.
  • Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
  • Clear & crisp communication and commitment towards deliverables
  • Experience on BigData Testing will be an added advantage.
  • Knowledge on Spark and Scala, Hive/Impala, Python will be an added advantage.

Good to have skills:

  • Test automation using BDD/Cucumber/TestNG, combined with strong hands-on experience in Java with Selenium, especially working experience with WebDriver.IO
  • Ability to effectively articulate technical challenges and solutions
  • Work experience in qTest, Jira, WebDriver.IO


Robylon AI
Posted by Listings Robylon
Bengaluru (Bangalore)
0 - 2 yrs
₹5L - ₹6L / yr
Python
Generative AI
Prompt engineering

Role Overview

This is a 20% technical, 80% non-technical role designed for individuals who can blend technical know-how with strong operational and communication skills. You’ll be the bridge between our product and the client’s operations team.


Key Responsibilities


  • Collaborate with clients to co-design SOPs for resolving support queries across channels (chat, ticket, voice)
  • Scope and plan each integration: gather technical and operational requirements and convert them into an executable timeline with measurable success metrics (e.g., coverage %, accuracy, CSAT)
  • Lead integration rollouts and post-launch success loops: monitor performance, debug issues, fine-tune prompts and workflows
  • Conduct quarterly “AI health-checks” and continuously improve system effectiveness
  • Troubleshoot production issues, replicate bugs, ship patches, and write clear root-cause analyses (RCAs)
  • Act as the customer’s voice internally, channel key insights to product and engineering teams


Must-Have Qualifications


  • Engineering degree is a must; Computer Science preferred
  • Past experience in coding and a sound understanding of APIs is preferred
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • Experience working in SaaS, customer success, implementation, or operations roles
  • Analytical mindset with the ability to make data-driven decisions



Wissen Technology
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
Python
PySpark
Django
Flask
RESTful APIs
+3 more

Job title – Python Developer

Experience – 4 to 6 years

Location – Pune/Mumbai/Bengaluru


Job description below:

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP Preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.


IT Company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore), Hyderabad, Pune
4 - 8 yrs
₹20L - ₹25L / yr
Big Data
Amazon Web Services (AWS)
IaaS
Platform as a Service (PaaS)
VMS
+8 more

Job description

 

Job Title: Cloud Migration Consultant – (AWS to Azure)

 


Experience: 4+ years in application assessment and migration

 

About the Role

 

We’re looking for a Cloud Migration Consultant with hands-on experience assessing and migrating complex applications to Azure. You'll work closely with Microsoft business units, participating in Intake & Assessment and Planning & Design phases, creating migration artifacts, and leading client interactions. You’ll also support application modernization efforts in Azure, with exposure to AWS as needed.

 

Key Responsibilities

 

  • Assess application readiness and document architecture, dependencies, and migration strategy.
  • Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, CloudockIt, and PowerShell.
  • Create architecture diagrams and migration playbooks, and maintain Azure DevOps boards.
  • Set up applications both on-premises and in cloud environments (primarily Azure).
  • Support proof-of-concepts (PoCs) and advise on migration options.
  • Collaborate with application, database, and infrastructure teams to enable a smooth transition to migration factory teams.
  • Track progress, blockers, and risks, reporting timely status to project leadership.


Required Skills

 

  • 4+ years of experience in cloud migration and assessment
  • Strong expertise in Azure IaaS/PaaS (VMs, App Services, ADF, etc.)
  • Familiarity with AWS IaaS/PaaS (EC2, RDS, Glue, S3)
  • Experience with Java (Spring Boot)/C#, .NET/Python, Angular/React.js, REST APIs
  • Working knowledge of Kafka, Docker/Kubernetes, Azure DevOps
  • Network infrastructure understanding (VNets, NSGs, Firewalls, WAFs)
  • IAM knowledge: OAuth, SAML, Okta/SiteMinder
  • Experience with Big Data tools like Databricks, Hadoop, Oracle, DocumentDB


Preferred Qualifications

 

  • Azure or AWS certifications
  •  Prior experience with enterprise cloud migrations (especially in Microsoft ecosystem)
  •  Excellent communication and stakeholder management skills


Educational qualification:

 

B.E/B.Tech/MCA

 

Experience :

 

4+ Years

 

IT Company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
6 - 9 yrs
₹15L - ₹25L / yr
Python
Tableau
Data Analytics
Google Cloud Platform (GCP)
PowerBI
+2 more

Job Overview: Data Analyst


  • Strong proficiency in Python programming.
  • Preferred knowledge of cloud technologies, especially in Google Cloud Platform (GCP).
  • Experience with visualization tools such as Grafana, PowerBI, and Tableau.
  • Good to have knowledge of AI/ML models.
  • Must have extensive knowledge in Python analytics, particularly in exploratory data analysis (EDA).
IT Company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
3 - 6 yrs
₹18L - ₹20L / yr
PySpark
Data Science
Python
NumPy
Generative AI
+8 more

Job Overview: Data Scientist (AI/ML)


  • 3 to 6 years of experience in AI/ML
  • Programming languages: Python, SQL, NoSQL
  • Frameworks: Spark (PySpark), scikit-learn, SciPy, NumPy, NLTK
  • DL frameworks: TensorFlow, PyTorch, LLMs (Transformers/DeepSeek/Llama), Hugging Face, LLM deployment and inference
  • Gen AI framework: LangChain
  • Cloud: AWS
  • Tools: Tableau, Grafana
  • LLM, Gen AI, OCR (optical character recognition)
  • Notice period: Immediate to 15 days
IT Company
Agency job
via Jobdost by Saida Jabbar
Pune, Bengaluru (Bangalore), Hyderabad
7 - 12 yrs
₹25L - ₹30L / yr
NodeJS (Node.js)
Amazon Web Services (AWS)
Migration
Python
AWS services
+3 more

Job Title: Senior Node.js and Python Azure Developer (AWS to Azure Migration Expert)

 

Experience: 7-10 Yrs.

 

Primary Skills:

 

Node.js and Python

 

Hands-on experience with Azure, Serverless (Azure Functions)

 

AWS to Azure Cloud Migration (Preferred)

 

 Scope of Work:

 

  • Hands-on experience in migration of Node.js and Python applications from AWS to the Azure environment
  • Analyse source architecture, source code and AWS service dependencies to identify code remediation scenarios.
  • Perform code remediation/refactoring and configuration changes required to deploy the application on Azure, including Azure service dependencies and other application dependency remediations at the source code level.
  • 7+ years of experience in application development with Node.js and Python
  • Experience in unit testing, application testing support and troubleshooting on Azure.
  • Experience in application deployment scripts/pipelines, App Service, APIM, AKS/microservices/containerized apps, Kubernetes, Helm charts.
  • Hands-on experience in developing apps for AWS and Azure (Must Have)
  • Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments.
  • Understanding of Azure infrastructure services required for hosting applications on Azure PaaS or Serverless.
  • Tech stack details:
  • Confluent Kafka AWS S3 Sync connector
  • Azure Blob Storage
  • AWS Lambda to Azure Functions (Serverless) – Python or Node.js
  • Node.js REST API
  • S3 to Azure Blob Storage
  • AWS to Azure SDK conversion (Must Have)

 

 

Educational qualification:

 

B.E/B.Tech/MCA

 


VyTCDC
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
Python
AIML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


Pace Wisdom Solutions
Bengaluru (Bangalore)
2 - 5 yrs
₹5L - ₹12L / yr
Odoo (OpenERP)
Python
JavaScript
HTML/CSS

Location: Bengaluru/Mangaluru 

Experience required: 2-5 years 

Key skills:  Odoo Development, Python, Frontend Technologies 

Designation: SE L1/L2/L3/ ATL 

 

Job Summary:  

We are seeking a skilled and proactive Odoo Developer to join our dynamic team. The ideal candidate will have hands-on experience in customizing, developing, and maintaining Odoo modules, with a deep understanding of Python and business processes. You will play a key role in requirement gathering, technical design, development, testing, and deployment.  


Key Responsibilities:  

  • Develop, customize, and maintain Odoo modules as per business requirements.  
  • Analyze, design, and develop new modules and features in Odoo ERP.  
  • Troubleshoot, debug, and upgrade existing Odoo modules.  
  • Integrate Odoo with third-party platforms using APIs/web services.  
  • Provide technical support and training to end-users.  
  • Collaborate with functional consultants and stakeholders to gather requirements and deliver scalable ERP solutions.  
  • Write clean, reusable, and efficient Python code and maintain technical documentation.  


Required Skills & Qualifications:  

  • 2-5 years of proven experience as an Odoo Developer.  
  • Strong knowledge of Python, PostgreSQL, and Odoo framework (ORM, QWeb, XML).  
  • Experience in Odoo custom module development and Odoo standard modules   
  • Good understanding of Odoo backend and frontend (JavaScript, HTML, CSS).  
  • Experience with Odoo APIs and web services (REST/SOAP).  
  • Familiarity with Linux environments, Git version control.  
  • Ability to work independently and in a team with minimal supervision.  
  • Good analytical and problem-solving skills.  
  • Strong verbal and written communication skills.
  • Knowledge of Odoo deployment (Linux, Docker, Nginx, Odoo.sh) is a plus.

 

About the Company:   


Pace Wisdom Solutions is a deep-tech Product engineering and consulting firm. We have offices in San Francisco, Bengaluru, and Singapore. We specialize in designing and developing bespoke software solutions that cater to solving niche business problems.  


We engage with our clients at various stages:  


  • Right from the idea stage to scope out business requirements.  
  • Design & architect the right solution and define tangible milestones.  
  • Setup dedicated and on-demand tech teams for agile delivery.  
  • Take accountability for successful deployments to ensure efficient go-to-market Implementations. 
Intellikart Ventures LLP
Posted by ramandeep intellikart
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹25L / yr
LangChain
LangGraph
Linux kernel
LLMs
Prompt engineering
+3 more

Job Summary:

We are hiring a Data Scientist – Gen AI with hands-on experience in developing Agentic AI applications using frameworks like LangChain, LangGraph, Semantic Kernel, or Microsoft Copilot. The ideal candidate will be proficient in Python, LLMs, and prompt engineering techniques such as RAG and Chain-of-Thought prompting.


Key Responsibilities:

  • Build and deploy Agent AI applications using LLM frameworks.
  • Apply advanced prompt engineering (Zero-Shot, Few-Shot, CoT).
  • Integrate Retrieval-Augmented Generation (RAG).
  • Develop scalable solutions in Python using NumPy, Pandas, TensorFlow/PyTorch.
  • Collaborate with teams to deliver business-aligned Gen AI solutions.


Must-Have Skills:

  • Experience with LangChain, LangGraph, or similar (priority given).
  • Strong understanding of LLMs, RAG, and prompt engineering.
  • Proficiency in Python and relevant ML libraries.


Nice-to-Have:

  • Wrapper API development for LLMs.
  • REST API integration within Agentic workflows.


Qualifications:

  • Bachelor’s/Master’s in CS, Data Science, AI, or related.
  • 4–7 years in AI/ML/Data Science, with 1–2 years in Gen AI/LLMs.
Edstellar.com
Posted by partha Sarathy
Bengaluru (Bangalore)
0 - 0 yrs
₹3L - ₹3L / yr
HTML/CSS
JavaScript
Python
Git
Version Control
+3 more

Greetings from Edstellar!

We are looking for a Vibe Coder at entry level.


Position Overview

We're seeking passionate fresh graduates who are natural Vibe Coders - developers who code with intuition, creativity, and genuine enthusiasm for building amazing applications. Perfect for recent grads who bring fresh energy and innovative thinking to development.


Key Responsibilities

  • Build dynamic web and mobile applications with creative flair
  • Code with passion and embrace experimental approaches
  • Learn and implement emerging technologies rapidly
  • Collaborate in our innovation-friendly environment
  • Prototype ideas and iterate with speed and creativity
  • Bring fresh perspectives to development challenges


Required Qualifications

Education: Bachelor's in Computer Science/IT or related field

Experience: Fresh graduate (0-1 years)


Technical Skills:

  • Solid programming fundamentals (any language)
  • Basic web development (HTML, CSS, JavaScript)
  • Understanding of application development concepts
  • Familiarity with Git/version control
  • Creative problem-solving mindset


Preferred:

  • Good understanding of Python, JavaScript frameworks, or a modern tech stack
  • AI tool familiarity
  • Mobile development interest
  • Open source contributions


Vibe Coder DNA

  • Passionate about coding and building innovative apps
  • Thrives with creative freedom and flexible approaches
  • Loves experimenting with new technologies
  • Values innovation and thinking outside the box
  • Natural curiosity and eagerness to learn
  • Collaborative spirit with independent drive
  • Resilient and adaptable to change



Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Anywhere in India, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

·        6-7 years of IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake

·        Strong experience in building/designing data warehouses or data lakes, and end-to-end data mart implementation experience, focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience with Snowflake features, including new Snowflake features.

·        Should have good experience in Python/PySpark.

·        Should have experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools like Airflow

·        Should have a good understanding of ETL/ELT processes and ETL tools.

Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

·        5+ years of IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake

·        Strong experience in building/designing data warehouses or data lakes, and end-to-end data mart implementation experience, focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience with Snowflake features, including new Snowflake features.

·        Should have good experience in Python/PySpark.

·        Should have experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools like Airflow

·        Should have a good understanding of ETL/ELT processes and ETL tools.

IndArka Energy Pvt Ltd
Posted by Mita Hemant
Bengaluru (Bangalore)
3 - 4 yrs
₹18L - ₹20L / yr
Python
Django
Data Structures
Algorithms

About us

Arka Energy is focused on changing the paradigm on energy. Arka focuses on creating innovative renewable energy solutions for residential customers. With its custom product design and an innovative approach to marketing the product solution, Arka aims to be a leading provider of energy solutions in the residential solar segment. Arka designs and develops end-to-end renewable energy solutions with teams in Bangalore and in the Bay Area.

The product is 3D simulation software that replicates rooftops and commercial sites, places solar panels, and estimates the solar energy generated.

What are we looking for?

·        As a backend developer you will be responsible for developing solutions that will enable Arka solutions to be easily adopted by customers.

·        Attention to detail and willingness to learn is a big part of this position.

·        Commitment to problem solving, and innovative design approaches are important.

Role and responsibilities

●       Develop cloud-based Python Django software products

●       Working closely with UX and Front-end Developers

●       Participating in architectural, design, and product discussions

●       Designing and creating RESTful APIs for internal and partner consumption

●       Working in an agile environment with an excellent team of engineers

●       Own/maintain code everything from development to fixing bugs/issues.

●       Deliver clean, reusable high-quality code

●       Facilitate problem diagnosis and resolution for issues reported by Customers

●       Deliver to schedule and timelines based on an Agile/Scrum-based approach

●       Develop new features and ideas to make product better and user centric.

●       Must be able to independently write code and test major features, as well as work jointly with other team members to deliver complex changes

●       Create algorithms from scratch and implement them in the software.

●       Code Review, End to End Unit Testing.

●       Guiding and monitoring Junior Engineers.



SKILL REQUIREMENTS

●       Solid database skills with a relational database (e.g., PostgreSQL, MySQL)

●       Knowledge of how to build and use RESTful APIs

●        Strong knowledge of version control (i.e. git, svn, etc.)

●        Experience deploying Python applications into production

●        Azure or Google cloud infrastructure knowledge is a plus

●       Strong drive to learn new technologies

●       Ability to learn new technologies quickly

●       Continuous look-out for new and creative solutions to implement new features or improve old ones

●       Data Structures, Algorithms, Django and Python

 

 

 

Good To have

·        Knowledge of GenAI applications.

 

 

Key Benefits

·        Competitive development environment

·        Engagement into full scale systems development

·        Competitive Salary

·        Flexible working environment

·        Equity in an early-stage start-up

·        Patent Filing Bonuses

·        Health Insurance for Employee + Family

 

VyTCDC
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
Data Science
Python
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques. The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in the NLP and chatbot domains is preferred.

We acknowledge the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data-, ML- nor software-engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other DS and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and no-SQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms, and techniques in machine learning, and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.
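
By way of illustration, a small, hypothetical sketch of a supervised-learning workflow with evaluation, assuming scikit-learn is available; the data here is synthetic and the model choice is arbitrary.

```python
# Synthetic-data example: preprocessing + model in one pipeline, scored by cross-validation.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

pipe = Pipeline([
    ("scale", StandardScaler()),                  # simple feature engineering step
    ("clf", LogisticRegression(max_iter=1000)),   # baseline classifier
])

scores = cross_val_score(pipe, X, y, cv=5, scoring="f1")  # evaluation metric tracked per fold
print(f"Mean F1 across folds: {scores.mean():.3f}")
```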

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, LangChain, AutoGen, etc.

· Certifications or courses in data science.

Education:

· Master's/Bachelor's/PhD degree in Computer Science, Engineering, Data Science, or a related field.

Process Skills:

· Understanding of  Agile and Scrum  methodologies.  

· Ability to follow SDLC processes and contribute to technical documentation.  

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Bhavya M
Posted by Bhavya M
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Chef
skill iconPython

Key Responsibilities:

· Lead the design and implementation of scalable infrastructure using IaC principles.

· Develop and manage configuration management tools primarily with Chef.

· Write and maintain automation scripts in Python to streamline infrastructure tasks.

· Build, manage, and version infrastructure using Terraform.

· Collaborate with cloud architects and DevOps teams to ensure highly available, secure, and scalable systems.

· Provide guidance and mentorship to junior engineers.

· Monitor infrastructure performance and provide optimization recommendations.

· Ensure compliance with best practices for security, governance, and automation.

· Maintain and improve CI/CD pipelines with infrastructure integration.

· Support incident management, troubleshooting, and root cause analysis for infrastructure issues.


Required Skills & Experience:

· Strong hands-on experience in:

o Chef (Cookbooks, Recipes, Automation)

o Python (Scripting, automation tasks, REST APIs)

o Terraform (Modules, state management, deployments)

· Experience in AWS services (EC2, VPC, IAM, S3, etc.)

· Familiarity with Windows administration and automation.

· Solid understanding of CI/CD processes, infrastructure lifecycle, and Git-based workflow
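
As a rough illustration of the Python automation side of this role, a short boto3 sketch that reports running EC2 instances missing an Owner tag. The region, tag key, and credentials setup are assumptions, not details from the listing.

```python
# Hypothetical housekeeping script: find running EC2 instances without an "Owner" tag.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

def untagged_instance_ids():
    ids = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if "Owner" not in tags:
                    ids.append(instance["InstanceId"])
    return ids

if __name__ == "__main__":
    print(untagged_instance_ids())
```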

Read more
Moative

at Moative

3 candid answers
Eman Khan
Posted by Eman Khan
Chennai, Bengaluru (Bangalore)
1 - 6 yrs
₹15L - ₹30L / yr
MLOps
MLFlow
kubeflow
Windows Azure
skill iconMachine Learning (ML)
+4 more

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Role

We seek experienced ML/AI professionals with strong backgrounds in computer science, software engineering, or related fields to join our Azure-focused MLOps team. If you’re passionate about deploying complex machine learning models in real-world settings, bridging the gap between research and production, and working on high-impact projects, this role is for you.


Work you’ll do

As an operations engineer, you’ll oversee the entire ML lifecycle on Azure—spanning initial proofs-of-concept to large-scale production deployments. You’ll build and maintain automated training, validation, and deployment pipelines using Azure DevOps, Azure ML, and related services, ensuring models are continuously monitored, optimized for performance, and cost-effective. By integrating MLOps practices such as MLflow and CI/CD, you’ll drive rapid iteration and experimentation. In close collaboration with senior ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade ML solutions that directly impact business outcomes. 
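
For a concrete flavour of the tracking piece of this lifecycle, a minimal MLflow sketch; the experiment name, model, and metric are illustrative assumptions rather than the team's actual setup.

```python
# Illustrative MLflow tracking run: log a parameter, a metric, and the fitted model.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-regression")  # hypothetical experiment name
with mlflow.start_run():
    model = Ridge(alpha=0.5)
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("mse", mse)
    mlflow.sklearn.log_model(model, "model")  # stored under the run's artifacts
```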


Responsibilities

  • ML-focused DevOps: Set up robust CI/CD pipelines with a strong emphasis on model versioning, automated testing, and advanced deployment strategies on Azure.
  • Monitoring & Maintenance: Track and optimize the performance of deployed models through live metrics, alerts, and iterative improvements.
  • Automation: Eliminate repetitive tasks around data preparation, model retraining, and inference by leveraging scripting and infrastructure as code (e.g., Terraform, ARM templates).
  • Security & Reliability: Implement best practices for securing ML workflows on Azure, including identity/access management, container security, and data encryption.
  • Collaboration: Work closely with the data science teams to ensure model performance is within agreed SLAs, both for training and inference.


Skills & Requirements

  • 2+ years of hands-on programming experience with Python (PySpark or Scala optional).
  • Solid knowledge of Azure cloud services (Azure ML, Azure DevOps, ACI/AKS).
  • Practical experience with DevOps concepts: CI/CD, containerization (Docker, Kubernetes), infrastructure as code (Terraform, ARM templates).
  • Fundamental understanding of MLOps: MLflow or similar frameworks for tracking and versioning.
  • Familiarity with machine learning frameworks (TensorFlow, PyTorch, XGBoost) and how to operationalize them in production.
  • Broad understanding of data structures and data engineering.


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term, while acting with urgency. Our ethos is rooted in innovation, efficiency and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless.


Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps on purpose, unless we constantly question it. We are deliberate about committing to rituals that take time away from the actual work. We truly believe that a meeting that could be an email, should be an email and you don’t need a person with the highest title to say that loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work is of interest to you, please apply here. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Read more
Potentiam
karishma raj
Posted by karishma raj
Bengaluru (Bangalore)
6 - 12 yrs
₹22L - ₹30L / yr
skill iconPython
skill iconDjango

About Potentiam

Potentiam helps SME companies build world-class offshore teams. Our model is our locations and your dedicated staff under your control. Potentiam have offices in Iasi in Romania, Bangalore and Cape Town, home to large liquid pools of offshore talent working for international companies. Potentiam's management team have over 15 years' experience in building offshore teams, and have specialist functional expertise to support the transition offshore in technology, finance, operations, engineering, digital marketing, and analytics. For decades corporations' scale has enabled them to benefit from the cost and skills advantage of offshore operations. Now SME companies can enjoy a similar benefit through Potentiam without any upfront investment.


Location : Bangalore ( Hybrid)


Experience - 6+ Years



Professional Experience:

  • Experience using a Python backend web framework (like Django, Flask or FastAPI)
  • In particular, experience building performant and reliable APIs and integrations
  • Competency using SQL and ORMs
  • Some experience with frontend web development using a JavaScript framework (such as Vue.js or React) would be a bonus
  • Understanding of some of the following: Django Rest Framework, PostgreSQL, Celery, Docker, nginx, AWS

Benefits and Perks

  • Health Insurance
  • Referral Bonus
  • Performance Bonus
  • Flexible Working options


Job Types: Full-time, Permanent


Read more
Potentiam
Dipanjan Das
Posted by Dipanjan Das
Bengaluru (Bangalore)
5 - 10 yrs
₹25L - ₹35L / yr
skill iconPython
machine Learning models
NumPy
skill iconDocker

● Proven experience in training, evaluating and deploying machine learning models

● Solid understanding of data science and machine learning concepts

● Experience with machine learning / data engineering tech in Python (such as NumPy, PyTorch, pandas/polars, Airflow, etc.)

● Experience developing data products using large language models, prompt engineering, and model evaluation.

● Experience with web services and programming (such as Python, docker, databases etc.)  

● Understanding of some of the following: FastAPI, PostgreSQL, Celery, Docker, AWS, Modal, git, continuous integration. 

Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Up to ₹12L / yr (varies)
skill iconPython
skill iconDjango
FastAPI
SQL
NOSQL Databases
+3 more

About Role

We are seeking a skilled Backend Engineer with 2+ years of experience to join our dynamic team, focusing on building scalable web applications using Python frameworks (Django/FastAPI) and cloud technologies. You'll be instrumental in developing and maintaining our cloud-native backend services.


Responsibilities:

  1. Design and develop scalable backend services using Django and FastAPI
  2. Create and maintain RESTful APIs
  3. Implement efficient database schemas and optimize queries
  4. Implement containerisation using Docker and container orchestration
  5. Design and implement cloud-native solutions using microservices architecture
  6. Participate in technical design discussions, code reviews and maintain coding standards
  7. Document technical specifications and APIs
  8. Collaborate with cross-functional teams to gather requirements, prioritise tasks, and contribute to project completion.

Requirements:

  1. Experience with Django and/or FastAPI (2+ years)
  2. Proficiency in SQL and ORM frameworks
  3. Docker containerisation and orchestration
  4. Proficiency in shell scripting (Bash/PowerShell)
  5. Understanding of microservices architecture
  6. Experience building serverless backends
  7. Knowledge of deployment and debugging on cloud platforms (AWS/Azure)
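
As an illustrative sketch of the kind of backend service described above, here is a minimal FastAPI app with an in-memory store standing in for a real database; the Item schema and routes are assumptions made for the example.

```python
# Hypothetical FastAPI service: two REST endpoints backed by an in-memory dict.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="items-service")

class Item(BaseModel):
    name: str
    price: float

store = {}  # placeholder for a real database layer

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item):
    store[item_id] = item
    return {"id": item_id, "item": item}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id not in store:
        raise HTTPException(status_code=404, detail="Item not found")
    return store[item_id]

# Run locally with: uvicorn main:app --reload
```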
Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
skill iconGo Programming (Golang)
skill iconAmazon Web Services (AWS)
skill iconPython

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
Mirorin

at Mirorin

2 candid answers
Indrani Dutta
Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 8 yrs
₹6L - ₹14L / yr
skill iconMongoDB
skill iconDjango
WebSocket
skill iconRedux/Flux
SQL
+8 more

Role Overview

·        We are seeking a passionate and experienced Full Stack Developer skilled in MERN stack and Python (Django/Flask) to build and scale high-impact features across our web and mobile platforms. You will collaborate with cross-functional teams to deliver seamless user experiences and robust backend systems.

 

Key Responsibilities

·        Design, develop, and maintain scalable web applications using MySQL/Postgres, MongoDB, Express.js, React.js, and Node.js

·        Build and manage RESTful APIs and microservices using Python (Django/Flask/FastAPI)

·        Integrate with third-party platforms like OpenAI, WhatsApp APIs (Whapi), Interakt, and Zoho

·        Optimize performance across the frontend and backend

·        Collaborate with product managers, designers, and other developers to deliver high-quality features

·        Ensure security, scalability, and maintainability of code

·        Write clean, reusable, and well-documented code

·        Contribute to DevOps, CI/CD, and server deployment workflows (AWS/Lightsail)

·        Participate in code reviews and mentor junior developers if needed

 

Required Skills

·        Strong experience with MERN Stack: MongoDB, Express.js, React.js, Node.js

·        Proficiency in Python and web frameworks like Django, Flask, or FastAPI

·        Experience working with REST APIs, JWT/Auth, and WebSockets

·        Good understanding of frontend design systems, state management (Redux/Context), and responsive UI

·        Familiarity with database design and queries (MongoDB, PostgreSQL/MySQL)

·        Experience with Git, Docker, and deployment pipelines

·        Comfortable working in Linux-based environments (e.g., Ubuntu on AWS)

 

Bonus Skills

·        Experience with AI integrations (e.g., OpenAI, LangChain)

·        Familiarity with WooCommerce, WordPress APIs

·        Experience in chatbot development or WhatsApp API integration

 

Who You Are

·        You are a problem-solver with a product-first mindset

·        You care about user experience and performance

·        You enjoy working in a fast-paced, collaborative environment

·        You have a growth mindset and are open to learning new technologies

 

Why Join Us?

·        Work at the intersection of healthcare, community, and technology

·        Directly impact the lives of women across India and beyond

·        Flexible work environment and collaborative team

·        Opportunity to grow with a purpose-driven startup

Read more
Mirorin

at Mirorin

2 candid answers
Indrani Dutta
Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 8 yrs
₹6L - ₹15L / yr
SQL
skill iconPython
skill iconData Analytics
Business Intelligence (BI)

Role Overview

We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.

You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.

 

Key Responsibilities

·        Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).

·        Build dashboards and reports to track KPIs, user behaviour, and marketing performance.

·        Collaborate with product, marketing, and customer teams to uncover actionable insights.

·        Support experiments, A/B testing, and cohort analysis to drive growth and retention.

·        Assist in documentation and communication of findings to technical and non-technical teams.

·        Work with the data team to enhance personalization and AI features (optional).

 

Required Qualifications

·        Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.

·        2 – 4 years of experience in data analysis or business intelligence.

·        Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).

·        Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.)

·        Ability to translate complex data into simple visual stories and clear recommendations.

·        Strong attention to detail and a mindset for experimentation.
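
To make the day-to-day concrete, a small pandas sketch computing a weekly-active-users KPI from an event export; the file name and column names are hypothetical.

```python
# Illustrative analysis: weekly active users (WAU) from a hypothetical event log.
import pandas as pd
import matplotlib.pyplot as plt

events = pd.read_csv("app_events.csv", parse_dates=["event_time"])  # assumed export

# Bucket events into calendar weeks and count distinct users per week.
events["week"] = events["event_time"].dt.to_period("W").dt.start_time
wau = events.groupby("week")["user_id"].nunique()

wau.plot(title="Weekly active users")
plt.tight_layout()
plt.savefig("wau.png")
print(wau.tail())
```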

 

Preferred (Not Mandatory)

·        Exposure to GenAI, LLMs (e.g., OpenAI, HuggingFace), or NLP concepts.

·        Experience working with healthcare, wellness, or e-commerce datasets.

·        Familiarity with REST APIs, JSON structures, or chatbot systems.

·        Interest in building tools that impact women’s health and wellness. 


Why Join Us?

·        Be part of a high-growth startup tackling a real need in women’s healthcare.

·        Work with a passionate, purpose-driven team.

·        Opportunity to grow into GenAI/ML-focused roles as we scale.

·        Competitive salary and career progression

 

 


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Bengaluru (Bangalore)
8 - 12 yrs
Best in industry
skill iconPython
Terraform
Chef

Job Summary:

The Lead IaC Engineer will design, implement, automate, and maintain infrastructure across on-premises and cloud environments. This role should have strong hands-on expertise in Chef, Python, Terraform, and some AWS & Windows administration knowledge.


8-12 years of experience

Primary Skills – Chef, Python, and Terraform

Secondary – AWS & Windows admin (Cloud is not mandatory)

Read more
Trantor

at Trantor

1 recruiter
Nikita Sinha
Posted by Nikita Sinha
Remote, Bengaluru (Bangalore)
6 - 10 yrs
Up to ₹22L / yr (varies)
skill iconPython
SQL
CI/CD

We are looking for an experienced and detail-oriented Senior Performance Testing Engineer to join our QA team. The ideal candidate will be responsible for designing, developing, and executing scalable and reliable performance testing strategies. You will lead performance engineering initiatives using tools like Locust, Python, Docker, Kubernetes, and cloud-native environments (AWS), ensuring our systems meet performance SLAs under real-world usage patterns.


Key Responsibilities

  • Develop, enhance, and maintain Locust performance scripts using Python
  • Design realistic performance scenarios simulating real-world traffic and usage patterns
  • Parameterize and modularize scripts for robustness and reusability
  • Execute performance tests in containerized environments using Docker and Kubernetes
  • Manage performance test execution on Kubernetes clusters
  • Integrate performance tests into CI/CD pipelines in collaboration with DevOps and Development teams
  • Analyze performance test results, including throughput, latency, response time, and error rates
  • Identify performance bottlenecks, conduct root cause analysis, and suggest optimizations
  • Work with AWS (or other cloud platforms) to deploy, scale, and monitor tests in cloud-native environments
  • Write and optimize complex SQL queries, stored procedures, and perform DB performance testing
  • Work with SQL Server extensively; familiarity with Postgres is a plus
  • Develop and maintain performance testing strategies and test plans
  • Define and track KPIs, SLAs, workload models, and success criteria
  • Guide the team on best practices and promote a performance engineering mindset

Must-Have Qualifications

  • Proven hands-on experience with Locust and Python for performance testing
  • Working knowledge of microservices architecture
  • Hands-on with Kubernetes and Docker, especially in the context of running Locust at scale
  • Experience integrating performance tests in CI/CD pipelines
  • Strong experience with AWS or similar cloud platforms for deploying and scaling tests
  • Solid understanding of SQL Server, including tuning stored procedures and query optimization
  • Strong experience in performance test planning, execution, and analysis
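
A minimal locustfile of the kind this role would write and scale out; the host, endpoints, payload, and think times are hypothetical.

```python
# locustfile.py (illustrative): simulate a simple browse-and-order traffic mix.
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 3)                 # think time between requests
    host = "https://staging.example.com"      # assumed test environment

    @task(3)
    def browse_catalog(self):
        self.client.get("/api/products", name="GET /api/products")

    @task(1)
    def place_order(self):
        self.client.post(
            "/api/orders",
            json={"product_id": 42, "quantity": 1},
            name="POST /api/orders",
        )

# Example headless run: locust -f locustfile.py --headless -u 100 -r 10 -t 5m
```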

Good-to-Have Skills

  • Exposure to Postgres DB
  • Familiarity with observability tools like Prometheus, Grafana, CloudWatch, and Datadog
  • Basic knowledge of APM (Application Performance Monitoring) tools
Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹10L - ₹24L / yr
skill iconPython
FastAPI
skill iconFlask
API management
RESTful APIs
+8 more

Job Title : Python Developer – API Integration & AWS Deployment

Experience : 5+ Years

Location : Bangalore

Work Mode : Onsite


Job Overview :

We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.

The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.


Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.


Key Responsibilities :

Python Development & API Integration :

  • Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
  • Automate simulations and workflows using the PSS®E Python API (psspy).
  • Implement robust bulk case processing, result extraction, and automated reporting systems.


AWS Cloud Deployment :

  • Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
  • Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
  • Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.


Required Skills :

  • 5+ Years of professional experience in Python development.
  • Hands-on experience with RESTful API development (FastAPI/Flask).
  • Solid experience working with PSS®E and its psspy Python API.
  • Strong understanding of AWS services, deployment, and best practices.
  • Proficiency in automation, scripting, and report generation.
  • Knowledge of cloud security and monitoring tools like IAM and CloudWatch.

Good to Have :

  • Experience in power system simulation and electrical engineering concepts.
  • Familiarity with CI/CD tools for AWS deployments.
Read more
I-Stem

at I-Stem

2 candid answers
Sahil Garg
Posted by Sahil Garg
Bengaluru (Bangalore)
2 - 4 yrs
₹20L - ₹25L / yr
skill iconPython
PyTorch
TensorFlow
skill iconDocker
skill iconKubernetes
+2 more

You will:

  • Collaborate with the I-Stem Voice AI team and CEO to design, build and ship new agent capabilities
  • Develop, test and refine end-to-end voice agent models (ASR, NLU, dialog management, TTS)
  • Stress-test agents in noisy, real-world scenarios and iterate for improved robustness and low latency
  • Research and prototype cutting-edge techniques (e.g. robust speech recognition, adaptive language understanding)
  • Partner with backend and frontend engineers to seamlessly integrate AI components into live voice products
  • Monitor agent performance in production, analyze failure cases, and drive continuous improvement
  • Occasionally demo our Voice AI solutions at industry events and user forums
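
As a toy illustration of the ASR end of such a pipeline, a Hugging Face Transformers snippet; the checkpoint and audio file are assumptions, not the team's actual stack.

```python
# Illustrative speech-to-text call using a generic ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="openai/whisper-small",  # hypothetical checkpoint choice
)

result = asr("sample_call.wav")    # path to a local audio file (assumed)
print(result["text"])
```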


You are:

  • An AI/Software Engineer with hands-on experience in speech-centric ML (ASR, NLU or TTS)
  • Skilled in building and tuning transformer-based speech models and handling real-time audio pipelines
  • Obsessed with reliability: you design experiments to push agents to their limits and root-cause every error
  • A clear thinker who deconstructs complex voice interactions from first principles
  • Passionate about making voice technology inclusive and accessible for diverse users
  • Comfortable moving fast in a small team, yet dogged about code quality, testing and reproducibility


Read more
hirezyai
Aardra Suresh
Posted by Aardra Suresh
Bengaluru (Bangalore), Mumbai
7 - 14 yrs
₹15L - ₹30L / yr
skill iconPython
AWS Lambda
skill iconDocker
API
S3
+4 more

We are seeking a highly skilled Python Backend Developer with strong experience in building microservices-based architectures and cloud-native serverless solutions on AWS. The ideal candidate will be responsible for designing, developing, and maintaining scalable backend systems, ensuring high performance and responsiveness to requests from front-end applications and third-party systems.

 

Key Responsibilities:

  • Design and develop robust backend services and RESTful APIs using Python (FastAPI, Flask, or Django)
  • Build and deploy microservices that are scalable, loosely coupled, and independently deployable
  • Develop and manage serverless applications using AWS Lambda, API Gateway, DynamoDB, S3, SNS, SQS, and Step Functions
  • Implement event-driven architectures and data processing pipelines
  • Collaborate with front-end developers, DevOps, and product teams to deliver high-quality software
  • Ensure code quality through unit testing, integration testing, and code reviews
  • Automate deployments using CI/CD pipelines and Infrastructure as Code (IaC) tools like CloudFormation or Terraform
  • Monitor, debug, and optimize backend systems for performance and scalability

 

Required Skills & Experience:

  • 7+ years of backend development experience using Python
  • Strong experience in designing and implementing microservices
  • Hands-on experience with AWS Serverless services: Lambda, API Gateway, S3, DynamoDB, SQS, SNS, etc.
  • Proficient in RESTful API design, JSON, and OpenAPI/Swagger specifications
  • Experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
  • Knowledge of CI/CD tools (e.g., GitHub Actions, Jenkins, CodePipeline)
  • Familiarity with Docker and containerized deployments
  • Strong understanding of software design patterns, clean code practices, and Agile methodologies
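
For illustration, a minimal serverless handler of the kind described above, assuming an API Gateway proxy integration and a DynamoDB table; the table name, environment variable, and payload shape are hypothetical.

```python
# Hypothetical Lambda handler: accept an order via API Gateway and persist it to DynamoDB.
import json
import os
import boto3

TABLE_NAME = os.environ.get("ORDERS_TABLE", "orders")  # assumed environment variable
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    order_id = body.get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id is required"})}

    table.put_item(Item={"order_id": order_id, "status": "received"})
    return {"statusCode": 201, "body": json.dumps({"order_id": order_id})}
```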

 

Nice to Have:

  • Experience with GraphQL or gRPC
  • Exposure to monitoring/logging tools (e.g., CloudWatch, ELK, Prometheus)
  • Knowledge of security best practices in API and cloud development
  • Familiarity with data streaming using Kafka or Kinesis


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sonali RajeshKumar
Posted by Sonali RajeshKumar
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Databases
SQL
IBM DB2
skill iconPython

Job Description: 

Years of Experience: 5-8 Years

Location: Bangalore


Job Role:- Database Developer


Primary Skill - Database, SQL


Secondary skill - DB2 and Python


Skills:


Main Pointers for Database Developer role.


*Should have strong working experience with a database such as DB2 (good to have), SQL, or Oracle PL/SQL.

*Should have working experience in performance tuning

Read more
Indee

at Indee

2 candid answers
1 recruiter
Nikita Sinha
Posted by Nikita Sinha
Remote, Bengaluru (Bangalore)
5yrs+
Up to ₹22L / yr (varies)
Selenium
skill iconPython
Manual testing
cypress
Test Automation (QA)
+2 more

Must-Have Skills & Qualifications:

  • Bachelor's degree in Engineering (Computer Science, IT, or related field)
  • 5–6 years of experience in manual testing of web and mobile applications
  • Working knowledge of test automation tools: Selenium
  • Experience with API testing using tools like Postman or equivalent
  • Experience with BDD
  • Strong understanding of test planning, test case design, and defect tracking processes
  • Experience leading QA for projects and production releases
  • Familiarity with Agile/Scrum methodologies
  • Effective collaboration skills – able to work with cross-functional teams and contribute to automation efforts as needed

Good-to-Have Skills:

  • Familiarity with CI/CD pipelines and version control tools (Git, Jenkins)
  • Exposure to performance or security testing
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Hyderabad
4 - 8 yrs
₹10L - ₹24L / yr
skill iconPython
Data engineering
skill iconAmazon Web Services (AWS)
RESTful APIs
Microservices
+9 more

Job Title : Python Data Engineer

Experience : 4+ Years

Location : Bangalore / Hyderabad (On-site)


Job Summary :

We are seeking a skilled Python Data Engineer to work on cloud-native data platforms and backend services.

The role involves building scalable APIs, working with diverse data systems, and deploying containerized services using modern cloud infrastructure.


Mandatory Skills : Python, AWS, RESTful APIs, Microservices, SQL/PostgreSQL/NoSQL, Docker, Kubernetes, CI/CD (Jenkins/GitLab CI/AWS CodePipeline)


Key Responsibilities :

  • Design, develop, and maintain backend systems using Python.
  • Build and manage RESTful APIs and microservices architectures.
  • Work extensively with AWS cloud services for deployment and data storage.
  • Implement and manage SQL, PostgreSQL, and NoSQL databases.
  • Containerize applications using Docker and orchestrate with Kubernetes.
  • Set up and maintain CI/CD pipelines using Jenkins, GitLab CI, or AWS CodePipeline.
  • Collaborate with teams to ensure scalable and reliable software delivery.
  • Troubleshoot and optimize application performance.


Must-Have Skills :

  • 4+ years of hands-on experience in Python backend development.
  • Strong experience with AWS cloud infrastructure.
  • Proficiency in building microservices and APIs.
  • Good knowledge of relational and NoSQL databases.
  • Experience with Docker and Kubernetes.
  • Familiarity with CI/CD tools and DevOps processes.
  • Strong problem-solving and collaboration skills.
Read more
A leader in telecom, fintech, AI-led marketing automation.

Agency job
via Infinium Associate by Toshi Srivastava
Bengaluru (Bangalore)
9 - 15 yrs
₹25L - ₹35L / yr
MERN Stack
skill iconPython
skill iconMongoDB
Spark
Hadoop
+7 more

We are looking for a talented MERN Developer with expertise in MongoDB/MySQL, Kubernetes, Python, ETL, Hadoop, and Spark. The ideal candidate will design, develop, and optimize scalable applications while ensuring efficient source code management and implementing Non-Functional Requirements (NFRs).


Key Responsibilities:

  • Develop and maintain robust applications using MERN Stack (MongoDB, Express.js, React.js, Node.js).
  • Design efficient database architectures (MongoDB/MySQL) for scalable data handling.
  • Implement and manage Kubernetes-based deployment strategies for containerized applications.
  • Ensure compliance with Non-Functional Requirements (NFRs), including source code management, development tools, and security best practices.
  • Develop and integrate Python-based functionalities for data processing and automation.
  • Work with ETL pipelines for smooth data transformations.
  • Leverage Hadoop and Spark for processing and optimizing large-scale data operations.
  • Collaborate with solution architects, DevOps teams, and data engineers to enhance system performance.
  • Conduct code reviews, troubleshooting, and performance optimization to ensure seamless application functionality.


Required Skills & Qualifications:

  • Proficiency in MERN Stack (MongoDB, Express.js, React.js, Node.js).
  • Strong understanding of database technologies (MongoDB/MySQL).
  • Experience working with Kubernetes for container orchestration.
  • Hands-on knowledge of Non-Functional Requirements (NFRs) in application development.
  • Expertise in Python, ETL pipelines, and big data technologies (Hadoop, Spark).
  • Strong problem-solving and debugging skills.
  • Knowledge of microservices architecture and cloud computing frameworks.

Preferred Qualifications:

  • Certifications in cloud computing, Kubernetes, or database management.
  • Experience in DevOps, CI/CD automation, and infrastructure management.
  • Understanding of security best practices in application development.


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore)
4 - 8 yrs
₹7L - ₹24L / yr
skill iconPython
NumPy
pandas
skill iconMachine Learning (ML)

·        Develop and maintain scalable back-end applications using Python frameworks such as Flask/Django/FastAPI.

·        Design, build, and optimize data pipelines for ETL processes using tools like PySpark, Airflow, and other similar technologies.

·        Work with relational and NoSQL databases to manage and process large datasets efficiently.

·        Collaborate with data scientists to clean, transform, and prepare data for analytics and machine learning models.

·        Work in a dynamic environment, at the intersection of software development and data engineering.
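
For illustration, a compact PySpark ETL step of the kind described above; the input path, column names, and output location are hypothetical.

```python
# Hypothetical ETL: aggregate completed orders into daily revenue and write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

orders = spark.read.csv("s3a://raw-bucket/orders/*.csv", header=True, inferSchema=True)

daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet("s3a://curated-bucket/daily_revenue/")
```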

Read more
NeoGenCode Technologies Pvt Ltd
Shivank Bhardwaj
Posted by Shivank Bhardwaj
Bengaluru (Bangalore)
6 - 9 yrs
₹15L - ₹30L / yr
skill iconNodeJS (Node.js)
Relational Database (RDBMS)
skill iconReact.js
skill iconAngular (2+)
SQL
+8 more

Role overview


  • Overall 5 to 7 years of experience. Node.js experience is a must.
  • At least 3+ years of experience or couple of large-scale products delivered on microservices.
  • Strong design skills on microservices and AWS platform infrastructure.
  • Excellent programming skill in Python, Node.js and Java.
  • Hands-on development of REST APIs.
  • Good understanding of nuances of distributed systems, scalability, and availability.


What would you do here


  • To Work as a Backend Developer in developing Cloud Web Applications
  • To be part of the team working on various types of web applications related to Mortgage Finance.
  • Experience in solving a real-world problem of Implementing, Designing and helping develop a new Enterprise-class Product from ground-up.
  • You have expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases), along with Docker and containerized solutions/applications.
  • Experienced in relational and NoSQL databases and scalable design.
  • Experience in solving challenging problems by developing elegant, maintainable code.
  • Delivered rapid iterations of software based on user feedback and metrics.
  • Help the team make key decisions on our product and technology direction.
  • You actively contribute to the adoption of frameworks, standards, and new technologies.
Read more
PGAGI
Javeriya Shaik
Posted by Javeriya Shaik
Remote, Bengaluru (Bangalore)
2 - 3 yrs
₹6L - ₹7L / yr
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
Retrieval Augmented Generation (RAG)
skill iconPython
Natural Language Processing (NLP)
+1 more

Key Responsibilities

  • Experience working with Python, LLMs, deep learning, NLP, etc.
  • Utilize GitHub for version control, including pushing and pulling code updates.
  • Work with Hugging Face and OpenAI platforms for deploying models and exploring open-source AI models.
  • Engage in prompt engineering and the fine-tuning process of AI models.

Requirements

  • Proficiency in Python programming.
  • Experience with GitHub and version control workflows.
  • Familiarity with AI platforms such as Hugging Face and OpenAI.
  • Understanding of prompt engineering and model fine-tuning.
  • Excellent problem-solving abilities and a keen interest in AI technology.


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
skill iconPython
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling Integrator and integration development, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore)
5 - 8 yrs
₹5L - ₹15L / yr
skill iconPython
Generative AI
Langchain
Streamlit
Large Language Models (LLM)
+4 more

Role : Gen AI Developer / Engineer

Location : Bangalore

Experience Required : 6+ Years

Work Mode : Hybrid (2–3 days from office per week)

Contract Duration : 12 Months

Must-Have Skills :

  • Python, GenAI, LangChain, Streamlit, LLMs.
  • Strong experience building AI/ML-based applications.
  • 2+ years of hands-on experience with LLM development.
  • Solid understanding of RAG (Retrieval-Augmented Generation), embeddings, and LLM training.
  • Proficiency in prompt engineering.
  • Hands-on experience with Azure services: Azure Search, App Services, API Management, Cosmos DB.
  • Familiarity with Azure cloud infrastructure.
  • Basic knowledge of front-end technologies like React.
  • Understanding of software engineering best practices including Git, testing, and CI/CD pipelines.
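
A toy sketch of the retrieval step behind RAG, using deterministic random vectors in place of a real embedding model so it stays self-contained; in practice embed() would call an actual embedding API, so treat this as illustrative only.

```python
# Illustrative RAG retrieval: rank documents by cosine similarity to the query embedding.
import numpy as np

documents = [
    "Reset your password from the account settings page.",
    "Invoices are emailed on the first business day of each month.",
    "Support is available around the clock via chat.",
]

def embed(text: str) -> np.ndarray:
    # Placeholder: deterministic random vector per text; a real system calls an embedding model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2):
    q = embed(query)
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    return [documents[i] for i in np.argsort(sims)[::-1][:k]]

context = retrieve("How do I change my password?")
prompt = "Answer using only this context:\n" + "\n".join(context)
print(prompt)
```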
Read more
hirezyai
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹12L - ₹25L / yr
ArgoCD
skill iconKubernetes
skill iconDocker
helm
Terraform
+9 more

Job Summary:

We are seeking a skilled DevOps Engineer to design, implement, and manage CI/CD pipelines, containerized environments, and infrastructure automation. The ideal candidate should have hands-on experience with ArgoCD, Kubernetes, and Docker, along with a deep understanding of cloud platforms and deployment strategies.

Key Responsibilities:

  • CI/CD Implementation: Develop, maintain, and optimize CI/CD pipelines using ArgoCD, GitOps, and other automation tools.
  • Container Orchestration: Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
  • Infrastructure as Code (IaC): Automate infrastructure provisioning with Terraform, Helm, or Ansible.
  • Monitoring & Logging: Implement and maintain observability tools like Prometheus, Grafana, ELK, or Loki.
  • Security & Compliance: Ensure best security practices in containerized and cloud-native environments.
  • Cloud & Automation: Manage cloud infrastructure on AWS, Azure, or GCP with automated deployments.
  • Collaboration: Work closely with development teams to optimize deployments and performance.

Required Skills & Qualifications:

  • Experience: 5+ years in DevOps, Site Reliability Engineering (SRE), or Infrastructure Engineering.
  • Tools & Tech: Strong knowledge of ArgoCD, Kubernetes, Docker, Helm, Terraform, and CI/CD pipelines.
  • Cloud Platforms: Experience with AWS, GCP, or Azure.
  • Programming & Scripting: Proficiency in Python, Bash, or Go.
  • Version Control: Hands-on with Git and GitOps workflows.
  • Networking & Security: Knowledge of ingress controllers, service mesh (Istio/Linkerd), and container security best practices.

Nice to Have:

  • Experience with Kubernetes Operators, Kustomize, or FluxCD.
  • Exposure to serverless architectures and multi-cloud deployments.
  • Certifications in CKA, AWS DevOps, or similar.


Read more