Python Jobs in Mumbai

50+ Python Jobs in Mumbai | Python Job openings in Mumbai

Apply to 50+ Python Jobs in Mumbai on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Wissen Technology

Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
Python
PySpark
Django
Flask
RESTful APIs

Job title – Python Developer

Experience – 4 to 6 years

Location – Pune / Mumbai / Bengaluru


Please find the JD below.

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP Preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.


HaystackAnalytics
Posted by Careers Hr
Navi Mumbai
1 - 4 yrs
₹6L - ₹12L / yr
Rust
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Data Science

Position – Python Developer

Location – Navi Mumbai


Who are we

Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India in the BIRAC Showcase event in Delhi, 2022.


Objectives of this Role:

  • Design and implement efficient, scalable backend services using Python.
  • Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
  • Build APIs, services, and scripts to support data processing pipelines and front-end applications.
  • Automate recurring tasks and ensure robust integration with cloud services.
  • Maintain high standards of software quality and performance using clean coding principles and testing practices.
  • Collaborate within the team to upskill and unblock each other for faster and better outcomes.





Primary Skills – Python Development

  • Proficient in Python 3 and its ecosystem
  • Frameworks: Flask / Django / FastAPI
  • RESTful API development
  • Understanding of OOP and SOLID design principles
  • Asynchronous programming (asyncio, aiohttp)
  • Experience with task queues (Celery, RQ)
  • Rust programming experience for systems-level or performance-critical components
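As a rough illustration of the asynchronous-programming skill listed above, a minimal asyncio sketch. The task names are invented; a real service might layer aiohttp on top for the HTTP calls.

```python
# Minimal asyncio sketch: fan out several simulated I/O-bound calls
# concurrently with asyncio.gather. With concurrent awaits, total time is
# roughly the longest delay rather than the sum of all delays.
import asyncio

async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stands in for a network or DB call
    return f"{name}:done"

async def main() -> list[str]:
    # run all three "requests" concurrently; results keep argument order
    return await asyncio.gather(
        fetch("samples", 0.01),
        fetch("variants", 0.02),
        fetch("reports", 0.01),
    )

results = asyncio.run(main())
print(results)
```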

Testing & Automation

  • Unit Testing: PyTest / unittest
  • Automation tools: Ansible / Terraform (good to have)
  • CI/CD pipelines

DevOps & Cloud

  • Docker, Kubernetes (basic knowledge expected)
  • Cloud platforms: AWS / Azure / GCP
  • Git and GitOps workflows
  • Familiarity with containerized deployment & serverless architecture

Bonus Skills

  • Data handling libraries: Pandas / NumPy
  • Experience with scripting: Bash / PowerShell
  • Functional programming concepts
  • Familiarity with front-end integration (REST API usage, JSON handling)

 Other Skills

  • Innovation and thought leadership
  • Interest in learning new tools, languages, workflows
  • Strong communication and collaboration skills
  • Basic understanding of UI/UX principles


To know more about us: https://haystackanalytics.in




Wissen Technology

Posted by Poornima Varadarajan
Mumbai
1 - 8 yrs
₹8L - ₹20L / yr
Object Oriented Programming (OOPs)
Data Structures
Algorithms
Python

Experience in Python (backend only), data structures, OOP, algorithms, Django, NumPy, etc.

• Good understanding of writing unit tests using PyTest.

• Good understanding of parsing XMLs and handling files using Python.

• Good understanding of databases/SQL, stored procedures, and query tuning.

• Service design concepts, OO and functional development concepts.

• Agile development methodologies.

• Strong oral and written communication skills.

• Excellent interpersonal skills and a professional approach.
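A small sketch of the XML-parsing and PyTest expectations above, using only the standard library. The `<trade>` element structure is invented for illustration.

```python
# Parse a simple XML document with the standard library, plus a
# PyTest-style unit test. pytest would discover test_parse_trades()
# automatically; here it is called directly so the sketch is self-contained.
import xml.etree.ElementTree as ET

def parse_trades(xml_text: str) -> list[dict]:
    """Parse <trade symbol="..." qty="..."/> elements into dicts."""
    root = ET.fromstring(xml_text)
    return [
        {"symbol": t.attrib["symbol"], "qty": int(t.attrib["qty"])}
        for t in root.iter("trade")
    ]

def test_parse_trades():
    xml_text = (
        '<trades>'
        '<trade symbol="INFY" qty="100"/>'
        '<trade symbol="TCS" qty="50"/>'
        '</trades>'
    )
    assert parse_trades(xml_text) == [
        {"symbol": "INFY", "qty": 100},
        {"symbol": "TCS", "qty": 50},
    ]

test_parse_trades()
```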


LearnTube.ai

Posted by Vidhi Solanki
Mumbai
2 - 5 yrs
₹8L - ₹18L / yr
Python
FastAPI
Amazon Web Services (AWS)
MongoDB
CI/CD

Role Overview:


As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.


What You'll Do:


At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:

  • Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95).
  • Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
  • Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
  • Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
  • Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
  • Own Reliability – Instrument with Prometheus / Grafana, chase 99.9% uptime, trim infra spend.
  • Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
  • Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.
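The sub-200 ms p95 target above is a percentile over observed request latencies; here is a minimal sketch of computing it with the nearest-rank definition. The sample numbers are invented, and production systems would read this from Prometheus histograms rather than a Python list.

```python
# Nearest-rank percentile over a latency sample: the smallest observed
# value that covers at least p% of the samples.
import math

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile: smallest value covering p% of samples."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based rank
    return ordered[rank - 1]

latencies_ms = [120, 95, 180, 210, 130, 110, 160, 140, 100, 150]
p95 = percentile(latencies_ms, 95)
print(p95)  # an SLO check would compare this against the 200 ms budget
```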


What makes you a great fit?


Must-Haves:

  • 2+ yrs Python back-end experience (FastAPI)
  • Strong with Docker & container orchestration
  • Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
  • SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals

Nice-to-Haves

  • Kubernetes (k8s) at scale, Terraform
  • Experience with AI/ML inference services (LLMs, vector DBs)
  • Go / Rust for high-perf services
  • Observability: Prometheus, Grafana, OpenTelemetry

About Us: 


At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.

Meet the Founders: 


LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? 


At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.


VyTCDC
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
Python
AI/ML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.
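To illustrate the fit/predict workflow implied above without assuming Scikit-learn is installed, here is a toy nearest-centroid classifier. The labels and data are invented; in practice you would reach for `sklearn.neighbors.NearestCentroid` or a deep-learning framework.

```python
# Tiny nearest-centroid classifier in pure Python, illustrating the
# classification workflow (fit on labelled data, then predict).

def fit(X: list[list[float]], y: list[str]) -> dict:
    """Compute one centroid (mean vector) per class label."""
    sums, counts = {}, {}
    for features, label in zip(X, y):
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def predict(centroids: dict, x: list[float]) -> str:
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

model = fit([[0, 0], [1, 1], [9, 9], [10, 10]], ["low", "low", "high", "high"])
print(predict(model, [8, 8]))  # → high
```

Serving such a model behind Flask or FastAPI, as the responsibilities above describe, then reduces to calling `predict` inside a request handler.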

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


Wissen Technology

Posted by Rutuja Patil
Mumbai
4 - 10 yrs
Best in industry
Java
J2EE
Hibernate (Java)
Spring Boot
Spring MVC

Company Name – Wissen Technology

Group of companies in India – Wissen Technology & Wissen Infotech

Role – Senior Backend Developer – Java (with Python Exposure)

Work Location – Mumbai


Experience - 4 to 10 years


Kindly revert over mail if you are interested.


Java Developer – Job Description


We are seeking a Senior Backend Developer with strong expertise in Java (Spring Boot) and working knowledge of Python. In this role, Java will be your primary development language, with Python used for scripting, automation, or selected service modules. You’ll be part of a collaborative backend team building scalable and high-performance systems.


Key Responsibilities


  • Design and develop robust backend services and APIs primarily using Java (Spring Boot)
  • Contribute to Python-based components where needed for automation, scripting, or lightweight services
  • Build, integrate, and optimize RESTful APIs and microservices
  • Work with relational and NoSQL databases
  • Write unit and integration tests (JUnit, PyTest)
  • Collaborate closely with DevOps, QA, and product teams
  • Participate in architecture reviews and design discussions
  • Help maintain code quality, organization, and automation
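As a sketch of the "Python for scripting and automation" part of this role, a tiny log-summarizing helper. The log format is invented for illustration.

```python
# Summarize log levels in a service log: the kind of small automation
# script a Java-primary backend team might keep in Python.
from collections import Counter

def count_levels(log_lines: list[str]) -> Counter:
    """Tally log levels from lines like '2024-01-01 ERROR something broke'."""
    levels = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) >= 2 and parts[1] in {"INFO", "WARN", "ERROR"}:
            levels[parts[1]] += 1
    return levels

sample = [
    "2024-01-01 INFO started",
    "2024-01-01 ERROR db timeout",
    "2024-01-01 ERROR retry failed",
]
print(count_levels(sample))  # Counter({'ERROR': 2, 'INFO': 1})
```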


Required Skills & Qualifications

  • 4 to 10 years of hands-on Java development experience
  • Strong experience with Spring Boot, JPA/Hibernate, and REST APIs
  • At least 1–2 years of hands-on experience with Python (e.g., for scripting, automation, or small services)
  • Familiarity with Python frameworks like Flask or FastAPI is a plus
  • Experience with SQL/NoSQL databases (e.g., PostgreSQL, MongoDB)
  • Good understanding of OOP, design patterns, and software engineering best practices
  • Familiarity with Docker, Git, and CI/CD pipelines


Tata Consultancy Services
Agency job
via Risk Resources LLP (Hyderabad) by Jhansi Padiy
Anywhere in India, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

  • 6-7 years of IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise scale and Snowflake implementations on any of the hyperscalers
  • Strong experience with building productionized data ingestion and data pipelines in Snowflake
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
  • Good experience with Snowflake RBAC and data security
  • Strong experience with Snowflake features, including new Snowflake features
  • Good experience in Python/PySpark
  • Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Experience/knowledge of orchestration and scheduling tools like Airflow
  • Good understanding of ETL/ELT processes and ETL tools

Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

  • 5+ years of IT development experience, with a minimum of 3+ years of hands-on experience in Snowflake
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise scale and Snowflake implementations on any of the hyperscalers
  • Strong experience with building productionized data ingestion and data pipelines in Snowflake
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
  • Good experience with Snowflake RBAC and data security
  • Strong experience with Snowflake features, including new Snowflake features
  • Good experience in Python/PySpark
  • Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Experience/knowledge of orchestration and scheduling tools like Airflow
  • Good understanding of ETL/ELT processes and ETL tools

GoQuest Media Ventures Pvt Ltd
Mumbai
1 - 5 yrs
₹8L - ₹10L / yr
MERN Stack
Fullstack Developer
Python
Mobile App Development
Web Development

ROLES AND RESPONSIBILITIES


As a Full Stack Developer at GoQuest Media, you will play a key role in building and maintaining web applications that deliver seamless user experiences for our global clients. From brainstorming features with the team to executing back-end logic, you will be involved in every aspect of our application development process.

You will be working with modern technologies like NodeJS, ReactJS, NextJS, and Tailwind CSS to create performant, scalable applications. Your role will span both front-end and back-end development as you build efficient and dynamic solutions to meet the company’s and users’ needs.


What will you be accountable for?

● End-to-End Development: Design and develop highly scalable and interactive web applications from scratch, taking ownership of both front-end (ReactJS, NextJS, Tailwind CSS) and back-end (NodeJS) development processes.

● Feature Implementation: Work closely with designers and product managers to translate ideas into highly interactive and responsive interfaces.

● Maintenance and Debugging: Ensure applications are optimized for performance, scalability, and reliability, and perform regular maintenance, debugging, and testing of existing apps to ensure they remain in top shape.

● Collaboration: Collaborate with cross-functional teams, including designers, product managers, and stakeholders, to deliver seamless and robust applications.

● Innovation: Stay updated with the latest trends and technologies to suggest and implement improvements in the development process.


Tech Stack

● Front-end: ReactJS, NextJS, Tailwind CSS
● Back-end: NodeJS, ExpressJS
● Database: MongoDB (preferred), MySQL
● Version Control: Git
● Tools: Webpack, Docker (optional but a plus)


Preferred Location

This role is based out of our Andheri office, Mumbai.


Growth Opportunities for You

● Lead exciting web application projects end-to-end and own key product initiatives.
● Develop cutting-edge apps used by leading media clients around the globe.
● Gain experience working in a high-growth company in the media and tech industry.
● Potential to grow into a team lead role.


Who Should Apply?

● Individuals with a passion for coding and web technologies.
● A minimum of 3-5 years of experience in full-stack development using NodeJS, ReactJS, NextJS, and Tailwind CSS.
● Strong understanding of both front-end and back-end development and the ability to write efficient, reusable, and scalable code.
● Familiarity with databases like MongoDB and MySQL.
● Experience with CI/CD pipelines and cloud infrastructure (AWS, Google Cloud) is a plus.
● Team players with excellent communication skills and the ability to work in a fast-paced environment.


Who Should Not Apply?

● If you're not comfortable with both front-end and back-end development.
● If you don’t enjoy problem-solving or tackling complex development challenges.
● If working in a dynamic, evolving environment doesn’t appeal to you.

Teknobuilt Solutions Pvt Ltd
Navi Mumbai, Delhi
3 - 6 yrs
₹8L - ₹10L / yr
Python
Selenium
Java
Agile/Scrum
QTP

Teknobuilt is an innovative construction technology company accelerating a digital and AI platform that helps with all aspects of program management and execution: workflow automation, collaborative manual tasks, and siloed systems. Our platform has received innovation awards and grants in Canada, UK and S. Korea and we are at the frontiers of solving key challenges in the built environment and digital health, safety and quality.

Teknobuilt's vision is helping the world build better- safely, smartly and sustainably. We are on a mission to modernize construction by bringing Digitally Integrated Project Execution System - PACE and expert services for midsize to large construction and infrastructure projects. PACE is an end-to-end digital solution that helps in Real Time Project Execution, Health and Safety, Quality and Field management for greater visibility and cost savings. PACE enables digital workflows, remote working, AI based analytics to bring speed, flow and surety in project delivery. Our platform has received recognition globally for innovation and we are experiencing a period of significant growth for our solutions.

 

Job Responsibilities

As a Quality Analyst Engineer, you will be expected to:

  • Thoroughly analyze project requirements, design specifications, and user stories to understand the scope and objectives.
  • Arrange, set up, and configure necessary test environments for effective test case execution.
  • Participate in and conduct review meetings to discuss test plans, test cases, and defect statuses.
  • Execute manual test cases with precision, analyze results, and identify deviations from expected behavior.
  • Accurately track, log, prioritize, and manage defects through their lifecycle, ensuring clear communication with developers until resolution.
  • Maintain continuous and clear communication with the Test Manager and development team regarding testing progress, roadblocks, and critical findings.
  • Develop, maintain, and manage comprehensive test documentation, including:
    o Detailed test plans
    o Well-structured test cases for various testing processes
    o Concise summary reports on test execution and defect status
    o Thorough test data preparation for test cases
    o "Lessons learned" documents based on testing inputs from previous projects
    o "Suggestion documents" aimed at improving overall software quality
    o Clearly defined test scenarios
  • Clearly report identified bugs to developers with precise steps to reproduce, expected results, and actual results, facilitating efficient defect resolution.

Wissen Technology

Posted by Monika Sekaran
Mumbai
2 - 8 yrs
Best in industry
Python
Data Structures
Algorithms
Django
Object Oriented Programming (OOPs)

Job Description:


• Experience in Python (backend only), data structures, OOP, algorithms, Django, NumPy, etc.

• Notice period/joining of not more than 30 days.

• Only premium institutes: Tier 1 and Tier 2.

• Hybrid mode of working.

• Good understanding of writing unit tests using PyTest.

• Good understanding of parsing XMLs and handling files using Python.

• Good understanding of databases/SQL, stored procedures, and query tuning.

• Service design concepts, OO and functional development concepts.

• Agile development methodologies.

• Strong oral and written communication skills.

• Excellent interpersonal skills and a professional approach.

Deqode

Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


hirezyai
Posted by Aardra Suresh
Bengaluru (Bangalore), Mumbai
7 - 14 yrs
₹15L - ₹30L / yr
Python
AWS Lambda
Docker
API
S3

We are seeking a highly skilled Python Backend Developer with strong experience in building microservices-based architectures and cloud-native serverless solutions on AWS. The ideal candidate will be responsible for designing, developing, and maintaining scalable backend systems, ensuring high performance and responsiveness to requests from front-end applications and third-party systems.

 

Key Responsibilities:

  • Design and develop robust backend services and RESTful APIs using Python (FastAPI, Flask, or Django)
  • Build and deploy microservices that are scalable, loosely coupled, and independently deployable
  • Develop and manage serverless applications using AWS Lambda, API Gateway, DynamoDB, S3, SNS, SQS, and Step Functions
  • Implement event-driven architectures and data processing pipelines
  • Collaborate with front-end developers, DevOps, and product teams to deliver high-quality software
  • Ensure code quality through unit testing, integration testing, and code reviews
  • Automate deployments using CI/CD pipelines and Infrastructure as Code (IaC) tools like CloudFormation or Terraform
  • Monitor, debug, and optimize backend systems for performance and scalability
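A hedged sketch of the serverless pattern described above: an AWS Lambda handler in the API Gateway proxy-integration shape. The route and payload fields are invented, and the handler is invoked locally here with a fake event, so no AWS account is needed.

```python
# Minimal AWS Lambda handler for an API Gateway proxy integration.
# The event/response shapes follow the proxy-integration convention:
# query parameters arrive under "queryStringParameters", and the response
# is a dict with statusCode/headers/body.
import json

def handler(event, context):
    """Greet the caller of a hypothetical GET /hello?name=... route."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fake event (no AWS needed for this sketch):
resp = handler({"queryStringParameters": {"name": "dev"}}, None)
print(resp["statusCode"], resp["body"])
```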

 

Required Skills & Experience:

  • 7+ years of backend development experience using Python
  • Strong experience in designing and implementing microservices
  • Hands-on experience with AWS Serverless services: Lambda, API Gateway, S3, DynamoDB, SQS, SNS, etc.
  • Proficient in RESTful API design, JSON, and OpenAPI/Swagger specifications
  • Experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
  • Knowledge of CI/CD tools (e.g., GitHub Actions, Jenkins, CodePipeline)
  • Familiarity with Docker and containerized deployments
  • Strong understanding of software design patterns, clean code practices, and Agile methodologies

 

Nice to Have:

  • Experience with GraphQL or gRPC
  • Exposure to monitoring/logging tools (e.g., CloudWatch, ELK, Prometheus)
  • Knowledge of security best practices in API and cloud development
  • Familiarity with data streaming using Kafka or Kinesis


Wissen Technology

Posted by VenkataRamanan S
Mumbai
4 - 8 yrs
₹15L - ₹25L / yr
Python
SQL
ETL

What We’re Looking For:

  • Strong experience in Python (3+ years).
  • Hands-on experience with any database (SQL or NoSQL).
  • Experience with frameworks like Flask, FastAPI, or Django.
  • Knowledge of ORMs, API development, and unit testing.
  • Familiarity with Git and Agile methodologies.
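A compact illustration of the database skills listed above, using the standard library's in-memory SQLite so the example is self-contained. The table and query are invented for illustration.

```python
# Query an in-memory SQLite database from Python. An ORM such as
# SQLAlchemy or the Django ORM would wrap this same pattern; sqlite3 keeps
# the sketch dependency-free.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany(
    "INSERT INTO orders (amount) VALUES (?)",  # bound parameters, not string
    [(10.0,), (25.5,), (4.5,)],                # formatting, to avoid SQL injection
)

def total_amount(conn: sqlite3.Connection) -> float:
    """Aggregate query returning the sum of all order amounts."""
    (total,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()
    return total

print(total_amount(conn))  # → 40.0
```

A unit test for `total_amount` would create its own `:memory:` connection, which is exactly the kind of fast, isolated test the role's unit-testing requirement points at.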


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
Python
SQL

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
hirezyai
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹12L - ₹25L / yr
ArgoCD
Kubernetes
Docker
Helm
Terraform

Job Summary:

We are seeking a skilled DevOps Engineer to design, implement, and manage CI/CD pipelines, containerized environments, and infrastructure automation. The ideal candidate should have hands-on experience with ArgoCD, Kubernetes, and Docker, along with a deep understanding of cloud platforms and deployment strategies.

Key Responsibilities:

  • CI/CD Implementation: Develop, maintain, and optimize CI/CD pipelines using ArgoCD, GitOps, and other automation tools.
  • Container Orchestration: Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
  • Infrastructure as Code (IaC): Automate infrastructure provisioning with Terraform, Helm, or Ansible.
  • Monitoring & Logging: Implement and maintain observability tools like Prometheus, Grafana, ELK, or Loki.
  • Security & Compliance: Ensure best security practices in containerized and cloud-native environments.
  • Cloud & Automation: Manage cloud infrastructure on AWS, Azure, or GCP with automated deployments.
  • Collaboration: Work closely with development teams to optimize deployments and performance.

Required Skills & Qualifications:

  • Experience: 5+ years in DevOps, Site Reliability Engineering (SRE), or Infrastructure Engineering.
  • Tools & Tech: Strong knowledge of ArgoCD, Kubernetes, Docker, Helm, Terraform, and CI/CD pipelines.
  • Cloud Platforms: Experience with AWS, GCP, or Azure.
  • Programming & Scripting: Proficiency in Python, Bash, or Go.
  • Version Control: Hands-on with Git and GitOps workflows.
  • Networking & Security: Knowledge of ingress controllers, service mesh (Istio/Linkerd), and container security best practices.

Nice to Have:

  • Experience with Kubernetes Operators, Kustomize, or FluxCD.
  • Exposure to serverless architectures and multi-cloud deployments.
  • Certifications in CKA, AWS DevOps, or similar.


Deqode

Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes
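
The transformation logic such a Glue job runs can be sketched framework-free as below; the field names and rules are illustrative only, and in Glue the same logic would be expressed with PySpark DataFrame operations:

```python
from datetime import datetime

def transform(record):
    """Cleanse one raw row: reject rows missing the key, normalise types."""
    if not record.get("order_id"):
        return None  # data-quality rule: no primary key, no load
    return {
        "order_id": str(record["order_id"]),
        "amount": round(float(record.get("amount", 0)), 2),
        "order_date": datetime.strptime(record["order_date"], "%Y-%m-%d").date().isoformat(),
    }

raw = [
    {"order_id": 101, "amount": "19.999", "order_date": "2024-05-01"},
    {"order_id": None, "amount": "5.00", "order_date": "2024-05-02"},  # rejected
]
clean = [row for row in (transform(r) for r in raw) if row is not None]
```

In a real pipeline the reject branch would be written to a quarantine location rather than silently dropped.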

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sruthy VS
Posted by Sruthy VS
Bengaluru (Bangalore), Mumbai
4 - 8 yrs
Best in industry
Snowflake schema
ETL
SQL
skill iconPython
  • Strong experience as a Snowflake cloud database developer.
  • Knowledge of Spark and Databricks is desirable.
  • Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
  • Familiarity with data lake technologies such as Snowflake.
  • Strong ETL and database design/modelling skills.
  • Experience creating data pipelines.
  • Strong SQL skills, debugging knowledge, and performance tuning experience.
  • Experience with Databricks/Azure is good to have.
  • Experience working with global teams and global application environments.
  • Strong understanding of SDLC methodologies, with a track record of high-quality deliverables and data quality, including detailed technical design documentation.


Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by susmitha o
Hyderabad, Mumbai, Kolkata, Pune, Chennai
4 - 10 yrs
₹7L - ₹20L / yr
skill iconMachine Learning (ML)
MLOps
skill iconPython
NumPy
  • Design and implement cloud solutions, build MLOps on Azure
  • Build CI/CD pipelines orchestration by GitLab CI, GitHub Actions, Circle CI, Airflow or similar tools
  • Data science model review, run the code refactoring and optimization, containerization, deployment, versioning, and monitoring of its quality
  • Data science models testing, validation and tests automation
  • Deployment of code and pipelines across environments
  • Model performance metrics
  • Service performance metrics
  • Communicate with a team of data scientists, data engineers and architect, document the processes
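
The model-performance-metrics responsibility often reduces to tracking a few numbers per deployment; a dependency-free sketch of the classification side (scikit-learn's metrics module provides the production equivalents):

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall}

m = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In an MLOps setup these values would be logged per model version and alerted on when they drift.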


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
databricks
SQL
skill iconPython

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • Bigquery
  • Experience with performance tuning and data governance.


Read more
Mumbai, Kolkata
4 - 10 yrs
₹7L - ₹25L / yr
skill iconPython
skill iconMachine Learning (ML)
skill iconFlask
Artificial Intelligence (AI)

  • 3+ years’ experience as a Python developer/designer with machine learning exposure
  • Understanding of performance improvement; able to write effective, scalable code
  • Experience with security and data-protection solutions
  • Expertise in at least one popular Python framework (Django, Flask, or Pyramid)
  • Knowledge of object-relational mapping (ORM)
  • Familiarity with front-end technologies (JavaScript, HTML5)

Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, PAN India
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
skill iconJava
skill iconPython
skill iconJavascript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security.

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium.

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.
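
In practice, the framework skeleton is plain Python even when Selenium drives the browser. A hypothetical data-driven test using the stdlib unittest runner — the login stub stands in for what would be a Selenium page-object call:

```python
import unittest

# In a real hybrid framework these rows would come from a CSV/Excel sheet.
LOGIN_CASES = [
    ("valid_user", "correct_pw", True),
    ("valid_user", "wrong_pw", False),
    ("", "correct_pw", False),
]

def attempt_login(user, password):
    """Stub for the page-object call the framework would make via Selenium."""
    return user == "valid_user" and password == "correct_pw"

class LoginTests(unittest.TestCase):
    def test_login_matrix(self):
        # subTest reports each data row as its own pass/fail.
        for user, pw, expected in LOGIN_CASES:
            with self.subTest(user=user):
                self.assertEqual(attempt_login(user, pw), expected)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(LoginTests)
)
```

The same shape works with pytest's `@pytest.mark.parametrize` or TestNG data providers in the Java world.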

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
skill iconAmazon Web Services (AWS)
Amazon Redshift
AWS Glue
skill iconPython
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
HaystackAnalytics
Careers Hr
Posted by Careers Hr
Navi Mumbai
0 - 5 yrs
₹3L - ₹8L / yr
skill iconPython
Algorithms
skill iconFlask
skill iconDjango
skill iconMongoDB

Position – Python Developer

Location – Navi Mumbai


Who are we

Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India in the BIRAC Showcase event in Delhi, 2022.


Objectives of this Role:

  • Design and implement efficient, scalable backend services using Python.
  • Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
  • Build APIs, services, and scripts to support data processing pipelines and front-end applications.
  • Automate recurring tasks and ensure robust integration with cloud services.
  • Maintain high standards of software quality and performance using clean coding principles and testing practices.
  • Collaborate within the team to upskill and unblock each other for faster and better outcomes.



Primary Skills – Python Development

  • Proficient in Python 3 and its ecosystem
  • Frameworks: Flask / Django / FastAPI
  • RESTful API development
  • Understanding of OOPs and SOLID design principles
  • Asynchronous programming (asyncio, aiohttp)
  • Experience with task queues (Celery, RQ)
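
Asynchronous programming comes up daily in this stack; a minimal asyncio sketch of concurrent I/O (in practice an aiohttp request would replace the simulated call):

```python
import asyncio

async def fetch(endpoint):
    # Stand-in for an aiohttp request; the sleep simulates network latency.
    await asyncio.sleep(0.01)
    return endpoint + ": ok"

async def main():
    # gather() runs the coroutines concurrently instead of sequentially.
    return await asyncio.gather(*(fetch(e) for e in ("/a", "/b", "/c")))

results = asyncio.run(main())
```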

Database & Storage

  • Relational Databases: PostgreSQL / MySQL
  • NoSQL: MongoDB / Redis / Cassandra
  • ORM Tools: SQLAlchemy / Django ORM

Testing & Automation

  • Unit Testing: PyTest / unittest
  • Automation tools: Ansible / Terraform (good to have)
  • CI/CD pipelines

DevOps & Cloud

  • Docker, Kubernetes (basic knowledge expected)
  • Cloud platforms: AWS / Azure / GCP
  • GIT and GitOps workflows
  • Familiarity with containerized deployment & serverless architecture

Bonus Skills

  • Data handling libraries: Pandas / NumPy
  • Experience with scripting: Bash / PowerShell
  • Functional programming concepts
  • Familiarity with front-end integration (REST API usage, JSON handling)

 Other Skills

  • Innovation and thought leadership
  • Interest in learning new tools, languages, workflows
  • Strong communication and collaboration skills
  • Basic understanding of UI/UX principles


To know more about us: https://haystackanalytics.in


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Mumbai, Chennai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconPython
PySpark
skill iconAmazon Web Services (AWS)
aws
Amazon Redshift
+1 more

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Cere Labs
Devesh Rajadhyax
Posted by Devesh Rajadhyax
Mumbai
2 - 4 yrs
₹6L - ₹9L / yr
skill iconPython
skill iconReact.js
skill iconSpring Boot

Job Title: Team Leader – Full Stack & GenAI Projects

Location: Mumbai, Work From Office

Reporting To: Project Manager

Experience: 2–3 years

Employment Type: Full-time

Job Summary

We are looking for a motivated and responsible Team Leader to manage the delivery of full stack development projects with a focus on Generative AI applications. You will lead a team of 3–5 developers, ensure high-quality deliverables, and collaborate closely with the project manager to meet deadlines and client expectations.

Key Responsibilities

  • Lead the design, development, and deployment of web-based software solutions using modern full stack technologies
  • Guide and mentor a team of 3–5 developers; assign tasks and monitor progress
  • Take ownership of project deliverables and ensure timely, quality outcomes
  • Collaborate with cross-functional teams including UI/UX, DevOps, and QA
  • Apply problem-solving skills to address technical challenges and design scalable solutions
  • Contribute to the development of GenAI-based modules and features
  • Ensure adherence to coding standards, version control, and agile practices

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or related field
  • 2–3 years of experience in full stack development (front-end + back-end)
  • Proficiency in one or more tech stacks (e.g., React/Angular + Node.js/Java/Python)
  • Solid understanding of databases, REST APIs, and version control (Git)
  • Strong problem-solving skills and ability to work independently
  • Excellent programming, debugging, and team collaboration skills
  • Exposure to Generative AI frameworks or APIs is a strong plus
  • Willingness to work from office full-time

Nice to Have

  • Experience in leading or mentoring small teams
  • Familiarity with cloud platforms (AWS, GCP, or Azure)
  • Knowledge of CI/CD practices and Agile methodologies

About us

Cere Labs is a Mumbai based company working in the field of Artificial Intelligence. It is a product company that utilizes the latest technologies such as Python, Redis, neo4j, MVC, Docker, Kubernetes to build its AI platform. Cere Labs’ clients are primarily from the Banking and Finance domain in India and US. The company has a great environment for its employees to learn and grow in technology.

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL
redshift

Profile: AWS Data Engineer

Mode- Hybrid

Experience- 5 to 7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
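
As one example of the data-quality responsibility above, a validation gate can be sketched in plain Python (the field names and rules here are hypothetical; in production such checks typically run inside the Glue/PySpark job or a framework like Great Expectations):

```python
# Each rule returns True when the value is acceptable.
RULES = {
    "customer_id": lambda v: v is not None and str(v).strip() != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    """Return (row_index, field) pairs for every failed check."""
    failures = []
    for i, row in enumerate(rows):
        for field, rule in RULES.items():
            if not rule(row.get(field)):
                failures.append((i, field))
    return failures

failures = validate([
    {"customer_id": "C1", "amount": 10.5},
    {"customer_id": "", "amount": -3},   # fails both rules
])
```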


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
Glue semantics
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
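
Monitoring and troubleshooting pipelines usually starts with making individual steps resilient to transient faults; a minimal retry-with-backoff sketch (the failing step here is simulated):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Retry a flaky pipeline step with exponential backoff (illustrative)."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise  # exhausted: surface the error to the scheduler
            time.sleep(base_delay * 2 ** (attempt - 1))

calls = {"n": 0}

def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient timeout")  # simulated failure
    return "loaded"

result = with_retries(flaky_load)
```

Orchestrators like Step Functions and Airflow offer the same behaviour declaratively via per-task retry policies.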

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Read more
Softlink Global Pvt. Ltd.
Mumbai
3 - 4 yrs
Best in industry
skill iconPython
Selenium

Company Overview:

Softlink Global is a leading global software provider for the freight forwarding, logistics, and supply chain industry. Our comprehensive product portfolio includes superior technology solutions for the strategic and operational aspects of the logistics & freight forwarding business. At present, Softlink caters to 5,000+ logistics & freight companies spread across 45+ countries. Our global operations are handled by 300+ highly experienced employees.


Company Website - https://softlinkglobal.com/


Role Overview:

Are you a testing ninja with a knack for Selenium Python Hybrid Frameworks? LogiBUILD is calling for an Automation Tester with 2–3 years of magic in test automation, Jenkins, GitHub, and all things QA! You’ll be the hero ensuring our software is rock-solid, crafting automated test scripts, building smart frameworks, and keeping everything running smooth with CI and version control. If “breaking things to make them unbreakable” sounds like your jam, we’ve got the perfect spot for you! 


Key Responsibilities:

  • Automation Framework Development: Design, develop, and maintain Selenium-based automated test scripts using Python, focusing on creating a hybrid automation framework to handle a variety of test scenarios.
  • Framework Optimization & Maintenance: Continuously optimize and refactor automation frameworks for performance, reliability, and maintainability. Provide regular updates and improvements to automation processes.
  • Test Automation & Execution: Execute automated tests for web applications, analyze results, and report defects, collaborating closely with QA engineers and developers for continuous improvement.
  • Version Control Management: Manage source code repositories using GitHub, including branching, merging, and maintaining proper version control processes for test scripts and frameworks.
  • Collaborative Work: Work closely with developers, QA, and other team members to ensure smooth collaboration between manual and automated testing efforts. Help in integrating automated tests into the overall SDLC/STLC.
  • Documentation: Document the test strategy, framework design, and test execution reports to ensure clear communication and knowledge sharing across the team.
  • Test Automation Knowledge: Experience in test automation for web-based applications, including functional, regression, and integration tests.
  • Debugging & Troubleshooting: Strong problem-solving skills to debug and troubleshoot issues in automation scripts, Jenkins pipelines, and test environments.


Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 2-3 years of experience in similar role, with hands on experience of mentions tools & frameworks.
  • Certifications in Selenium, Python, or automation testing.
  • Familiarity with Agile or Scrum methodologies.
  • Excellent problem-solving and communication skills.


Read more
Banking Client

Banking Client

Agency job
via Rapidsoft Technologies by Sarita Jena
Navi Mumbai
12 - 15 yrs
₹25L - ₹30L / yr
skill iconKubernetes
Ansible
Terraform
IaC
skill icongrafana
+7 more

Sr. Devops Engineer – 12+ Years of Experience

 

Key Responsibilities:

  • Design, implement, and manage CI/CD pipelines for seamless deployments.
  • Optimize cloud infrastructure (AWS, Azure, GCP) for high availability and scalability.
  • Manage and automate infrastructure using Terraform, Ansible, or CloudFormation.
  • Deploy and maintain Kubernetes, Docker, and container orchestration tools.
  • Ensure security best practices in infrastructure and deployments.
  • Implement logging, monitoring, and alerting solutions (Prometheus, Grafana, ELK, Datadog).
  • Troubleshoot and resolve system and network issues efficiently.
  • Collaborate with development, QA, and security teams to streamline DevOps processes.

Required Skills:

  • Strong expertise in CI/CD tools (Jenkins, GitLab CI/CD, ArgoCD).
  • Hands-on experience with cloud platforms (AWS, GCP, or Azure).
  • Proficiency in Infrastructure as Code (IaC) tools (Terraform, Ansible).
  • Experience with containerization and orchestration (Docker, Kubernetes).
  • Knowledge of networking, security, and monitoring tools.
  • Proficiency in scripting languages (Python, Bash, Go).
  • Strong troubleshooting and performance tuning skills.

Preferred Qualifications:

  • Certifications in AWS, Kubernetes, or DevOps.
  • Experience with service mesh, GitOps, and DevSecOps.

Read more
WeAssemble
Mumbai
2 - 7 yrs
₹3L - ₹720L / yr
skill iconPython


Junior Python Developer – Web Scraping

Mumbai, Maharashtra

Work Type: Full Time


We’re looking for a Junior Python Developer who is passionate about web scraping and data extraction. If you love automating the web, navigating anti-bot mechanisms, and writing clean, efficient code, this role is for you!


Key Responsibilities:

  • Design and build robust web scraping scripts using Python.
  • Work with tools like Selenium, BeautifulSoup, Scrapy, and Playwright.
  • Handle challenges like dynamic content, captchas, IP blocking, and rate limiting.
  • Ensure data accuracy, structure, and cleanliness during extraction.
  • Optimize scraping scripts for performance and scale.
  • Collaborate with the team to align scraping outputs with project goals.
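
To illustrate the kind of extraction involved — a minimal, dependency-free sketch using the stdlib HTMLParser; in practice BeautifulSoup or Scrapy selectors do this more conveniently (the HTML sample is made up):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered in the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<html><body><a href="/jobs">Jobs</a> <a href="/about">About</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
```

With BeautifulSoup the same extraction is roughly `[a["href"] for a in soup.find_all("a", href=True)]`.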


Requirements:

  • 6 months to 2 years of experience in web scraping using Python.
  • Hands-on with requests, Selenium, BeautifulSoup, Scrapy, etc.
  • Strong understanding of HTML, DOM, and browser behavior.
  • Good coding practices and ability to write clean, maintainable code.
  • Strong communication skills and ability to explain scraping strategies clearly.
  • Based in Mumbai and ready to join immediately.


Nice to Have:

  • Familiarity with headless browsers, proxy handling, and rotating user agents.
  • Experience storing scraped data in JSON, CSV, or databases.
  • Understanding of anti-bot protection techniques and how to bypass them.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Ammar Lokhandwala
Posted by Ammar Lokhandwala
Mumbai, Bengaluru (Bangalore)
3 - 12 yrs
Best in industry
skill iconPython
Large Language Models (LLM) tuning
Natural Language Processing (NLP)
Generative AI
skill iconMachine Learning (ML)
+1 more

We are looking for a Senior AI/ML Engineer with expertise in Generative AI (GenAI) integrations, APIs, and Machine Learning (ML) algorithms, and strong hands-on experience in Python and statistical and predictive modeling.


Key Responsibilities:

• Develop and integrate GenAI solutions using APIs and custom models.

• Design, implement, and optimize ML algorithms for predictive modeling and data-driven insights.

• Leverage statistical techniques to improve model accuracy and performance.

• Write clean, well-documented, and testable code while adhering to coding standards and best practices.


Required Skills:

• 4+ years of experience in AI/ML, with a strong focus on GenAI integrations and APIs.

• Proficiency in Python, including libraries like TensorFlow, PyTorch, Scikit-learn, and Pandas.

• Strong expertise in statistical modeling and ML algorithms (Regression, Classification, Clustering, NLP, etc.).

• Hands-on experience with RESTful APIs and AI model deployment.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Hanisha Pralayakaveri
Posted by Hanisha Pralayakaveri
Bengaluru (Bangalore), Mumbai
5 - 9 yrs
Best in industry
skill iconPython
skill iconAmazon Web Services (AWS)
PySpark
Data engineering

Job Description: Data Engineer 

Role Overview:

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities:

  • Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
  • Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
  • Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
  • Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
  • Ensure data quality and consistency by implementing validation and governance practices.
  • Work on data security best practices in compliance with organizational policies and regulations.
  • Automate repetitive data engineering tasks using Python scripts and frameworks.
  • Leverage CI/CD pipelines for deployment of data workflows on AWS.

Read more
TechMynd Consulting

at TechMynd Consulting

2 candid answers
Suraj N
Posted by Suraj N
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 8 yrs
₹10L - ₹24L / yr
skill iconData Science
skill iconPostgreSQL
skill iconPython
Apache
skill iconAmazon Web Services (AWS)
+5 more

Senior Data Engineer


Location: Bangalore, Gurugram (Hybrid)


Experience: 4-8 Years


Type: Full Time | Permanent


Job Summary:


We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.


This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.


Key Responsibilities:

PostgreSQL & Data Modeling

  • Design and optimize complex SQL queries, stored procedures, and indexes
  • Perform performance tuning and query plan analysis
  • Contribute to schema design and data normalization

Data Migration & Transformation

  • Migrate data from multiple sources to cloud or ODS platforms
  • Design schema mapping and implement transformation logic
  • Ensure consistency, integrity, and accuracy in migrated data

Python Scripting for Data Engineering

  • Build automation scripts for data ingestion, cleansing, and transformation
  • Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
  • Maintain reusable script modules for operational pipelines

Data Orchestration with Apache Airflow

  • Develop and manage DAGs for batch/stream workflows
  • Implement retries, task dependencies, notifications, and failure handling
  • Integrate Airflow with cloud services, data lakes, and data warehouses

Cloud Platforms (AWS / Azure / GCP)

  • Manage data storage (S3, GCS, Blob), compute services, and data pipelines
  • Set up permissions, IAM roles, encryption, and logging for security
  • Monitor and optimize cost and performance of cloud-based data operations

Data Marts & Analytics Layer

  • Design and manage data marts using dimensional models
  • Build star/snowflake schemas to support BI and self-serve analytics
  • Enable incremental load strategies and partitioning

Modern Data Stack Integration

  • Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
  • Support modular pipeline design and metadata-driven frameworks
  • Ensure high availability and scalability of the stack

BI & Reporting Tools (Power BI / Superset / Supertech)

  • Collaborate with BI teams to design datasets and optimize queries
  • Support development of dashboards and reporting layers
  • Manage access, data refreshes, and performance for BI tools

Required Skills & Qualifications:

  • 4–6 years of hands-on experience in data engineering roles
  • Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
  • Advanced Python scripting skills for automation and ETL
  • Proven experience with Apache Airflow (custom DAGs, error handling)
  • Solid understanding of cloud architecture (especially AWS)
  • Experience with data marts and dimensional data modeling
  • Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
  • Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
  • Version control (Git) and CI/CD pipeline knowledge is a plus
  • Excellent problem-solving and communication skills
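
The query-plan analysis called out under PostgreSQL & Data Modeling boils down to checking whether a predicate hits an index; the idea can be sketched self-contained with SQLite (PostgreSQL's EXPLAIN / EXPLAIN ANALYZE plays the same role):

```python
import sqlite3

# SQLite keeps the example dependency-free; the reasoning carries over to
# PostgreSQL, where EXPLAIN would show "Seq Scan" vs "Index Scan".
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [(f"c{i % 100}", i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM orders WHERE customer = 'c7'")  # full table scan
con.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = plan("SELECT * FROM orders WHERE customer = 'c7'")   # index search
```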

Read more
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad, Pune
4 - 10 yrs
₹10L - ₹24L / yr
skill iconJava
Artificial Intelligence (AI)
Automation
IDX
skill iconSpring Boot
+4 more

Job Title : Senior Backend Engineer – Java, AI & Automation

Experience : 4+ Years

Location : Any Cognizant location (India)

Work Mode : Hybrid

Interview Rounds :

  1. Virtual
  2. Face-to-Face (In-person)

Job Description :

Join our Backend Engineering team to design and maintain services on the Intuit Data Exchange (IDX) platform.

You'll work on scalable backend systems powering millions of daily transactions across Intuit products.


Key Qualifications :

  • 4+ years of backend development experience.
  • Strong in Java, Spring framework.
  • Experience with microservices, databases, and web applications.
  • Proficient in AWS and cloud-based systems.
  • Exposure to AI and automation tools (Workato preferred).
  • Python development experience.
  • Strong communication skills.
  • Comfortable with occasional US shift overlap.
Wama Technology

at Wama Technology

2 candid answers
HR Wama
Posted by HR Wama
Mumbai
5 - 7 yrs
₹13L - ₹14L / yr
React.js
NodeJS (Node.js)
Python
Amazon Web Services (AWS)

Job Title: Fullstack Developer

Experience Level: 5+ Years

Location: Borivali, Mumbai

About the Role:

We are seeking a talented and experienced Fullstack Developer to join our dynamic engineering team. The ideal candidate will have at least 5 years of hands-on experience in building scalable web applications using modern technologies. You will be responsible for developing and maintaining both front-end and back-end components, ensuring high performance and responsiveness to requests from the front-end.

Key Responsibilities:

  • Design, develop, test, and deploy scalable web applications using Node.js, React, and Python.
  • Build and maintain APIs and microservices that support high-volume traffic and data.
  • Develop front-end components and user interfaces using React.js.
  • Leverage AWS services for deploying and managing applications in a cloud environment.
  • Collaborate with cross-functional teams including UI/UX designers, product managers, and QA engineers.
  • Participate in code reviews and ensure adherence to best practices in software development.
  • Troubleshoot, debug and upgrade existing systems.
  • Continuously explore and implement new technologies to maximize development efficiency.

Required Skills & Qualifications:

  • 5+ years of experience in fullstack development.
  • Strong proficiency in Node.jsReact.js, and Python.
  • Hands-on experience with AWS (e.g., Lambda, EC2, S3, CloudFormation, RDS).
  • Solid understanding of RESTful APIs and web services.
  • Familiarity with DevOps practices and CI/CD pipelines is a plus.
  • Experience working with relational and NoSQL databases.
  • Proficient understanding of code versioning tools, such as Git.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork abilities.

Nice to Have:

  • Experience with serverless architecture.
  • Knowledge of TypeScript.
  • Exposure to containerization (Docker, Kubernetes).
  • Familiarity with agile development methodologies.


Ketto

at Ketto

1 video
3 recruiters
Sagar Ganatra
Posted by Sagar Ganatra
Mumbai
1 - 3 yrs
₹10L - ₹15L / yr
Tableau
PowerBI
SQL
Python
Dashboard
+5 more

About the company:


Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us to create a large-scale impact on a daily basis by taking our product to the next level



Role Overview:


Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.


Key Responsibilities


●  Data Strategy & Automation:

○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.

○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.


●  Data Analysis & Insight Generation:

○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.

○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.


●  Reporting & Quality Assurance:

○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.

○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.


●  Collaboration & Strategic Planning:

○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.

○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.

○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.


Required Skills and Qualifications


●  Technical Expertise:

○ Strong background in SQL, Statistics and Maths


●  Analytical & Strategic Mindset:

○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.

○ Experience with statistical analysis, advanced analytics


●  Communication & Collaboration:

○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.

○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.


●  Preferred Experience:

○ Proven experience in advanced analytics roles

○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.


Why Join Ketto?

At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!

ZeMoSo Technologies

at ZeMoSo Technologies

11 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in DataBricks and setting up and managing data pipelines, data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note that the salary bracket will vary according to the candidate's experience:

- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA

- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA

- Experience more than 8 yrs - Salary upto 40 LPA

Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Hyderabad, Indore, Jaipur, Kolkata
4 - 5 yrs
₹2L - ₹18L / yr
Python
PySpark

We are looking for skilled and passionate Data Engineers with a strong foundation in Python programming and hands-on experience working with APIs, AWS cloud, and modern development practices. The ideal candidate will have a keen interest in building scalable backend systems and working with big data tools like PySpark.

Key Responsibilities:

  • Write clean, scalable, and efficient Python code.
  • Work with Python frameworks such as PySpark for data processing.
  • Design, develop, update, and maintain APIs (RESTful).
  • Deploy and manage code using GitHub CI/CD pipelines.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Work on AWS cloud services for application deployment and infrastructure.
  • Basic database design and interaction with MySQL or DynamoDB.
  • Debugging and troubleshooting application issues and performance bottlenecks.
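As a hedged sketch of the kind of data processing described above: PySpark jobs typically follow a filter -> map -> aggregate shape, which can be illustrated without any dependencies (the field names below are invented):

```python
# Dependency-free sketch of the filter/map/aggregate shape a PySpark job
# would express with DataFrame transformations. Field names are made up.
from collections import defaultdict

events = [
    {"user": "a", "amount": 10, "ok": True},
    {"user": "b", "amount": 5,  "ok": False},
    {"user": "a", "amount": 7,  "ok": True},
]

totals = defaultdict(int)
for e in (e for e in events if e["ok"]):   # filter out bad events
    totals[e["user"]] += e["amount"]       # map to (user, amount), aggregate

print(dict(totals))  # {'a': 17}
```

In PySpark the same shape would be a `filter` followed by a `groupBy(...).agg(...)`, distributed across the cluster.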

Required Skills & Qualifications:

  • 4+ years of hands-on experience with Python development.
  • Proficient in Python basics with a strong problem-solving approach.
  • Experience with AWS Cloud services (EC2, Lambda, S3, etc.).
  • Good understanding of API development and integration.
  • Knowledge of GitHub and CI/CD workflows.
  • Experience in working with PySpark or similar big data frameworks.
  • Basic knowledge of MySQL or DynamoDB.
  • Excellent communication skills and a team-oriented mindset.

Nice to Have:

  • Experience in containerization (Docker/Kubernetes).
  • Familiarity with Agile/Scrum methodologies.


Texple Technologies

at Texple Technologies

1 recruiter
Prajakta Mhadgut
Posted by Prajakta Mhadgut
Mumbai
7 - 10 yrs
₹10L - ₹20L / yr
MERN Stack
AWS
Python

We are looking for a highly experienced and visionary Tech Lead / Solution Architect with deep expertise in the MERN stack and AWS to join our organization. In this role, you will be responsible for providing technical leadership across multiple projects, guiding architecture decisions, and ensuring scalable, maintainable, and high-quality solutions. You will work closely with cross-functional teams to define technical strategies, mentor developers, and drive the successful execution of complex projects. Your leadership, architectural insight, and hands-on development skills will be key to the team’s success and the organization's technological growth.


Responsibilities:

  • You will be responsible for all the technical decisions related to the project.
  • Lead and mentor a team of developers, providing technical guidance and expertise.
  • Collaborate with product managers, business analysts, and other stakeholders.
  • Architect and design technical solutions that align with project goals and industry best practices.
  • Develop and maintain scalable, reliable, and efficient software applications.
  • Conduct code reviews, ensure code quality, and enforce coding standards.
  • Identify technical risks and challenges, and propose solutions to mitigate them.
  • Stay updated with emerging technologies and trends in software development.
  • Collaborate with cross-functional teams to ensure seamless integration of software components.

Requirements:

  • Bachelor's degree / Graduate
  • Proven experience 7-10 years as a Technical Lead or similar role in software development (start-up experience preferred)
  • Strong technical skills in the MERN stack and Python, with databases such as PostgreSQL and MySQL.
  • Knowledge of cloud technologies (e.g., AWS, Azure, Google Cloud Platform) and microservices architecture.
  • Excellent leadership, communication, and interpersonal skills.
  • Ability to prioritize tasks, manage multiple projects, and work in a fast-paced environment.

Benefits:

  • Competitive salary and benefits package
  • Opportunities for professional growth and development
  • Collaborative and innovative work environment
  • Certifications on us


Joining : Immediate

Location : Malad (West) - Work From Office


This opportunity is for Work From Office.

Apply for this job only if your current location is Mumbai.

Nirmitee.io

at Nirmitee.io

4 recruiters
Gitashri K
Posted by Gitashri K
Pune, Mumbai
5 - 11 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Microservices
Python
Angular (2+)

Should have strong hands-on experience of 8-10 years in Java development.

Should have strong knowledge of Java 11+, Spring, Spring Boot, Hibernate, and REST web services.

Strong knowledge of J2EE design patterns and microservices design patterns.

Should have strong hands-on knowledge of SQL / Postgres databases. Exposure to NoSQL databases is good to have.

Should have strong knowledge of AWS services (Lambda, EC2, RDS, API Gateway, S3, CloudFront, Airflow).

Good to have Python and PySpark as a secondary skill.

Should have good knowledge of CI/CD pipelines.

Should be strong in writing unit test cases and debugging Sonar issues.

Should be able to lead/guide a team of junior developers.

Should be able to collaborate with BAs and solution architects to create HLD and LLD documents.

Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
Best in industry
Python
SQL
Databases
Data engineering
Amazon Web Services (AWS)

Job Description: Data Engineer 

Position Overview:

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.

· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).

· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.

· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.

· Ensure data quality and consistency by implementing validation and governance practices.

· Work on data security best practices in compliance with organizational policies and regulations.

· Automate repetitive data engineering tasks using Python scripts and frameworks.

· Leverage CI/CD pipelines for deployment of data workflows on AWS.
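One of the responsibilities above is ensuring data quality through validation; a minimal, dependency-free sketch of a validate-then-load gate might look like this (the required-field schema is hypothetical):

```python
# Sketch of a validate-then-load quality gate inside an ETL step.
# The schema (required fields and their types) is a hypothetical example.

REQUIRED = {"id": int, "email": str}

def validate(row):
    """A row passes if every required field is present with the right type."""
    return all(isinstance(row.get(k), t) for k, t in REQUIRED.items())

def transform_and_load(rows):
    good, bad = [], []
    for row in rows:
        (good if validate(row) else bad).append(row)
    # A real pipeline would write `good` to the warehouse and `bad` to a
    # quarantine location (e.g. an S3 prefix) for inspection.
    return good, bad

good, bad = transform_and_load([
    {"id": 1, "email": "x@example.com"},
    {"id": "oops", "email": None},
])
print(len(good), len(bad))  # 1 1
```

Routing failures to a quarantine location instead of dropping them keeps the load auditable.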

 

Required Skills and Qualifications

· Professional Experience: 5+ years of experience in data engineering or a related field.

· Programming: Strong proficiency in Python, with experience in libraries like pandas, pySpark, or boto3.

· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:

· AWS Glue for ETL/ELT.

· S3 for storage.

· Redshift or Athena for data warehousing and querying.

· Lambda for serverless compute.

· Kinesis or SNS/SQS for data streaming.

· IAM Roles for security.

· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.

· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.

· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.

· Version Control: Proficient with Git-based workflows.

· Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

· Knowledge of data modeling and data warehouse design principles.

· Experience with data visualization tools (e.g., Tableau, Power BI).

· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).

· Exposure to other programming languages like Scala or Java.

 

Education

· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

 

Why Join Us?

· Opportunity to work on cutting-edge AWS technologies.

· Collaborative and innovative work environment.

 

 

Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹8L - ₹13L / yr
Python
RESTful APIs
SQL
JIRA

Requirements:

  • Must have proficiency in Python
  • At least 3+ years of professional experience in software application development.
  • Good understanding of REST APIs and a solid experience in testing APIs.
  • Should have built APIs at some point and have practical knowledge of working with them
  • Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
  • Ability to develop applications for test automation
  • Should have worked in a distributed micro-service environment
  • Hands-on experience with Python packages for testing (preferably pytest).
  • Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
  • Proficiency in Git
  • Strong in writing SQL queries
  • Experience with tools like Jira or Asana (bug tracking), Confluence (wiki), and Jenkins (CI)
  • Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
  • Proven track record of ability to handle time-critical projects
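The pytest/mock bullets above can be sketched as follows; `get_user` and its endpoint are hypothetical stand-ins, and in a real suite the mocked response would live in a pytest fixture shared across micro-services:

```python
# Sketch of an API test with a mocked HTTP layer.
# `get_user` and its endpoint are invented stand-ins for a real client.
from unittest.mock import MagicMock, patch
import json
import urllib.request

def get_user(user_id):
    """Hypothetical client call; the endpoint never has to exist in tests."""
    with urllib.request.urlopen(f"https://api.example.com/users/{user_id}") as resp:
        return json.load(resp)

def test_get_user_parses_response():
    fake_resp = MagicMock()
    # The context manager yields an object whose read() returns the body.
    fake_resp.__enter__.return_value.read.return_value = b'{"id": 7, "name": "Ada"}'
    with patch("urllib.request.urlopen", return_value=fake_resp):
        assert get_user(7) == {"id": 7, "name": "Ada"}

test_get_user_parses_response()
print("ok")
```

With pytest, the `fake_resp` setup would move into a fixture so every service's tests reuse the same mock dataset.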


Good to have:

  • Good understanding of CI/CD
  • Knowledge of queues, especially Kafka
  • Ability to independently manage test environment deployments and handle issues around it
  • Performed load testing of API endpoints
  • Should have built an API test automation framework from scratch and maintained it
  • Knowledge of cloud platforms like AWS, Azure
  • Knowledge of different browsers and cross-platform operating systems
  • Knowledge of JavaScript
  • Knowledge of web programming, Docker, and 3-tier architecture is preferred.
  • Should have knowledge of API creation; coding experience would be an add-on.
  • 5+ years of experience in test automation using tools like TestNG, Selenium WebDriver (Grid, parallel execution, SauceLabs), and Mocha/Chai for front-end and back-end test automation
  • Bachelor's degree in Computer Science / IT / Computer Applications


Kreditventure

Kreditventure

Agency job
via Pluginlive by Harsha Saggi
Mumbai
7 - 9 yrs
₹20L - ₹25L / yr
Fullstack Developer
Java
Python
MERN Stack
SaaS
+4 more

Company: Kredit Venture

About the company:

KreditVenture is seeking a Technical Product Manager to lead the development, strategy, and execution of our SaaS applications built on top of Loan Origination Systems and Lending Platforms. This role requires a strong technical background, a product ownership mindset, and the ability to drive execution through both in-house teams and outsourced vendors. The ideal candidate will play a key role in aligning business goals with technical implementation, ensuring a scalable, secure, and user-centric platform.

Job Description

Job Title: Senior Manager / AVP / DVP – Technical Product Manager


Location: Mumbai (Ghatkopar West)


Compensation: Upto 25 LPA


Experience: 7-8 years (Designation will be based on experience)


Qualification: 

- Bachelor’s degree in Computer Science, Engineering, or a related field.

- An MBA is a plus.


 Roles and Responsibilities


Technology Leadership:


  • Lead SaaS Platform Development – Strong expertise in full-stack development (Java, Python, MERN stack) and cloud-based architectures.
  • API & Workflow Design – Drive microservices-based REST API development and implement business process automation.
  • Third-Party Integrations – Enable seamless API integrations with external service providers.
  • Code Quality & Best Practices – Ensure code quality, security, and performance optimization through structured audits.


Vendor & Delivery Management:


  • Outsourced Vendor Oversight – Manage and collaborate with external development partners, ensuring high-quality and timely delivery.
  • Delivery Governance – Define SLAs, monitor vendor performance, and proactively escalate risks.
  • Quality Assurance – Ensure vendor deliverables align with product standards and integrate smoothly with internal development.


Collaboration & Stakeholder Engagement:


  • Customer Insights & Feedback – Conduct user research and feedback sessions to enhance platform capabilities.
  • Product Demos & GTM Support – Showcase platform features to potential clients and support sales & business development initiatives.


Platform Development & Compliance:


  • Component Libraries & Workflow Automation – Develop reusable UI components and enable no-code/low-code business workflows.
  • Security & Compliance – Ensure adherence to data protection, authentication, and regulatory standards (e.g., GDPR, PCI-DSS).
  • Performance Monitoring & Analytics – Define KPIs and drive continuous performance optimization.
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
3 - 6 yrs
₹8L - ₹13L / yr
Amazon Web Services (AWS)
Terraform
Ansible
Docker
Apache Kafka
+6 more

Must be:

  • Based in Mumbai
  • Comfortable with Work from Office
  • Available to join immediately


Responsibilities:

  • Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
  • Work with Kubernetes, Docker, Lambdas to build reliable, scalable infrastructure.
  • Build tools and automation using Python, Go, or relevant scripting languages.
  • Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
  • Optimize for performance and low-latency in real-time systems using Kafka, gRPC, RTP.
  • Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
  • Load testing using Gatling, JMeter, and ensuring fault tolerance and high availability.
  • Collaborate with dev teams and participate in on-call rotations.
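Performance targets like the low-latency goals above are commonly tracked as percentiles; a minimal nearest-rank p95 computation over sampled request times (the sample values are illustrative):

```python
# Minimal p95 latency computation using the nearest-rank percentile method.
import math

def percentile(samples, pct):
    """Nearest-rank percentile: smallest sample >= pct% of all samples."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

latencies_ms = [12, 15, 11, 220, 14, 13, 16, 18, 12, 19]
print(percentile(latencies_ms, 95))  # 220
```

Monitoring stacks such as Prometheus compute the same idea from histograms rather than raw samples, but the nearest-rank definition is the simplest baseline to reason about.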


Requirements:

  • B.E./B.Tech in CS, Engineering or equivalent experience.
  • 3+ years in production infra and cloud-based systems.
  • Strong background in Linux (RHEL/CentOS) and shell scripting.
  • Experience managing hybrid infrastructure (cloud + on-prem).
  • Strong testing practices and code quality focus.
  • Experience leading teams is a plus.
Kanjurmarg, Mumbai
1 - 2 yrs
₹3L - ₹4L / yr
Embedded C
Raspberry Pi
Python
UART
3D modeling
+5 more

Roles and Responsibilities:

* Strong experience with programming microcontrollers like Arduino, ESP32, and ESP8266.

* Experience with Embedded C/C++.

* Experience with Raspberry Pi, Python, and OpenCV.

* Experience with low-power devices is preferred.

* Knowledge about communication protocols (UART, I2C, etc.)

* Experience with Wi-Fi, LoRa, GSM, M2M, SIMCom, and Quectel modules.

* Experience with 3d modeling (preferred).

* Experience with 3d printers (preferred).

* Experience with Hardware design and knowledge of basic electronics.

* Experience with software development is preferred.
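As a sketch of the protocol work mentioned above (UART, I2C), here is a tiny framed-packet encoder/decoder in Python; the frame layout (start byte, length, XOR checksum) is invented for illustration and is not any device's actual protocol:

```python
# Tiny UART-style frame: start byte, length, payload, XOR checksum.
# The layout is invented for illustration; real devices define their own.
START = 0xAA  # hypothetical start-of-frame marker

def encode(payload: bytes) -> bytes:
    """Build a frame for payloads up to 255 bytes."""
    chk = 0
    for b in payload:
        chk ^= b
    return bytes([START, len(payload)]) + payload + bytes([chk])

def decode(frame: bytes) -> bytes:
    """Verify framing and checksum, then return the payload."""
    assert frame[0] == START, "bad start byte"
    length = frame[1]
    payload, chk = frame[2:2 + length], frame[2 + length]
    calc = 0
    for b in payload:
        calc ^= b
    assert calc == chk, "checksum mismatch"
    return payload

frame = encode(b"\x01\x02\x03")
print(decode(frame))  # b'\x01\x02\x03'
```

The same encode/decode pair ports directly to C on the microcontroller side, which is why a host-side Python model is a convenient way to test the firmware's framing.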

Detailed job role (day-to-day tasks) of the IoT developer:


* Design hardware that meets the needs of the application.

* Support for current hardware, testing, and bug-fixing.

* Create, maintain, and document microcontroller code.

* Prototyping, testing, and soldering.

* Making 3D/CAD models for PCBs.

Daten  Wissen Pvt Ltd

at Daten Wissen Pvt Ltd

1 recruiter
Ashwini poojari
Posted by Ashwini poojari
Mumbai
1.5 - 2.5 yrs
₹3L - ₹7L / yr
Computer Vision
Image Processing
Deep Learning
C++
Python
+1 more

Artificial Intelligence Researcher


Job description 


This is a full-time on-site role for an Artificial Intelligence Researcher at Daten & Wissen in Mumbai. The researcher will be responsible for conducting cutting-edge research in areas such as Computer Vision, Natural Language Processing, Deep Learning, and Time Series Predictions. The role involves collaborating with industry partners, developing AI solutions, and contributing to the advancement of AI technologies.


Key Responsibilities:

  • Design, develop, and implement computer vision algorithms for object detection, tracking, recognition, segmentation, and activity analysis.
  • Train and fine-tune deep learning models (CNNs, RNNs, Transformers, etc.) for various video and image-based tasks.
  • Work with large-scale datasets and annotated video data to enhance model accuracy and robustness.
  • Optimize and deploy models to run efficiently on edge devices, cloud environments, and GPUs.
  • Collaborate with cross-functional teams including data scientists, backend engineers, and UI/UX designers.
  • Continuously explore new research, tools, and technologies to enhance our product capabilities.
  • Perform model evaluation, testing, and benchmarking for accuracy, speed, and reliability.
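Model evaluation for detection (the last bullet above) typically scores predicted boxes against ground truth by intersection-over-union; a minimal IoU for axis-aligned boxes given as (x1, y1, x2, y2):

```python
# Minimal intersection-over-union for axis-aligned boxes (x1, y1, x2, y2).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection top-left
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])  # intersection bottom-right
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7, i.e. ~0.1429
```

Benchmarks like COCO threshold this value (commonly at 0.5) to decide whether a prediction counts as a true positive.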


Required Skills:

  • Proficiency in Python and C++.
  • Experience with object detection models like YOLO, SSD, Faster R-CNN.
  • Strong understanding of classical computer vision techniques (OpenCV, image processing, etc.).
  • Expertise in Machine Learning, Pattern Recognition, and Statistics.
  • Experience with frameworks like TensorFlow, PyTorch, MXNet.
  • Strong understanding of Deep Learning and Video Analytics.
  • Experience with CUDA, Docker, Nvidia NGC Containers, and cloud platforms (AWS, Azure, GCP).
  • Familiar with Kubernetes, Kafka, and model optimization for Nvidia hardware (e.g., TensorRT).


Qualifications

  • 2+ years of hands-on experience in computer vision and deep learning.
  • Computer Science and Data Science skills
  • Expertise in Pattern Recognition
  • Strong background in Research and Statistics
  • Proficiency in Machine Learning algorithms
  • Experience with AI frameworks such as TensorFlow or PyTorch
  • Excellent problem-solving and analytical skills

Location: Mumbai (Bhayandar)



OIP Insurtech

at OIP Insurtech

2 candid answers
Katarina Vasic
Posted by Katarina Vasic
Remote only
4 - 12 yrs
₹30L - ₹50L / yr
Python
Natural Language Processing (NLP)
Data extraction
OCR
Computer Vision
+3 more

We’re looking for a skilled Senior Machine Learning Engineer to help us transform the Insurtech space. You’ll build intelligent agents and models that read, reason, and act.


Insurance ops are broken. Underwriters drown in PDFs. Risk clearance is chaos. Emails go in circles. We’ve lived it – and we’re fixing it. Bound AI is building agentic AI workflows that go beyond chat. We orchestrate intelligent agents to handle policy operations end-to-end:


• Risk clearance.

• SOV ingestion.

• Loss run summarization.

• Policy issuance.

• Risk triage.


No hallucinations. No handwaving. Just real-world AI that executes – in production, at scale.


Join us to help shape the future of insurance through advanced technology!


We’re Looking For:


  • Deep experience in GenAI, LLM fine-tuning, and multi-agent orchestration (LangChain, DSPy, or similar).
  • 5+ years of proven experience in the field
  • Strong ML/AI engineering background in both foundational modeling (NLP, transformers, RAG) and traditional ML.
  • Solid Python engineering chops – you write production-ready code, not just notebooks.
  • A startup mindset – curiosity, speed, and obsession with shipping things that matter.
  • Bonus – Experience with insurance or document intelligence (SOVs, Loss Runs, ACORDs).


What You’ll Be Doing:


  • Develop foundation-model-based pipelines to read and understand insurance documents.
  • Develop GenAI agents that handle real-time decision-making and workflow orchestration, and modular, composable agent architectures that interact with humans, APIs, and other agents.
  • Work on auto-adaptive workflows that optimize around data quality, context, and risk signals.



webcyper pvt ltd
Amol Surve
Posted by Amol Surve
Mumbai
0 - 1 yrs
₹2L - ₹3L / yr
Python
Django
React.js

At Webcyper Pvt Ltd, we are a growing technology company building innovative web solutions for our clients. We focus on delivering high-quality digital products, and we’re on a mission to scale our operations with talented, passionate individuals.


If you're a problem solver, love clean code, and are excited to work in a fast-paced startup environment — we want to hear from you!



Key Responsibilities:


Develop, test, and maintain high-quality web applications using Python and Django framework.

Work closely with frontend developers and designers to implement user-friendly interfaces.

Integrate third-party APIs and services.

Write clean, reusable, and efficient code.

Optimize applications for speed and scalability.

Troubleshoot, debug, and upgrade existing applications.

Participate in code reviews and technical discussions.

Stay up-to-date with emerging trends and technologies in backend development.

OMP India
Srishti Soni
Posted by Srishti Soni
Mumbai
6 - 12 yrs
₹15L - ₹25L / yr
Kubernetes
Docker
Microsoft Windows Azure
Terraform
Ansible
+1 more

Your challenge

As a DevOps Engineer, you’re responsible for automating the deployment of our software solutions. You interact with software engineers, functional product managers, and ICT professionals daily. Using your technical skills, you provide internal tooling for development and QA teams around the globe.

We believe in an integrated approach, where every team member is involved in all steps of the software development life cycle: analysis, architectural design, programming, and maintenance. We expect you to be the proud owner of your work and take responsibility for it.

Together with a tight-knit group of 5-6 team players, you develop, maintain and support key elements of our infrastructure:

  • Continuous integration and production systems
  • Release and build management
  • Package management
  • Containerization and orchestration

 

Your team

As our new DevOps Engineer, you’ll be part of a large, fast-growing, international team located in Belgium (Antwerp, Ghent, Wavre), Spain (Barcelona), Ukraine (Lviv), and the US (Atlanta). Software Development creates leading software solutions that make a difference to our customers. We make smart, robust, and scalable software to solve complex supply chain planning challenges.

Your profile

We are looking for someone who meets the following qualifications:

  • A bachelor’s or master’s degree in a field related to Computer Science.
  • Pride in developing high-quality solutions and taking responsibility for their maintenance.
  • Minimum 6 years' experience in a similar role
  • Good knowledge of the following technologies: Kubernetes, PowerShell or bash scripting, Jenkins, Azure Pipelines or similar automation systems, Git.
  • Familiarity with the Cloud–Native Landscape. Terraform, Ansible, and Helm are tools we use daily.
  • Supportive towards users.


Bonus points if you have:

  • A background in DevOps, ICT, or technical support.
  • Customer support experience or other relevant work experience, including internships.
  • Understanding of Windows networks and Active Directory.
  • Experience with transferring applications into the cloud.
  • Programming skills.


Soft skills

Teamwork

Pragmatic attitude

Passionate

Analytical thinker

Tech Savvy

Fast Learner


Hard skills

Kubernetes 

CI/CD

Git 

PowerShell


Your future

At OMP, we’re eager to find your best career fit. Our talent management program supports your personal development and empowers you to build a career in line with your ambitions.


Many of our team members who start as DevOps Engineers grow into roles in DevOps/Cloud architecture, project management, or people management.
