Python Jobs in Bangalore (Bengaluru)


Apply to 50+ Python Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

VDart

Agency job
via VDart by Don Blessing
Hyderabad, Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹12L / yr
Python
RESTful APIs
Software troubleshooting

Role: Python Developer

Location: Hyderabad, Bengaluru

Experience: 6+ years

 

Skills needed:

Python developer with 5+ years of experience in designing, developing, and maintaining scalable applications, with a strong focus on API integration. Must demonstrate proficiency in RESTful API consumption, third-party service integration, and troubleshooting API-related issues.
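
As a rough illustration of the defensive REST consumption this posting describes, here is a minimal Python sketch using the requests library; the endpoint, bearer-token handling, and retry policy are illustrative assumptions, not part of the posting.

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session() -> requests.Session:
    """Session that retries transient failures (429/5xx) with exponential backoff."""
    retry = Retry(total=3, backoff_factor=0.5,
                  status_forcelist=[429, 500, 502, 503, 504])
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    return session

def fetch_orders(session: requests.Session, base_url: str, token: str) -> list:
    # Endpoint path and auth scheme are hypothetical placeholders.
    resp = session.get(f"{base_url}/orders",
                       headers={"Authorization": f"Bearer {token}"},
                       timeout=10)
    resp.raise_for_status()  # surface HTTP errors instead of failing silently
    return resp.json()
```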

Wissen Technology
Posted by Chetna Jain
Bengaluru (Bangalore), Pune, Mumbai, Chennai
2 - 6 yrs
Best in industry
Robotic process automation (RPA)
Automation Anywhere
Python
Copilot

We are looking for a skilled Automation Anywhere Engineer with a strong background in RPA development, Python scripting, and experience with CoPilot integrations. The ideal candidate will play a key role in designing, developing, and implementing automation solutions to streamline business processes and improve operational efficiency.


Required Skills:

  • 2–6 years of hands-on experience in Automation Anywhere (A2019 or higher).
  • Strong programming skills in Python for automation and integration.
  • Good understanding of RPA concepts, lifecycle, and best practices.
  • Experience working with CoPilot (Microsoft Power Platform/AI CoPilot or equivalent).
  • Knowledge of API integration and web services (REST/SOAP).
  • Familiarity with process analysis and design techniques.
  • Ability to write clean, reusable, and well-documented code.
  • Strong problem-solving and communication skills.


Pace Wisdom Solutions
Bengaluru (Bangalore), Mangalore
2 - 6 yrs
₹6L - ₹18L / yr
Python
Django
Flask
FastAPI
RESTful APIs

Location: Bangalore/ Mangalore

Experience required: 2-6 years.

Key skills: Python, Django, Flask, FastAPI


We are seeking a skilled Python Developer with 2–6 years of experience who can contribute as an individual performer while also supporting technical decision-making and mentoring junior developers. The role involves designing and building scalable backend systems using Django/Flask, FastAPI, and collaborating closely with cross-functional teams to deliver high-quality software solutions.
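
For flavour, here is a minimal FastAPI sketch of the kind of backend endpoint work described above; the Item model and in-memory store are illustrative assumptions rather than anything from the posting.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

_DB: dict = {}  # in-memory stand-in for a real database

@app.post("/items/{item_id}", status_code=201)
def create_item(item_id: int, item: Item) -> Item:
    if item_id in _DB:
        raise HTTPException(status_code=409, detail="item already exists")
    _DB[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _DB:
        raise HTTPException(status_code=404, detail="item not found")
    return _DB[item_id]
```

Run with `uvicorn app:app --reload`, assuming the file is named app.py.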


Responsibilities:

• Develop robust, scalable, and efficient backend applications using Python (Django/Flask, FastAPI).

• Build and maintain RESTful APIs that are secure, performant, and easy to integrate.

• Collaborate with cross-functional teams to deliver seamless and impactful software solutions.

• Participate actively in all phases of the software development life cycle: requirements gathering, design, development, testing, deployment, and maintenance.

• Write clean, maintainable, and well-documented code that meets industry best practices.

• Troubleshoot, debug, and optimize existing systems for performance and scalability.

• Contribute ideas for continuous improvement in development processes and team culture.


Requirements:

• 2–6 years of hands-on development experience in Python, with proficiency in frameworks like Django/Flask, FastAPI.

• Strong understanding of OOP concepts, design principles, and design patterns.

• Solid experience working with databases.

• Good knowledge of designing and consuming RESTful APIs.

• Comfortable working with version control systems like Git and collaborating in code reviews.

• Exposure to cloud platforms (AWS, Azure, or GCP) is an added advantage.

• Familiarity with Docker and containerized application development is a plus.

• Understanding of CI/CD pipelines is desirable.

• Analytical mindset with strong problem-solving skills.


About the Company:

Pace Wisdom Solutions is a deep-tech product engineering and consulting firm. We have offices in San Francisco, Bengaluru, and Singapore. We specialize in designing and developing bespoke software solutions that cater to solving niche business problems.


We engage with our clients at various stages:

• Right from the idea stage to scope out business requirements.

• Design & architect the right solution and define tangible milestones.

• Set up dedicated and on-demand tech teams for agile delivery.

• Take accountability for successful deployments to ensure efficient go-to-market implementations.


Pace Wisdom has been working with Fortune 500 enterprises and growth-stage startups/SMEs since 2012. We also work as an extended tech team, and at times we have played the role of a virtual CTO too. We believe in building lasting relationships, providing value every time, and going beyond business.

Deqode
Posted by Sneha Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
3.5 - 9 yrs
₹3L - ₹13L / yr
Python
Amazon Web Services (AWS)
AWS Lambda
Django
Amazon S3

Job Summary:

We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.
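
A minimal sketch of the kind of serverless handler this role describes, assuming an API Gateway proxy event and an illustrative S3 bucket name; the actual services and payloads would differ per project.

```python
import json
import boto3

s3 = boto3.client("s3")  # created once, reused across warm invocations

def handler(event, context):
    # Assumes an API Gateway proxy integration event; bucket name is a placeholder.
    key = event["pathParameters"]["key"]
    obj = s3.get_object(Bucket="example-data-bucket", Key=key)
    body = obj["Body"].read().decode("utf-8")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"key": key, "size": len(body)}),
    }
```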

Key Responsibilities:

  • Develop and maintain backend applications using Python and frameworks like Django or Flask
  • Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
  • Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
  • Write clean, efficient, and testable code following best practices
  • Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
  • Monitor and optimize system performance and troubleshoot production issues
  • Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
  • Maintain and improve application security and compliance with industry standards

Required Skills:

  • Strong programming skills in Python
  • Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
  • Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
  • Good understanding of RESTful API design and microservices architecture
  • Hands-on experience with CI/CD, Git, and version control systems
  • Familiarity with containerization (Docker, ECS, or EKS) is a plus
  • Strong problem-solving and communication skills

Preferred Qualifications:

  • Experience with PySpark, Pandas, or data engineering tools
  • Working knowledge of Django, Flask, or other Python frameworks
  • AWS Certification (e.g., AWS Certified Developer – Associate) is a plus

Educational Qualification:

  • Bachelor's or Master’s degree in Computer Science, Engineering, or related field


AI Startup company

Agency job
via People Impact by Ranjita Shrivastava
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹15L / yr
YOLOv7
Python
Computer Vision
OpenCV
TensorFlow

Role Overview

We are seeking a highly skilled and experienced Senior AI Engineer with deep expertise in computer vision and architectural design. The ideal candidate will lead the development of robust, scalable AI systems, drive architectural decisions, and contribute significantly to the deployment of real-time video analytics, multi-model systems, and intelligent automation solutions.


Key Responsibilities

• Design and lead the architecture of complex AI systems in the domain of computer vision and real-time inference.

• Build and deploy models for object detection, image segmentation, classification, and tracking.

• Mentor and guide junior engineers on deep learning best practices and scalable software engineering.

• Drive end-to-end ML pipelines: from data ingestion and augmentation to training, deployment, and monitoring.

• Work with YOLO-based and transformer-based models for industrial use-cases.

• Lead integration of AI systems into production with hardware, backend, and DevOps teams.

• Develop automated benchmarking, annotation, and evaluation tools.

• Ensure maintainability, scalability, and reproducibility of models through version control, CI/CD, and containerization.


Required Skills

• Advanced proficiency in Python and deep learning frameworks (PyTorch, TensorFlow).

• Strong experience with YOLO, segmentation networks (UNet, Mask R-CNN), and tracking (Deep SORT); see the sketch after this list.

• Sound understanding of real-time video analytics and inference optimization.

• Hands-on experience designing model pipelines using Docker, Git, MLflow, or similar tools.

• Familiarity with OpenCV, NumPy, and image processing techniques.

• Proficiency in deploying models on Linux systems with GPU or edge devices (Jetson, Coral).
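
A minimal sketch of the real-time detection loop this role centres on, assuming the ultralytics package and a webcam source; the team may equally drive a YOLOv7 checkpoint directly, so treat the model file and video source as placeholders.

```python
import cv2
from ultralytics import YOLO  # assumption: a YOLOv7 repo could be used instead

model = YOLO("yolov8n.pt")   # model weights are illustrative
cap = cv2.VideoCapture(0)    # or an RTSP stream for real-time analytics

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)
    for box in results[0].boxes:               # draw each detection
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```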


Good to Have

• Experience with multi-model orchestration, streaming inference (DeepStream), or virtual camera inputs.

• Exposure to production-level MLOps practices.

• Knowledge of cloud-based deployment on AWS, GCP, or DigitalOcean.

• Familiarity with synthetic data generation, augmentation libraries, and 3D modeling tools.

• Publications, patents, or open-source contributions in the AI/ML space.

Qualifications

• B.E./B.Tech/M.Tech in Computer Science, Electrical Engineering, or a related field.

• 4+ years of proven experience in AI/ML with a focus on computer vision and system-level design.

• Strong portfolio or demonstrable projects in production environments.

Deqode
Posted by Alisha Das
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹10L - ₹25L / yr
Microsoft Windows Azure
Data engineering
Python
Apache Kafka

Role Overview:

We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.

The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
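
A minimal sketch of the kind of Python Kafka consumer this role involves, using the kafka-python client; the topic name, group id, and processing step are illustrative assumptions.

```python
import json
from kafka import KafkaConsumer  # kafka-python; confluent-kafka is a common alternative

consumer = KafkaConsumer(
    "patient-events",                      # topic name is illustrative
    bootstrap_servers="localhost:9092",
    group_id="healthcare-etl",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,              # commit only after successful processing
)

for message in consumer:
    record = message.value
    # ... validate / de-identify PHI here, then write downstream ...
    consumer.commit()  # explicit commit gives at-least-once semantics
```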

Key Responsibilities:

  • Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
  • Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
  • Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
  • Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
  • Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
  • Mentor junior engineers, perform code reviews, and promote engineering best practices.
  • Stay current with evolving technologies in cloud, big data, and healthcare data standards.
  • Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).

Required Skills & Qualifications:

  • 4+ years of hands-on experience in data engineering roles.
  • Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
  • Proficient in Python for data processing and automation.
  • Experience with Azure Databricks (or readiness to ramp up quickly).
  • Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
  • Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
  • Familiarity with containerization tools like Docker and orchestration using Kubernetes.
  • Exposure to CI/CD pipelines for data applications.
  • Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
  • Excellent problem-solving abilities and a proactive mindset.
  • Strong communication and interpersonal skills to work in cross-functional teams.


Fisdom
Posted by Subash P
Bengaluru (Bangalore)
2 - 6 yrs
₹1L - ₹1L / yr
Python
SQL

Background


Fisdom is a leading digital wealth management platform. The Fisdom platform (mobile and web apps) gives consumers access to a wide bouquet of financial solutions – investments, savings, and protection (and many more in the pipeline). Fisdom blends cutting-edge technology with conventional financial wisdom, awesome UX, and friendly customer service to make financial products simpler and more accessible to millions of Indians. We are growing and constantly looking for high performers to participate in our growth story. We have recently been certified as a Great Place to Work. For more info, visit www.fisdom.com.


Objectives of this Role


Improve, execute, and effectively communicate significant analyses that identify opportunities across the business

Participate in meetings with management, assessing and addressing issues and identifying improvements to implement in operations

Provide strong and timely financial and business analytic decision support to various organizational stakeholders


Responsibilities


Interpret data, analyze results using analytics, research methodologies, and statistical techniques

Develop and implement data analyses and data collection strategies that optimize statistical efficiency and quality

Prepare and summarize weekly, monthly, and periodic results for use by various key stakeholders

Conduct the full lifecycle of analytics projects, from requirements documentation to design and execution, including pulling, manipulating, and exporting data

Evaluate key performance indicators, provide ongoing reports, and recommend business plan updates


Skills and Qualification


Bachelor’s degree, preferably in computer science, mathematics, or economics

Advanced analytical skills with experience collecting, organizing, analyzing, and disseminating information with accuracy

The ability to present findings in a polished way

Proficiency with statistics and dataset analytics (using SQL, Python, Excel)

Entrepreneurial mindset, with an innovative approach to business planning

Relevant industry experience of 2–6 years; more than 4 years of Python experience is a must


Preferable: product startups, fintech


Why join us and where?


We have an excellent work culture and an opportunity to be a part of a growing organization with immense learning possibilities. You have an opportunity to build a brand from scratch. All of this, along with top-of-the-line remuneration and challenging work. You will be based out of Bangalore.

Vahan.ai
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3+ yrs
Up to ₹28L / yr (varies)
DevOps
Amazon Web Services (AWS)
CI/CD
Linux/Unix
Linux administration

About Us:

At Vahan, we are building the first AI-powered recruitment marketplace for India’s 300-million-strong blue-collar workforce, opening doors to economic opportunities and brighter futures.

Already India’s largest recruitment platform, Vahan is supported by marquee investors like Khosla Ventures, Y Combinator, Airtel, Vijay Shekhar Sharma (CEO, Paytm), and leading executives from Google and Facebook.

Our customers include names like Swiggy, Zomato, Rapido, Zepto, and many more. We leverage cutting-edge technology and AI to recruit for the workforces of some of the most recognized companies in the country.

Our vision is ambitious: to become the go-to platform for blue-collar professionals worldwide, empowering them with not just earning opportunities but also the tools, benefits, and support they need to thrive. We aim to impact over a billion lives worldwide, creating a future where everyone has access to economic prosperity.


If our vision excites you, Vahan might just be your next adventure. We’re on the hunt for driven individuals who love tackling big challenges. If this sounds like your kind of journey, dive into the details and see where you can make your mark.


What You Will Be Doing:

  • Build & Automate Cloud Infrastructure – Design, deploy, and optimize cloud environments, ensuring scalability, reliability, and cost efficiency.
  • Set Up CI/CD & Deployment Pipelines – Develop automated workflows to streamline code integration, testing, and deployment for faster releases.
  • Monitor & Improve System Performance – Implement robust monitoring, logging, and alerting mechanisms to proactively identify and resolve issues.
  • Manage Containers & Scalability – Deploy and maintain containerized applications, ensuring efficient resource utilization and high availability.
  • Ensure Security & Reliability – Enforce access controls, backup strategies, and disaster recovery plans to safeguard infrastructure and data.
  • Adapt & Scale with the Startup – Take on dynamic responsibilities, quickly learn new technologies, and evolve processes to meet growing business needs.


You Will Thrive in This Role If You:

Must Haves:

  • Experience: 3+ years in DevOps or related roles, focusing on cloud environments, automation, CI/CD, and Linux system administration. Strong expertise in debugging and infrastructure performance improvements.
  • Cloud Expertise: In-depth experience with one or more cloud platforms (AWS, GCP), including services like EC2, RDS, S3, VPC, etc.
  • IaC Tools: Proficiency in Terraform, Ansible, CloudFormation, or similar tools.
  • Scripting Skills: Strong scripting abilities in Python, Bash, or PowerShell (see the sketch after this list).
  • Containerization: Experience with Docker, including managing containers in production.
  • Monitoring Tools: Hands-on experience with tools like ELK, Prometheus, Grafana, CloudWatch, New Relic, and Datadog.
  • Version Control: Proficiency with Git and code repository management.
  • Soft Skills: Excellent problem-solving skills, attention to detail, and effective communication with both technical and non-technical team members.
  • Database Management: Experience with managing and tuning databases like MySQL and PostgreSQL.
  • Deployment Pipelines: Experience with Jenkins and similar CI/CD tools.
  • Message Queues: Experience with RabbitMQ/SQS/Kafka.
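
A minimal sketch of the kind of Python cloud automation the must-haves call for, using boto3 to flag unattached EBS volumes (a common cost-optimization check); the region and use case are illustrative.

```python
import boto3

def unattached_volumes(region: str = "ap-south-1") -> list:
    """List EBS volumes in the 'available' state, i.e. attached to nothing."""
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_volumes")
    found = []
    for page in paginator.paginate(
        Filters=[{"Name": "status", "Values": ["available"]}]
    ):
        found.extend(v["VolumeId"] for v in page["Volumes"])
    return found

if __name__ == "__main__":
    print(unattached_volumes())
```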

Nice to Have:

  • Certifications: AWS Certified DevOps Engineer, Certified Kubernetes Administrator (CKA), or similar.
  • SRE Practices: Familiarity with Site Reliability Engineering (SRE) principles, including error budgeting and service level objectives (SLOs).
  • Serverless Computing: Knowledge of AWS Lambda, Azure Functions, or similar architectures.
  • Containerization: Experience with Docker and Kubernetes, including managing production clusters.
  • Security: Awareness of security best practices and implementations.
  • Cloud Cost Optimization: Experience with cost-saving initiatives in cloud environments.
  • Data Pipelines & ETL: Experience in setting up and managing data pipelines and ETL workflows.
  • Familiarity with Modern Tech Stacks: Exposure to Python, Node.js, React.js, and Kotlin for app deployment CI/CD pipelines.
  • MLOps Pipelines: Understanding of ML model deployment and operationalization.
  • Data Retrieval & Snapshots: Experience with PITR (point-in-time recovery), EC2, and RDS snapshots.
  • System Resiliency & Recovery: Strategies for ensuring system reliability and recovery in case of downtime.


At Vahan, you’ll have the opportunity to make a real impact in a sector that touches millions of lives. We’re committed to not only advancing the livelihoods of our workforce but also, in taking care of the people who make this mission possible. Here’s what we offer:

  • Unlimited PTO: Trust and flexibility to manage your time in the way that works best for you.
  • Comprehensive Medical Insurance: We’ve got you covered with plans designed to support you and your loved ones.
  • Monthly Wellness Leaves: Regular time off to recharge and focus on what matters most.
  • Competitive Pay: Your contributions are recognized and rewarded with a compensation package that reflects your impact.


Join us, and be part of something bigger—where your work drives real, positive change in the world.

Tata Consultancy Services
Agency job
via Risk Resources LLP Hyderabad by Susmitha O
Bengaluru (Bangalore), Hyderabad
3 - 8 yrs
₹7L - ₹24L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
Script Writing
SQL

Must-Have:

1. Experience in working with various ML libraries and packages like scikit-learn, NumPy, Pandas, TensorFlow, Matplotlib, Caffe, etc. (a small illustrative sketch follows this list)
2. Deep Learning Frameworks: PyTorch, spaCy, Keras
3. Deep Learning Architectures: LSTM, CNN, Self-Attention and Transformers
4. Experience in working with image processing and computer vision is a must
5. Designing data science applications, Large Language Models (LLM), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, prompt engineering
6. Programming experience in Python
7. Strong written and verbal communications
8. Excellent interpersonal and collaboration skills
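
A minimal sketch of the scikit-learn workflow referenced in the must-haves, using a bundled toy dataset; real projects would substitute domain data and a tuned model.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Load a toy dataset, hold out a test split, fit, and report per-class metrics.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```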

Good-to-Have:

1. Experience working with vector databases and graph representations of documents.
2. Experience with building or maintaining MLOps pipelines.
3. Experience in cloud computing infrastructures like AWS SageMaker or Azure ML for implementing ML solutions is preferred.
4. Exposure to Docker, Kubernetes

Role descriptions / Expectations from the Role

1. Design and implement scalable and efficient data architectures to support generative AI workflows.

2. Fine-tune and optimize large language models (LLMs) for generative AI; conduct performance evaluation and benchmarking for LLMs and machine learning models.

3. Apply prompt engineering techniques as required by the use case.

4. Collaborate with research and development teams to build large language models for generative AI use cases; plan and break down larger data science tasks into lower-level tasks.

5. Lead junior data engineers on tasks such as designing data pipelines, dataset creation, and deployment; use data visualization tools, machine learning techniques, natural language processing, feature engineering, deep learning, and statistical modelling as required by the use case.

Vahan.ai
Posted by Eman Khan
Bengaluru (Bangalore)
2 - 5 yrs
₹15L - ₹28L / yr
Python
ETL
Extraction
Transformation
Data loading

About Us:

At Vahan, we are building India’s first AI-powered recruitment marketplace for the country’s 300-million-strong blue-collar workforce, opening doors to economic opportunities and brighter futures. Already India’s largest recruitment platform, Vahan is supported by marquee investors like Khosla Ventures, Bharti Airtel, Vijay Shekhar Sharma (CEO, Paytm), and leading executives from Google and Facebook. Our customers include names like Swiggy, Zomato, Rapido, Zepto, and many more. We leverage cutting-edge technology and AI to recruit for the workforces of some of the most recognized companies in the country.


Our vision is ambitious: to become the go-to platform for blue-collar professionals worldwide, empowering them with not just earning opportunities but also the tools, benefits, and support they need to thrive. We aim to impact over a billion lives worldwide, creating a future where everyone has access to economic prosperity. If our vision excites you, Vahan might just be your next adventure. We’re on the hunt for driven individuals who love tackling big challenges. If this sounds like your kind of journey, dive into the details and see where you can make your mark.


What you will be doing:

  • Architect and Implement Data Infrastructure: Design, build, and maintain robust and scalable data pipelines and a data warehouse/lake solution using open-source and cloud-based technologies, optimized for both high-frequency small file and large file data ingestion, and real-time data streams. This includes implementing efficient mechanisms for handling high volumes of data arriving at frequent intervals.
  • Develop and Optimize Data Processes: Create custom tools, primarily using Python, for data validation, processing, analysis, and automation. Continuously improve ETL/ELT processes for efficiency, reliability, and scalability. This includes building processes to bridge gaps between different databases and data sources, ensuring data consistency and accessibility. This also includes processing and integrating data from streaming sources (see the sketch after this list).
  • Lead and Mentor: Collaborate with product, engineering, and business teams to understand data requirements and provide data-driven solutions. Mentor and guide junior data engineers (as the team grows) and foster a culture of data excellence.
  • Data Quality and Governance: Proactively identify and address data quality issues. Implement and maintain robust data quality monitoring, alerting, and measurement systems to ensure the accuracy, completeness, and consistency of our data assets. Implement and enforce data governance and security best practices, taking proactive ownership.
  • Research: Research and adapt newer technologies to suit the requirements.
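
A minimal sketch of the high-frequency small-file ingestion pattern described above, using pandas with SQLite standing in for the warehouse/lake target; the file paths and column names are illustrative assumptions.

```python
import sqlite3

import pandas as pd

# Extract: small JSONL batches landing in a drop zone (paths are placeholders)
frames = [pd.read_json(p, lines=True) for p in ["batch_0001.jsonl", "batch_0002.jsonl"]]
raw = pd.concat(frames, ignore_index=True)

# Transform: basic validation and normalization (column names are assumptions)
raw = raw.dropna(subset=["candidate_id"])
raw["applied_at"] = pd.to_datetime(raw["applied_at"], errors="coerce")

# Load: SQLite stands in for the real warehouse/lake target
with sqlite3.connect("warehouse.db") as conn:
    raw.to_sql("applications", conn, if_exists="append", index=False)
```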


You will thrive in this role if you:

  • Are a Hands-On Technical Leader: You possess deep technical expertise in data engineering and are comfortable leading by example, diving into code, and setting technical direction.
  • Are a Startup-Minded Problem Solver: You thrive in a fast-paced, dynamic environment, are comfortable with ambiguity, and are eager to build from the ground up. You proactively identify and address challenges.
  • Are a Collaborative Communicator: You can effectively communicate complex technical concepts to both technical and non-technical audiences and build strong relationships with stakeholders.
  • Are a Strategic Thinker: You can think ahead and architect long lasting systems.


At Vahan, you’ll have the opportunity to make a real impact in a sector that touches millions of lives. We’re committed to not only advancing the livelihoods of our workforce but also in taking care of the people who make this mission possible. Here’s what we offer:

  • Unlimited PTO: Trust and flexibility to manage your time in the way that works best for you.
  • Comprehensive Medical Insurance: We’ve got you covered with plans designed to support you and your loved ones.
  • Monthly Wellness Leaves: Regular time off to recharge and focus on what matters most.
  • Competitive Pay: Your contributions are recognized and rewarded with a compensation package that reflects your impact.


Join us, and be part of something bigger—where your work drives real, positive change in the world.

Wissen Technology
Posted by Sonali RajeshKumar
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Databricks
Python
SQL
Azure Data Lakes

JOB REQUIREMENT:

 

Wissen Technology is now hiring an Azure Data Engineer with 7+ years of relevant experience.

 We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team, which has made a mark as a high-end technical consultant.



Required Skills:

· 6+ years as a practitioner in data engineering or a related field.

· Proficiency in Python programming.

· Experience with data processing frameworks like Apache Spark or Hadoop (see the sketch after this list).

· Experience working on Snowflake and Databricks.

· Familiarity with cloud platforms (AWS, Azure) and their data services.

· Experience with data warehousing concepts and technologies.

· Experience with message queues and streaming platforms (e.g., Kafka).

· Excellent communication and collaboration skills.

· Ability to work independently and as part of a geographically distributed team.
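
A minimal PySpark structured-streaming sketch combining the Spark and Kafka skills listed above; the broker, topic, and output paths are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Read a Kafka topic as a stream (broker and topic are placeholders)
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "trades")
          .load()
          .select(col("key").cast("string"), col("value").cast("string")))

# Land the raw stream as parquet with checkpointing for exactly-once sinks
query = (events.writeStream.format("parquet")
         .option("path", "/data/trades")
         .option("checkpointLocation", "/chk/trades")
         .start())
query.awaitTermination()
```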

Mirorin
Posted by Indrani Dutta
Bengaluru (Bangalore)
5 - 10 yrs
₹5L - ₹14L / yr
MySQL
MongoDB
React.js
NodeJS (Node.js)
Django

Job Title: Full Stack Developer (MERN + Python)

Location: Bangalore

Job Type: Full-time

Experience: 4–8 years

 

About Miror

Miror is a pioneering FemTech company transforming how midlife women navigate perimenopause and menopause. We offer medically-backed solutions, expert consultations, community engagement, and wellness products to empower women in their health journey. Join us to make a meaningful difference through technology.

 

Role Overview

We are seeking a passionate and experienced Full Stack Developer skilled in the MERN stack and Python (Django/Flask) to build and scale high-impact features across our web and mobile platforms. You will collaborate with cross-functional teams to deliver seamless user experiences and robust backend systems.

 

Key Responsibilities

• Design, develop, and maintain scalable web applications using MySQL/Postgres, MongoDB, Express.js, React.js, and Node.js

• Build and manage RESTful APIs and microservices using Python (Django/Flask/FastAPI)

• Integrate with third-party platforms like OpenAI, WhatsApp APIs (Whapi), Interakt, and Zoho

• Optimize performance across the frontend and backend

• Collaborate with product managers, designers, and other developers to deliver high-quality features

• Ensure security, scalability, and maintainability of code

• Write clean, reusable, and well-documented code

• Contribute to DevOps, CI/CD, and server deployment workflows (AWS/Lightsail)

• Participate in code reviews and mentor junior developers if needed

 

Required Skills

• Strong experience with MERN Stack: MongoDB, Express.js, React.js, Node.js

• Proficiency in Python and web frameworks like Django, Flask, or FastAPI

• Experience working with REST APIs, JWT/Auth, and WebSockets

• Good understanding of frontend design systems, state management (Redux/Context), and responsive UI

• Familiarity with database design and queries (MongoDB, PostgreSQL/MySQL)

• Experience with Git, Docker, and deployment pipelines

• Comfortable working in Linux-based environments (e.g., Ubuntu on AWS)

 

Bonus Skills

• Experience with AI integrations (e.g., OpenAI, LangChain)

• Familiarity with WooCommerce, WordPress APIs

• Experience in chatbot development or WhatsApp API integration

 

Who You Are

• You are a problem-solver with a product-first mindset

• You care about user experience and performance

• You enjoy working in a fast-paced, collaborative environment

• You have a growth mindset and are open to learning new technologies

 

Why Join Us?

• Work at the intersection of healthcare, community, and technology

• Directly impact the lives of women across India and beyond

• Flexible work environment and collaborative team

• Opportunity to grow with a purpose-driven startup


If you are interested, please apply here and drop me a message on Cutshort.

QAgile Services
Posted by Radhika Chotai
Bengaluru (Bangalore)
3 - 8 yrs
₹17L - ₹25L / yr
PySpark
Microsoft Windows Azure
Amazon Web Services (AWS)
SQL

Employment type: Contract basis


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks (see the sketch after this list).
  • Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
  • Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
  • Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
  • Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
  • Maintain documentation and implement best practices for data architecture, governance, and security.
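
A minimal PySpark batch ETL sketch of the pipeline work described above; the S3 paths, schema, and transformations are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-etl").getOrCreate()

# Extract: raw CSV landing data (path and columns are placeholders)
orders = spark.read.option("header", True).csv("s3://raw-zone/orders/")

# Transform: type coercion, dedup, and a derived partition column
clean = (orders
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("created_at")))

# Load: partitioned parquet into the curated zone of the lake
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/orders/"
)
```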

⚙️ Required Skills

  • Programming: Proficient in PySpark, Python, and SQL.
  • Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
  • Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
  • Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
  • Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
  • CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.

🧰 Preferred Qualifications

  • Bachelor's or Master's in Computer Science, Engineering, or related field.
  • Certifications in Azure/AWS are highly desirable.
  • Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.



Supply Wisdom
Posted by Vaishnavi Bhate
Bengaluru (Bangalore)
4 - 10 yrs
₹18L - ₹23L / yr
Python
Django
RESTful APIs
PostgreSQL
MySQL

Supply Wisdom: Full Stack Developer 

 

Location: Hybrid Position based in Bangalore 

Reporting to: Tech Lead Manager 

 

   

Supply Wisdom is a global leader in transformative risk intelligence, offering real-time insights to drive business growth, reduce costs, enhance security and compliance, and identify revenue opportunities. Our AI-based SaaS products cover various risk domains, including financial, cyber, operational, ESG, and compliance. With a diverse workforce that is 57% female, our clients include Fortune 100 and Global 2000 firms in sectors like financial services, insurance, healthcare, and technology. 

  

Objective: We are seeking a skilled Full Stack Developer to design and build scalable software solutions. You will be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. 

As a Full Stack Developer, you should be proficient in both front-end and back-end technologies, development frameworks, and third-party libraries. We’re looking for a team player with strong problem-solving abilities, attention to visual design, and a focus on utility. Familiarity with Agile methodologies, including Scrum and Kanban, is essential. 

 

Responsibilities 

  

  • Collaborate with the development team and product manager to ideate software solutions. 
  • Write effective and secure REST APIs. 
  • Integrate third-party libraries for product enhancement. 
  • Design and implement client-side and server-side architecture. 
  • Work with data scientists and analysts to enhance software using RPA and AI/ML techniques. 
  • Develop and manage well-functioning databases and applications. 
  • Ensure software responsiveness and efficiency through testing. 
  • Troubleshoot, debug, and upgrade software as needed. 
  • Implement security and data protection settings. 
  • Create features and applications with mobile-responsive design. 
  • Write clear, maintainable technical documentation. 
  • Build front-end applications with appealing, responsive visual design. 

 

  

Requirements 

  

  • Degree in Computer Science (or related field) with 4+ years of hands-on experience in Python development, with strong expertise in the Django framework and Django REST Framework (DRF). 
  • Proven experience in designing and building RESTful APIs, with a solid understanding of API versioning, authentication (JWT/OAuth2), and best practices (see the sketch after this list). 
  • Experience with relational databases such as PostgreSQL or MySQL; familiarity with query optimization and database migrations. 
  • Basic front-end development skills using HTML, CSS, and JavaScript; experience with any JavaScript framework (like React or Next.js) is a plus. 
  • Good understanding of Object-Oriented Programming (OOP) and design patterns in Python. 
  • Familiarity with Git and collaborative development workflows (e.g., GitHub, GitLab). 
  • Knowledge of Docker, CI/CD pipelines. 
  • Hands-on experience with AWS services, Nginx web server, RabbitMQ (or similar message brokers), event handling, and synchronization. 
  • Familiarity with Postgres, SSO implementation (desirable), and integration of third-party libraries. 
  • Experience with unit testing, debugging, and code reviews. 
  • Experience using tools like Jira and Confluence. 
  • Ability to work in Agile/Scrum teams with good communication and problem-solving skills. 
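
A minimal Django REST Framework sketch of the API work listed in the requirements; the model choice and routing are illustrative, and real endpoints would add authentication and versioning on top.

```python
from django.contrib.auth.models import User
from rest_framework import routers, serializers, viewsets

class UserSerializer(serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ["id", "username", "email"]

class UserViewSet(viewsets.ReadOnlyModelViewSet):
    """Read-only /users/ endpoint; DRF generates list and detail views."""
    queryset = User.objects.all()
    serializer_class = UserSerializer

router = routers.DefaultRouter()
router.register(r"users", UserViewSet)
# In urls.py: urlpatterns = [path("api/", include(router.urls))]
```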

 

  

Our Commitment to You: 

 

We offer a competitive salary and generous benefits. In addition, we offer a vibrant work environment, a global team filled with passionate and fun-loving people coming from diverse cultures and backgrounds. 

If you are looking to make an impact in delivering market-leading risk management solutions, empowering our clients, and making the world a better place, then Supply Wisdom is the place for you.

You can learn more at supplywisdom.com and on LinkedIn. 

HeyCoach
Posted by DeepanRaj R
Bengaluru (Bangalore)
0.6 - 1 yrs
₹12000 - ₹13000 / mo
Data Science
Python
Jupyter Notebook
Google colab
Predictive modelling

Job Title: Data Science Intern

Location: 6th Sector HSR Layout, Bangalore - Work from Office 5.5 Days

Duration: 3 Months | Stipend: Up to ₹12,000 per month

Post-Internship Offer (PPO): Available based on performance


🧑‍💻 About the Role

We are looking for a passionate and proactive Data Science Intern who is equally excited about mentoring learners and gaining hands-on experience with real-world data operations.

This is a 50% technical + 50% mentorship role that blends classroom support with practical data work. Ideal for those looking to build a career in EdTech and Applied Data Science.


🚀 What You'll Do

Technical Responsibilities (50%)

  • Create and manage dashboards using Python or BI tools like Power BI/Tableau (see the sketch after this list)
  • Write and optimize SQL queries to extract and analyze backend data
  • Support in data gathering, cleaning, and basic analysis
  • Contribute to building data pipelines to assist internal decision-making and analytics
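
A minimal sketch of the dashboard data preparation described above, using pandas and Matplotlib; the source file and column names are illustrative assumptions.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Illustrative learner-activity extract; in practice this would come from SQL
df = pd.read_csv("sessions.csv", parse_dates=["session_date"])

# Count distinct learners per week for a simple engagement dashboard
weekly = (df.groupby(pd.Grouper(key="session_date", freq="W"))["learner_id"]
            .nunique()
            .rename("active_learners"))

weekly.plot(kind="bar", title="Weekly Active Learners")
plt.tight_layout()
plt.savefig("weekly_active_learners.png")
```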


🚀Mentorship & Support (50%)

  • Assist instructors during live Data Science sessions
  • Solve doubts related to Python, Machine Learning, and Statistics
  • Create and review quizzes, assignments, and other content
  • Provide one-on-one academic support and mentoring
  • Foster a positive and interactive learning environment


✅ Requirements

  • Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field
  • Strong knowledge of:
  • Python (Data Structures, Functions, OOP, Debugging)
  • Pandas, NumPy, Matplotlib
  • Machine Learning algorithms (scikit-learn)
  • SQL and basic data wrangling
  • APIs, Web Scraping, and Time-Series basics
  • Advanced Excel: Lookup & reference (VLOOKUP, INDEX+MATCH, XLOOKUP, SUMIF), Logical functions (IF, AND, OR), Statistical & Aggregate Functions: (COUNTIFS, STDEV, PERCENTILE), Text cleanup (TRIM, SUBSTITUTE), Time functions (DATEDIF, NETWORKDAYS), Pivot Tables, Power Query, Conditional Formatting, Data Validation, What-If Analysis, and dynamic dashboards using charts & slicers.
  • Excellent communication and interpersonal skills
  • Prior mentoring, teaching, or tutoring experience is a big plus
  • Passion for helping others learn and grow
HeyCoach
Posted by DeepanRaj R
Bengaluru (Bangalore)
0 - 1 yrs
₹1.3 - ₹1.5 / mo
Data Science
Python
Excel VBA
EDA
Jupyter Notebook

Job Title: Data Science Intern

Location: 6th Sector HSR Layout, Bangalore - Work from Office 5.5 Days

Duration: 3 Months | Stipend: Up to ₹12,000 per month

Post-Internship Offer (PPO): Available based on performance


🧑‍💻 About the Role

We are looking for a passionate and proactive Data Science Assistant Intern who is equally excited about mentoring learners and gaining hands-on experience with real-world data operations.

This is a 50% technical + 50% mentorship role that blends classroom support with practical data work. Ideal for those looking to build a career in EdTech and Applied Data Science.


🚀 What You'll Do

Technical Responsibilities (50%)

  • Create and manage dashboards using Python or BI tools like Power BI/Tableau
  • Write and optimize SQL queries to extract and analyze backend data
  • Support in data gathering, cleaning, and basic analysis
  • Contribute to building data pipelines to assist internal decision-making and analytics

🚀Mentorship & Support (50%)

  • Assist instructors during live Data Science sessions
  • Solve doubts related to Python, Machine Learning, and Statistics
  • Create and review quizzes, assignments, and other content
  • Provide one-on-one academic support and mentoring
  • Foster a positive and interactive learning environment


✅ Requirements

  • Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field
  • Strong knowledge of:
  • Python (Data Structures, Functions, OOP, Debugging)
  • Pandas, NumPy, Matplotlib
  • Machine Learning algorithms (scikit-learn)
  • SQL and basic data wrangling
  • APIs, Web Scraping, and Time-Series basics
  • Advanced Excel: Lookup & reference (VLOOKUP, INDEX+MATCH, XLOOKUP), Logical functions (IF, AND, OR), Statistical & Aggregate Functions: (COUNTIFS, STDEV, PERCENTILE), Text cleanup (TRIM, SUBSTITUTE), Time functions (DATEDIF, NETWORKDAYS), Pivot Tables, Power Query, Conditional Formatting, Data Validation, What-If Analysis, and dynamic dashboards using charts & slicers.
  • Excellent communication and interpersonal skills
  • Prior mentoring, teaching, or tutoring experience is a big plus
  • Passion for helping others learn and grow
Cymetrix Software
Posted by Netra Shettigar
Bengaluru (Bangalore)
3 - 8 yrs
₹9L - ₹15L / yr
Salesforce development
Oracle Application Express (APEX)
Salesforce Lightning
SQL
ETL

1. Software Development Engineer - Salesforce

What we ask for

We are looking for strong engineers to build best-in-class systems for commercial & wholesale banking at a bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing & security model that will allow us to architect & develop robust applications.

You will work closely with business and product teams to build applications which provide end users with an intuitive, clean, minimalist, easy-to-navigate experience.

Develop systems by implementing software development principles and clean code practices that are scalable, secure, highly resilient, and have low latency.

Should be open to working in a start-up environment and have the confidence to deal with complex issues, keeping focus on solutions and project objectives as your guiding North Star.


Technical Skills:

● Strong hands-on frontend development using JavaScript and LWC

● Expertise in backend development using Apex, Flows, Async Apex

● Understanding of database concepts: SOQL, SOSL, and SQL

● Hands-on experience in API integration using SOAP, REST API, GraphQL

● Experience with ETL tools, data migration, and data governance

● Experience with Apex Design Patterns, Integration Patterns, and the Apex testing framework

● Follow an agile, iterative execution model using CI-CD tools like Azure DevOps, GitLab, Bitbucket

● Should have worked with at least one programming language (Java, Python, C++) and have a good understanding of data structures


Preferred qualifications

● Graduate degree in engineering

● Experience developing with India Stack

● Experience in fintech or banking domain

Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹30L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
PyTorch
NumPy

Desired Competencies (Technical/Behavioral Competency)

Must-Have

1. Experience in working with various ML libraries and packages like scikit-learn, NumPy, Pandas, TensorFlow, Matplotlib, Caffe, etc.

2. Deep Learning Frameworks: PyTorch, spaCy, Keras

3. Deep Learning Architectures: LSTM, CNN, Self-Attention and Transformers

4. Experience in working with image processing and computer vision is a must

5. Designing data science applications, Large Language Models (LLM), Generative Pre-trained Transformers (GPT), generative AI techniques, Natural Language Processing (NLP), machine learning techniques, Python, Jupyter Notebook, common data science packages (TensorFlow, scikit-learn, Keras, etc.), LangChain, Flask, FastAPI, prompt engineering.

6. Programming experience in Python

7. Strong written and verbal communications

8. Excellent interpersonal and collaboration skills.

Good-to-Have

1. Experience working with vector databases and graph representations of documents.

2. Experience with building or maintaining MLOps pipelines.

3. Experience in cloud computing infrastructures like AWS SageMaker or Azure ML for implementing ML solutions is preferred.

4. Exposure to Docker, Kubernetes

Cymetrix Software
Posted by Netra Shettigar
Noida, Bengaluru (Bangalore), Pune
6 - 9 yrs
₹10L - ₹18L / yr
Windows Azure
SQL Azure
SQL
Data Warehouse (DWH)
Data Analytics

Hybrid work mode


(Azure) EDW experience working in loading star-schema data warehouses using framework architectures, including experience loading Type 2 dimensions; ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.

Appknox
Posted by Vasudha Srivastav
Bengaluru (Bangalore)
1 - 4 yrs
Best in industry
SaaS
Cyber Security
Technical support
JIRA
SDK

A BIT ABOUT US


Appknox is a leading mobile application security platform that helps enterprises automate security testing across their mobile apps, APIs, and DevSecOps pipelines. Trusted by global banks, fintechs, and government agencies, we enable secure mobile experiences with speed and confidence.



About the Role:

We're looking for a Jr. Technical Support Engineer to join our global support team and provide world-class assistance to customers in US time zones (8 PM to 5 AM IST). You will troubleshoot, triage, and resolve technical issues related to Appknox’s mobile app security platform, working closely with Engineering, Product, and Customer Success teams.


Key Responsibilities:

  • Respond to customer issues via email, chat, and voice/voip calls during US business hours.
  • Diagnose, replicate, and resolve issues related to DAST, SAST, and API security modules.
  • Troubleshoot integration issues across CI/CD pipelines, API connections, SDKs, and mobile app builds.
  • Document known issues and solutions in the internal knowledge base and help center.
  • Escalate critical bugs to engineering with full context, reproduction steps, and logs.
  • Guide customers on secure implementation best practices and platform usage.
  • Collaborate with product and QA teams to suggest feature improvements based on customer feedback.
  • Participate in on-call support rotations if needed.



Requirements:

  • 1–4 years of experience in technical support, Delivery or QA roles at a SaaS or cybersecurity company.
  • Excellent communication and documentation skills in English.
  • Comfortable working independently and handling complex technical conversations with customers.
  • Basic understanding of mobile platforms (Android, iOS), REST APIs, Networking Architecture, and security concepts (OWASP, CVEs, etc.).
  • Familiarity with command-line tools, mobile build systems (Gradle/Xcode), and HTTP proxies (Burp).
  • Ability to work full-time within US time zones. Ensure that you have a stable internet connection and workstation. 



Good to have skills:

  • Experience working in a product-led cybersecurity company.
  • Knowledge of scripting languages (Python, Bash) or log analysis tools.
  • Familiarity with CI/CD tools (Jenkins, GitHub Actions, GitLab CI) is a plus.
  • Familiarity with ticketing and support tools like Freshdesk, Jira, Postman, and Slack.


Compensation

  • As per Industry Standards


Interview Process:


  • Application – Submit your resume and complete your application via our job portal.
  • Screening – We’ll review your background and fit, typically inviting you to a profile evaluation call on Cutshort (15 mins).
  • Assignment Round – You'll receive a real-world take-home task to complete within 48 hours.
  • Panel Interview – Meet with a cross-functional interview panel to assess technical skills, problem-solving, and collaboration.
  • Stakeholder Interview – A focused discussion with the Director to evaluate strategic alignment and high-level fit.
  • HR Round – Final chat to discuss cultural fit, compensation, notice period, and next steps.


Personality Traits We Admire:

  • A confident and dynamic working persona, which can bring fun to the team, and a sense of humour, is an added advantage.
  • Great attitude to ask questions, learn and suggest process improvements.
  • Has attention to details and helps identify edge cases.
  • Highly motivated and coming up with fresh ideas and perspectives to help us move towards our goals faster.
  • Follow timelines and absolute commitment to deadlines.


Why Join Us:

  • Freedom & Responsibility: If you are a person who enjoys challenging work & pushing your boundaries, then this is the right place for you. We appreciate new ideas & ownership as well as flexibility with working hours.
  • Great Salary & Equity: We keep up with the market standards & provide pay packages considering updated standards. Also as Appknox continues to grow, you’ll have a great opportunity to earn more & grow with us. Moreover, we also provide equity options for our top performers.
  • Holistic Growth: We foster a culture of continuous learning and take a much more holistic approach to train and develop our assets: the employees. We shall also support you all on that journey of yours.
  • Transparency: Being a part of a start-up is an amazing experience, one of the reasons being open communication & transparency at multiple levels. Working with Appknox will give you the opportunity to experience it all first-hand.
Valuebound
Posted by Suchandni Verma
Bengaluru (Bangalore)
5 - 15 yrs
₹35L - ₹60L / yr
Artificial Intelligence (AI)
Generative AI
Retrieval Augmented Generation (RAG)
Large Language Models (LLM)
Python

Job Overview

We are seeking an agile AI Engineer with a strong focus on both AI engineering and SaaS product development in a 0-1 product environment. This role is perfect for a candidate skilled in building and iterating quickly, embracing a fail-fast approach to bring innovative AI solutions to market rapidly. You will be responsible for designing, developing, and deploying SaaS products using advanced Large Language Models (LLMs) such as Meta, Azure OpenAI, Claude, and Mistral, while ensuring secure, scalable, and high-performance architecture. Your ability to adapt, iterate, and deliver in fast-paced environments is critical.

Responsibilities

• Lead the design, development, and deployment of SaaS products leveraging LLMs, including platforms like Meta, Azure OpenAI, Claude, and Mistral.

• Support the product lifecycle, from conceptualization to deployment, ensuring seamless integration of AI models with business requirements and user needs.

• Build secure, scalable, and efficient SaaS products that embody robust data management and comply with security and governance standards.

• Collaborate closely with product management and other stakeholders to align AI-driven SaaS solutions with business strategies and customer expectations.

• Fine-tune AI models using custom instructions to tailor them to specific use cases and optimize performance through techniques like quantization and model tuning.

• Architect AI deployment strategies using cloud-agnostic platforms (AWS, Azure, Google Cloud), ensuring cost optimization while maintaining performance and scalability.

• Apply retrieval-augmented generation (RAG) techniques to build AI models that provide contextually accurate and relevant outputs (see the sketch after this list).

• Build the integration of APIs and third-party services into the SaaS ecosystem, ensuring robust and flexible product architecture.

• Monitor product performance post-launch, iterating and improving models and infrastructure to enhance user experience and scalability.

• Stay current with AI advancements, SaaS development trends, and cloud technology to apply innovative solutions in product development.
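
A minimal retrieval-augmented generation sketch corresponding to the RAG responsibility above; the embed() function is a placeholder assumption to be swapped for a real embedding model, and the assembled prompt would be sent to whichever LLM the product uses.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder embedding; replace with a real model (sentence-transformers, OpenAI, etc.)."""
    rng = np.random.default_rng(abs(hash(text)) % (2 ** 32))
    return rng.standard_normal(384)

def top_k(query: str, docs: list, k: int = 2) -> list:
    # Rank documents by cosine similarity to the query embedding.
    q = embed(query)
    doc_vecs = [embed(d) for d in docs]
    scores = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v))) for v in doc_vecs]
    ranked = sorted(zip(scores, docs), reverse=True)
    return [d for _, d in ranked[:k]]

docs = ["Refund policy: refunds are processed within 30 days.",
        "Shipping takes 5-7 business days.",
        "Support hours are 9 AM to 5 PM IST."]
context = "\n".join(top_k("How long do refunds take?", docs))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How long do refunds take?"
# `prompt` would then go to the chosen LLM (Azure OpenAI, Claude, Mistral, ...)
```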

Qualifications

• Bachelor’s degree or equivalent in Information Systems, Computer Science, or related fields.

• 6+ years of experience in product development, with at least 2 years focused on AI-based SaaS products.

• Demonstrated experience in leading the development of SaaS products, from ideation to deployment, with a focus on AI-driven features.

• Hands-on experience with LLMs (Meta, Azure OpenAI, Claude, Mistral) and SaaS platforms.

• Proven ability to build secure, scalable, and compliant SaaS solutions, integrating AI with cloud-based services (AWS, Azure, Google Cloud).

• Strong experience with RAG model techniques and fine-tuning AI models for business-specific needs.

• Proficiency in AI engineering, including machine learning algorithms, deep learning architectures (e.g., CNNs, RNNs, Transformers), and integrating models into SaaS environments.

• Solid understanding of SaaS product lifecycle management, including customer-focused design, product-market fit, and post-launch optimization.

• Excellent communication and collaboration skills, with the ability to work cross-functionally and drive SaaS product success.

• Knowledge of cost-optimized AI deployment and cloud infrastructure, focusing on scalability and performance.

Deqode
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai
4 - 8 yrs
₹7L - ₹26L / yr
SRE
Reliability engineering
Amazon Web Services (AWS)
Python

Job Title: Site Reliability Engineer (SRE)

Experience: 4+ Years

Work Location: Bangalore / Chennai / Pune / Gurgaon

Work Mode: Hybrid or Onsite (based on project need)

Domain Preference: Candidates with past experience working in shoe/footwear retail brands (e.g., Nike, Adidas, Puma) are highly preferred.


🛠️ Key Responsibilities

  • Design, implement, and manage scalable, reliable, and secure infrastructure on AWS.
  • Develop and maintain Python-based automation scripts for deployment, monitoring, and alerting.
  • Monitor system performance, uptime, and overall health using tools like Prometheus, Grafana, or Datadog.
  • Handle incident response, root cause analysis, and ensure proactive remediation of production issues.
  • Define and implement Service Level Objectives (SLOs) and Error Budgets in alignment with business requirements (see the sketch after this list).
  • Build tools to improve system reliability, automate manual tasks, and enforce infrastructure consistency.
  • Collaborate with development and DevOps teams to ensure robust CI/CD pipelines and safe deployments.
  • Conduct chaos testing and participate in on-call rotations to maintain 24/7 application availability.
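
A minimal sketch of the error-budget arithmetic behind the SLO responsibility above; the SLO target and request counts are illustrative.

```python
def error_budget_report(slo_target: float, total_requests: int,
                        failed_requests: int) -> dict:
    """Remaining error budget for a request-based SLO (e.g., 99.9% availability)."""
    allowed_failures = total_requests * (1 - slo_target)
    consumed = failed_requests / allowed_failures if allowed_failures else float("inf")
    return {
        "allowed_failures": int(allowed_failures),
        "failed_requests": failed_requests,
        "budget_consumed_pct": round(consumed * 100, 1),
    }

# A 99.9% SLO over 10M requests allows 10,000 failures;
# 4,200 observed failures means 42% of the budget is consumed.
print(error_budget_report(0.999, 10_000_000, 4_200))
```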


Must-Have Skills

  • 4+ years of experience in Site Reliability Engineering or DevOps with a focus on reliability, monitoring, and automation.
  • Strong programming skills in Python (mandatory).
  • Hands-on experience with AWS cloud services (EC2, S3, Lambda, ECS/EKS, CloudWatch, etc.).
  • Expertise in monitoring and alerting tools like Prometheus, Grafana, Datadog, CloudWatch, etc.
  • Strong background in Linux-based systems and shell scripting.
  • Experience implementing infrastructure as code using tools like Terraform or CloudFormation.
  • Deep understanding of incident management, SLOs/SLIs, and postmortem practices.
  • Prior working experience in footwear/retail brands such as Nike or similar is highly preferred.


Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Bengaluru (Bangalore)
5 - 8 yrs
₹4L - ₹25L / yr
Data engineering
Python
Spark

🛠️ Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Python and Apache Spark (PySpark or Scala APIs); see the sketch after this list
  • Develop and optimize ETL processes for batch and real-time data ingestion
  • Collaborate with data scientists, analysts, and DevOps teams to support data-driven solutions
  • Ensure data quality, integrity, and governance across all stages of the data lifecycle
  • Implement data validation, monitoring, and alerting mechanisms for production pipelines
  • Work with cloud platforms (AWS, GCP, or Azure) and tools like Airflow, Kafka, and Delta Lake
  • Participate in code reviews, performance tuning, and documentation
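
For context, a batch pipeline of the kind described in the first bullet often looks like the sketch below; the S3 paths and column names are hypothetical placeholders.

# Minimal PySpark batch ETL sketch: ingest raw JSON events, clean and
# transform them, and write a partitioned Parquet table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical source

cleaned = (
    raw.filter(F.col("user_id").isNotNull())              # basic data-quality gate
       .withColumn("event_date", F.to_date("event_ts"))   # derive partition column
       .dropDuplicates(["event_id"])                      # keep re-runs idempotent
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/events/"))  # hypothetical sink

spark.stop()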


🎓 Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 3–6 years of experience in data engineering with a focus on Python and Spark
  • Experience with distributed computing and handling large-scale datasets (10TB+)
  • Familiarity with data security, PII handling, and compliance standards is a plus


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Poornima Varadarajan
Posted by Poornima Varadarajan
Bengaluru (Bangalore), Mumbai
5 - 7 yrs
Best in industry
API
Java
Banking
Python
API QA

• Design, develop and maintain robust test automation frameworks for financial applications
• Create detailed test plans, test cases, and test scripts based on business requirements and user stories
• Execute functional, regression, integration, and API testing with a focus on financial data integrity (see the sketch after this list)
• Validate complex financial calculations, transaction processing, and reporting functionalities
• Collaborate with Business Analysts and development teams to understand requirements and ensure complete test coverage
• Implement automated testing solutions within CI/CD pipelines for continuous delivery
• Perform data validation testing against financial databases and data warehouses
• Identify, document, and track defects through resolution using defect management tools
• Verify compliance with financial regulations and industry standards
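
To give a flavour of the API-testing work above, here is a minimal pytest sketch that checks a financial endpoint for status, schema, and a balance invariant; the URL and response fields are hypothetical.

# Minimal API-test sketch (pytest + requests). The endpoint and response
# fields are hypothetical; a real suite would add fixtures, auth handling,
# and environment configuration.
import requests

BASE_URL = "https://api.example.com"  # hypothetical test environment

def test_account_balance_integrity():
    resp = requests.get(f"{BASE_URL}/accounts/12345/balance", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    # Schema check: the contract fields must be present.
    for field in ("available", "pending", "ledger"):
        assert field in body, f"missing field: {field}"

    # Financial invariant: ledger balance = available + pending.
    assert round(body["available"] + body["pending"], 2) == round(body["ledger"], 2)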

Read more
KJBN labs

at KJBN labs

2 candid answers
sakthi ganesh
Posted by sakthi ganesh
Bengaluru (Bangalore)
3 - 6 yrs
₹6L - ₹11L / yr
Python
PostgreSQL
MySQL
Django
Amazon Web Services (AWS)
+3 more

Senior Software Engineer - Backend


A Senior Software Backend Engineer is responsible for designing, building, and maintaining the server-side logic and infrastructure of web applications or software systems. They typically work closely with frontend engineers, DevOps teams, and other stakeholders to ensure that the back-end services perform optimally and meet business requirements. Below is an outline of a typical Senior Backend Developer job profile:


Key Responsibilities:

1. System Architecture & Design:

- Design scalable, high-performance backend services and APIs.

- Participate in the planning, design, and development of new features.

- Ensure that systems are designed with fault tolerance, security, and scalability in mind.

2. Development & Implementation:

- Write clean, maintainable, and efficient code.

- Implement server-side logic, databases, and data storage solutions.

- Work with technologies like REST, GraphQL, and other backend communication methods.

- Design and optimize database schemas, queries, and indexes.

3. Performance Optimization:

- Diagnose and fix performance bottlenecks.

- Optimize backend processes and database queries for speed and efficiency.

- Implement caching strategies and load balancing (see the sketch after the Key Responsibilities list).

4. Security:

- Ensure the security of the backend systems by implementing secure coding practices.

- Protect against common security threats such as SQL injection, cross-site scripting (XSS), and others.

5. Collaboration & Leadership:

- Collaborate with frontend teams, product managers, and DevOps engineers.

- Mentor junior developers and guide them in best practices.

- Participate in code reviews and ensure that the development team follows consistent coding standards.

6. Testing & Debugging:

- Develop and run unit, integration, and performance tests to ensure code quality.

- Troubleshoot, debug, and upgrade existing systems.

7. Monitoring & Maintenance:

- Monitor system performance and take preventive measures to ensure uptime and reliability.

- Maintain technical documentation for reference and reporting.

- Stay updated on emerging technologies and incorporate them into the backend tech stack.
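
To make the caching point in section 3 concrete, here is a minimal TTL-cache sketch in Python; a production system would more likely use Redis or Memcached (as the preferred qualifications note), but the idea is the same.

# Minimal TTL cache sketch: memoize an expensive lookup for a fixed window.
import time
from functools import wraps

def ttl_cache(seconds: float):
    def decorator(fn):
        store = {}  # key -> (expiry_timestamp, value)

        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit and hit[0] > now:           # fresh entry: skip the expensive call
                return hit[1]
            value = fn(*args)                  # miss or stale: recompute and store
            store[args] = (now + seconds, value)
            return value
        return wrapper
    return decorator

@ttl_cache(seconds=30)
def get_user_profile(user_id: int) -> dict:
    # Stand-in for a slow database query or downstream API call.
    time.sleep(0.1)
    return {"id": user_id, "name": f"user-{user_id}"}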


Required Skills:

1. Programming Languages:

- Expertise in one or more backend programming languages such as Python, Java, Go, or Rust.

2. Database Management:

- Strong understanding of both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Redis).

- Knowledge of data modeling, query optimization, and database scaling strategies.

3. API Design & Development:

- Proficiency in designing and implementing RESTful and GraphQL APIs.

- Experience with microservices architecture.

- Good understanding of containers

4. Cloud & DevOps:

- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.

- Understanding of DevOps principles, CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).

5. Version Control:

- Proficiency with Git and branching strategies.

6. Testing & Debugging Tools:

- Familiarity with testing frameworks, debugging tools, and performance profiling.

7. Soft Skills:

- Strong problem-solving skills.

- Excellent communication and teamwork abilities.

- Leadership and mentorship qualities.


Qualifications:

- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or related field.

- 5+ years of experience in backend development or software engineering.

- Proven experience with system design, architecture, and high-scale application development.


Preferred Qualifications:

- Experience with distributed systems, event-driven architectures, and asynchronous processing.

- Familiarity with message queues (e.g., RabbitMQ, Kafka) and caching layers (e.g., Redis, Memcached).

- Knowledge of infrastructure as code (IaC) tools like Terraform or Ansible.


Tools & Technologies:

- Languages: Python, Java, Golang, Rust.

- Databases: PostgreSQL, MySQL, MongoDB, Redis, Cassandra.

- Frameworks: Django, Flask, Spring Boot, Go Micro.

- Cloud Providers: AWS, Azure, Google Cloud.

- Containerization: Docker, Kubernetes.

- CI/CD: Jenkins, GitLab CI, CircleCI.

This job profile will vary depending on the company and industry, but the core principles of designing, developing, and maintaining back-end systems remain the same.

Read more
eazeebox

at eazeebox

3 candid answers
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹15L / yr (Varies)
Python
React Native
SQL
NoSQL Databases
Amazon Web Services (AWS)

About Eazeebox

Eazeebox is India’s first specialized B2B platform for home electrical goods. We simplify supply chain logistics and empower electrical retailers through our one-stop digital platform — offering access to 100+ brands across 15+ categories, no MOQs, flexible credit options, and 4-hour delivery. We’re on a mission to bring technological inclusion to India's massive electrical retail industry.


Role Overview

We’re looking for a hands-on Full Stack Engineer who can build scalable backend systems using Python and mobile applications using React Native. You’ll work directly with the founder and a lean engineering team to architect and deliver core modules across our Quick Commerce stack – including retailer apps, driver apps, order management systems, and more.


What You’ll Do

  • Develop and maintain backend services using Python
  • Build and ship high-performance React Native apps for Android and iOS
  • Collaborate on API design, microservices, and systems integration
  • Ensure performance, reliability, and scalability across the stack
  • Contribute to decisions on re-engineering, tech stack, and infra setup
  • Work closely with the founder and product team to own end-to-end delivery
  • Participate in collaborative working sessions and pair programming when needed


What We’re Looking For

  • Strong proficiency in Python for backend development
  • Experience building mobile apps with React Native
  • Solid understanding of microservices architecture, API layers, and shared data models
  • Familiarity with AWS or equivalent cloud platforms
  • Exposure to Docker, Kubernetes, and CI/CD pipelines
  • Ability to thrive in a fast-paced, high-ownership environment


Good-to-Have (Bonus Points)

  • Experience working in Quick Commerce, logistics, or consumer apps
  • Knowledge of PIM (Product Information Management) systems
  • Understanding of key commerce algorithms (search, ranking, filtering, order management)
  • Ability to use AI-assisted coding tools to speed up development


Why Join Us

  • Build from scratch, not maintain legacy
  • Work directly with the founder and influence tech decisions
  • Shape meaningful digital infrastructure for a $35B+ industry
Read more
Peliqan

at Peliqan

3 recruiters
Bharath Kumar
Posted by Bharath Kumar
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹12L / yr
Python
SQL
API


About the Role


We are looking for a Python Developer with expertise in data synchronization (ETL & Reverse ETL), automation workflows, AI functionality, and connectivity to work directly with a customer in Peliqan. In this role, you will be responsible for building seamless integrations, enabling AI-driven functionality, and ensuring data flows smoothly across various systems.

Key Responsibilities

  • Build and maintain data sync pipelines (ETL & Reverse ETL) to ensure seamless data transfer between platforms (see the sketch after this list).
  • Develop automation workflows to streamline processes and improve operational efficiency.
  • Implement AI-driven functionality, including AI-powered analytics, automation, and decision-making capabilities.
  • Build and enhance connectivity between different data sources, APIs, and enterprise applications.
  • Work closely with the customer to understand their technical needs and design tailored solutions in Peliqan.
  • Optimize performance of data integrations and troubleshoot issues as they arise.
  • Ensure security and compliance in data handling and integrations.
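
To give a feel for the sync work in the first bullet, here is a minimal reverse-ETL sketch that pushes warehouse rows to a SaaS API in batches; the endpoint, auth, and row shape are hypothetical placeholders.

# Minimal reverse-ETL sketch: read rows from a warehouse query and push them
# to a SaaS REST API in batches. Endpoint, auth, and row shape are placeholders.
import requests

DEST_URL = "https://api.example-crm.com/v1/contacts/bulk"  # hypothetical
API_KEY = "..."  # injected from a secrets manager in practice

def fetch_rows():
    # Stand-in for a warehouse cursor (e.g., a SELECT over updated contacts).
    yield from [
        {"email": "a@example.com", "plan": "pro"},
        {"email": "b@example.com", "plan": "free"},
    ]

def push_batches(rows, batch_size: int = 100):
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            _send(batch)
            batch = []
    if batch:
        _send(batch)

def _send(batch):
    resp = requests.post(
        DEST_URL,
        json={"records": batch},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()  # surface failures so the sync can retry or alert

if __name__ == "__main__":
    push_batches(fetch_rows())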

Requirements

  • Strong experience in Python and related libraries for data processing & automation.
  • Expertise in ETL, Reverse ETL, and workflow automation tools.
  • Experience working with APIs, data connectors, and integrations across various platforms.
  • Familiarity with AI & machine learning concepts and their practical application in automation.
  • Hands-on experience with Peliqan or similar integration/data automation platforms is a plus.
  • Strong problem-solving skills and the ability to work directly with customers to define and implement solutions.
  • Excellent communication and collaboration skills.

Preferred Qualifications

  • Experience in SQL, NoSQL databases, and cloud platforms (AWS, GCP, Azure).
  • Knowledge of data governance, security best practices, and performance optimization.
  • Prior experience in customer-facing engineering roles.

If you’re a Python & Integration Engineer who loves working on cutting-edge AI, automation, and data connectivity projects, we’d love to hear from you.


Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Mumbai, Hyderabad, Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹6L - ₹25L / yr
Python
Django
NumPy
Flask
pandas
+1 more

Python Developer Job Description

A Python Developer is responsible for designing, developing, and deploying software applications using the Python programming language. Here's a brief overview:


Key Responsibilities

- Software Development: Develop high-quality software applications using Python.

- Problem-Solving: Solve complex problems using Python programming language.

- Code Maintenance: Maintain and update existing codebases to ensure they remain efficient and scalable.

- Collaboration: Collaborate with cross-functional teams to identify and prioritize project requirements.

- Testing and Debugging: Write unit tests and debug applications to ensure high-quality code.


Technical Skills

- Python: Strong understanding of Python programming language and its ecosystem.

- Programming Fundamentals: Knowledge of programming fundamentals, including data structures, algorithms, and object-oriented programming.

- Frameworks and Libraries: Familiarity with popular Python frameworks and libraries, such as Django, Flask, or Pandas.

- Database Management: Understanding of database management systems, including relational databases and NoSQL databases.

- Version Control: Knowledge of version control systems, including Git.


Read more
Coimbatore, Bengaluru (Bangalore), Mumbai
1 - 4 yrs
₹3.4L - ₹5L / yr
Python
Javascript
Java
HTML/CSS
Big Data
+2 more

The Assistant Professor in CSE will teach undergraduate and graduate courses, conduct independent and collaborative research, mentor students, and contribute to departmental and institutional service.

Read more
Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Pune, Chennai
0 - 1 yrs
₹10L - ₹20L / yr
Python
Object Oriented Programming (OOPs)
Javascript
Java
Data Structures
+1 more


About NxtWave


NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.

Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.

Know more:

🌐 NxtWave | NIAT

About the Role

As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.


Key Responsibilities

  • Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
  • Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
  • Mentor students in academic, career, and project development goals.
  • Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
  • Drive research-led content development, and contribute to innovation in teaching methodologies.
  • Support capstone projects, hackathons, and collaborative research opportunities with industry.
  • Foster a high-performance learning environment in classes of 70–100 students.
  • Collaborate with cross-functional teams for continuous student development and program quality.
  • Actively participate in faculty training, peer reviews, and academic audits.


Eligibility & Requirements

  • Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
  • Strong academic and research orientation, preferably with publications or project contributions.
  • Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
  • A deep commitment to education, student success, and continuous improvement.

Must-Have Skills

  • Expertise in Python, Java, JavaScript, and advanced programming paradigms.
  • Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
  • Excellent communication, classroom delivery, and presentation skills.
  • Familiarity with academic content tools like Google Slides, Sheets, Docs.
  • Passion for educating, mentoring, and shaping future developers.

Good to Have

  • Industry experience or consulting background in software development or research-based roles.
  • Proficiency in version control systems (e.g., Git) and agile methodologies.
  • Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
  • A drive to innovate in teaching, curriculum design, and student engagement.

Why Join Us?

  • Be at the forefront of shaping India’s tech education revolution.
  • Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
  • Competitive compensation with strong growth potential.
  • Create impact at scale by mentoring hundreds of future-ready tech leaders.


Read more
KJBN labs

at KJBN labs

2 candid answers
sakthi ganesh
Posted by sakthi ganesh
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Hadoop
Apache Kafka
Spark
Python
Java
+8 more

Senior Data Engineer Job Description

Overview

The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.

Key Responsibilities


• Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
• Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
• Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
• Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
• Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
• Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
• Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention (see the DAG sketch after this list).
• Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
• Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
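
As an illustration of the Automation bullet, a scheduled ingestion job often takes the shape of the Airflow DAG sketched below (Airflow 2.x syntax); the task bodies are placeholders for real ingestion and data-quality logic.

# Minimal Airflow 2.x DAG sketch: a daily extract -> validate chain.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull increments from source systems")  # placeholder

def validate():
    print("run row-count and null checks")        # placeholder

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",     # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    extract_task >> validate_task  # validation runs only after extraction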

Required Qualifications

• Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
• Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
• Technical Skills:
  - Proficiency in programming languages such as Python, Java, or Scala.
  - Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
  - Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
  - Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
  - Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
  - Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.

• Soft Skills:
  - Excellent problem-solving and analytical skills.
  - Strong communication and collaboration abilities.
  - Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
• Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.

Preferred Qualifications

• Experience with real-time data processing and streaming architectures.
• Familiarity with machine learning pipelines and MLOps practices.
• Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
• Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.

Work Environment

• Location: Hybrid/Remote/On-site (depending on company policy).
• Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
• Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the on-shore and the offshore teams.
  • Validate multiple reports against the databases by running medium to complex SQL queries (see the sketch after this list).
  • Develop a solid understanding of automation objects and integrations across various platforms/applications.
  • Act as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build and test cycle using CI/CD tools.
  • Be comfortable working in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
  • Establish processes and a tool set to maintain automation scripts and generate regular test reports.
  • Conduct peer reviews to provide feedback and ensure the test scripts are flawless.
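
The report-validation bullet typically reduces to comparing a report extract against the database of record. Below is a self-contained sketch that uses the standard-library sqlite3 module as a stand-in for the project's actual warehouse; the table and figures are hypothetical.

# Report-validation sketch: compare an aggregate from a report extract with
# the same aggregate computed in the database. sqlite3 stands in for the real
# warehouse; the table and figures are hypothetical.
import sqlite3

def validate_revenue(conn: sqlite3.Connection, reported_total: float) -> bool:
    (db_total,) = conn.execute(
        """
        SELECT ROUND(SUM(amount), 2)
        FROM transactions
        WHERE status = 'settled'
        """
    ).fetchone()
    if db_total != reported_total:
        print(f"MISMATCH: report={reported_total} db={db_total}")
        return False
    return True

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (amount REAL, status TEXT)")
    conn.executemany(
        "INSERT INTO transactions VALUES (?, ?)",
        [(100.50, "settled"), (49.50, "settled"), (10.00, "pending")],
    )
    print(validate_revenue(conn, reported_total=150.00))  # True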

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably on Databricks, paired with Python experience.
  • Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
  • Clear and crisp communication and commitment towards deliverables.
  • Experience in Big Data testing is an added advantage.
  • Knowledge of Spark, Scala, Hive/Impala, and Python is an added advantage.

Good to have skills:

  • Test automation using BDD/Cucumber/TestNG, combined with strong hands-on Java and Selenium experience, especially working experience in WebDriver.IO.
  • Ability to effectively articulate technical challenges and solutions
  • Work experience in qTest, Jira, WebDriver.IO


Read more
Robylon AI

at Robylon AI

2 candid answers
Listings Robylon
Posted by Listings Robylon
Bengaluru (Bangalore)
0 - 2 yrs
₹5L - ₹6L / yr
Python
Generative AI
Prompt engineering

Role Overview

This is a 20% technical, 80% non-technical role designed for individuals who can blend technical know-how with strong operational and communication skills. You’ll be the bridge between our product and the client’s operations team.


Key Responsibilities


  • Collaborate with clients to co-design SOPs for resolving support queries across channels (chat, ticket, voice)
  • Scope and plan each integration: gather technical and operational requirements and convert them into an executable timeline with measurable success metrics (e.g., coverage %, accuracy, CSAT)
  • Lead integration rollouts and post-launch success loops: monitor performance, debug issues, fine-tune prompts and workflows
  • Conduct quarterly “AI health-checks” and continuously improve system effectiveness
  • Troubleshoot production issues, replicate bugs, ship patches, and write clear root-cause analyses (RCAs)
  • Act as the customer’s voice internally, channel key insights to product and engineering teams


Must-Have Qualifications


  • Engineering degree is a must; Computer Science preferred
  • Past experience in coding and a sound understanding of APIs is preferred
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • Experience working in SaaS, customer success, implementation, or operations roles
  • Analytical mindset with the ability to make data-driven decisions



Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
Python
PySpark
Django
Flask
RESTful APIs
+3 more

Job title - Python developer

Exp – 4 to 6 years

Location – Pune/Mum/B’lore

 

PFB JD

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and Pyspark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP Preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.
Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
Python
AIML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI (see the sketch after this list).
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.
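
To make the API bullet concrete, here is a minimal FastAPI sketch that serves a scikit-learn model; the toy training data and feature names are hypothetical.

# Minimal model-serving sketch: a scikit-learn classifier behind a FastAPI
# endpoint. The toy training data and feature names are hypothetical.
# Run with: uvicorn app:app --reload
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression

# Toy stand-in for a properly trained, versioned model artifact.
model = LogisticRegression().fit(
    [[0.0, 1.0], [1.0, 3.0], [2.0, 5.0], [3.0, 7.0]], [0, 0, 1, 1]
)

app = FastAPI()

class Features(BaseModel):
    tenure_years: float
    monthly_spend: float

@app.post("/predict")
def predict(features: Features) -> dict:
    x = [[features.tenure_years, features.monthly_spend]]
    return {
        "label": int(model.predict(x)[0]),
        "probability": float(model.predict_proba(x)[0][1]),
    }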

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


Read more
Intellikart Ventures LLP
ramandeep intellikart
Posted by ramandeep intellikart
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹25L / yr
LangChain
LangGraph
Linux kernel
LLMs
Prompt engineering
+3 more

Job Summary:

We are hiring a Data Scientist – Gen AI with hands-on experience in developing Agentic AI applications using frameworks like LangChain, LangGraph, Semantic Kernel, or Microsoft Copilot. The ideal candidate will be proficient in Python, LLMs, and prompt engineering techniques such as RAG and Chain-of-Thought prompting.


Key Responsibilities:

  • Build and deploy Agent AI applications using LLM frameworks.
  • Apply advanced prompt engineering (Zero-Shot, Few-Shot, CoT); a small example follows this list.
  • Integrate Retrieval-Augmented Generation (RAG).
  • Develop scalable solutions in Python using NumPy, Pandas, TensorFlow/PyTorch.
  • Collaborate with teams to deliver business-aligned Gen AI solutions.
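
As a small illustration of the prompt-engineering bullet, the sketch below assembles a few-shot prompt with an optional chain-of-thought cue; it is pure Python, with the examples made up and the actual LLM call intentionally left out.

# Few-shot prompt-construction sketch. The examples are hypothetical; the
# call to an actual LLM is omitted since it depends on the stack.

EXAMPLES = [
    {"q": "Order arrived damaged", "a": "category: returns"},
    {"q": "How do I reset my password?", "a": "category: account"},
]

def build_prompt(query: str, chain_of_thought: bool = False) -> str:
    lines = ["Classify the support query into a category."]
    for ex in EXAMPLES:                      # few-shot demonstrations
        lines.append(f"Q: {ex['q']}\nA: {ex['a']}")
    lines.append(f"Q: {query}")
    if chain_of_thought:                     # CoT cue nudges step-by-step reasoning
        lines.append("A: Let's think step by step.")
    else:
        lines.append("A:")
    return "\n\n".join(lines)

if __name__ == "__main__":
    print(build_prompt("Where is my refund?", chain_of_thought=True))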


Must-Have Skills:

  • Experience with LangChain, LangGraph, or similar (priority given).
  • Strong understanding of LLMs, RAG, and prompt engineering.
  • Proficiency in Python and relevant ML libraries.


Nice-to-Have:

  • Wrapper API development for LLMs.
  • REST API integration within Agentic workflows.


Qualifications:

  • Bachelor’s/Master’s in CS, Data Science, AI, or related.
  • 4–7 years in AI/ML/Data Science, with 1–2 years in Gen AI/LLMs.
Read more
Edstellar.com

at Edstellar.com

2 candid answers
partha Sarathy
Posted by partha Sarathy
Bengaluru (Bangalore)
0 - 0 yrs
₹3L - ₹3L / yr
HTML/CSS
Javascript
Python
Git
Version Control
+3 more

Greetings from Edstellar

We are looking for a Vibe Coder at entry level.


Position Overview

We're seeking passionate fresh graduates who are natural Vibe Coders - developers who code with intuition, creativity, and genuine enthusiasm for building amazing applications. Perfect for recent grads who bring fresh energy and innovative thinking to development.


Key Responsibilities

Build dynamic web and mobile applications with creative flair

Code with passion and embrace experimental approaches

Learn and implement emerging technologies rapidly

Collaborate in our innovation-friendly environment

Prototype ideas and iterate with speed and creativity

Bring fresh perspectives to development challenges


Required Qualifications

Education: Bachelor's in Computer Science/IT or related field

Experience: Fresh graduate (0-1 years)


Technical Skills:

Solid programming fundamentals (any language)

Basic web development (HTML, CSS, JavaScript)

Understanding of application development concepts

Familiarity with Git/version control

Creative problem-solving mindset


Preferred:

Good understanding in Python, JavaScript frameworks, or modern tech stack

AI tool familiarity

Mobile development interest

Open source contributions


Vibe Coder DNA

Passionate about coding and building innovative apps

Thrives with creative freedom and flexible approaches

Loves experimenting with new technologies

Values innovation and thinking outside the box

Natural curiosity and eagerness to learn

Collaborative spirit with independent drive

Resilient and adaptable to change



Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
AnyWhareIndia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

·        6-7 years of IT development experience with min 3+ years hands-on experience in Snowflake

·        Strong experience in building/designing the data warehouse or data lake, and data mart end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyper scalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities (see the sketch after this list)

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience with Snowflake features, including newly released capabilities.

·        Should have good experience in Python/PySpark.

·        Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools such as Airflow

·        Should have a good understanding of ETL/ELT processes and ETL tools.
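
For reference, the Zero-Copy Cloning and Time Travel features mentioned above are exercised with plain SQL through the Snowflake Python connector, roughly as sketched below; the connection values and table name are placeholders.

# Snowflake sketch via the Python connector: zero-copy clone a table and
# query its state one hour ago with Time Travel. Values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="...",            # from a secrets manager in practice
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-Copy Cloning: metadata-only copy, no up-front data duplication cost.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time Travel: query the table as it looked 3600 seconds ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()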

Read more
Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
Amazon Web Services (AWS)

• 5+ years of IT development experience with a minimum of 3+ years of hands-on experience in Snowflake
• Strong experience in building/designing data warehouses or data lakes, with data mart end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyperscalers
• Strong experience with building productionized data ingestion and data pipelines in Snowflake
• Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities
• Should have good experience with Snowflake RBAC and data security
• Strong experience with Snowflake features, including newly released capabilities
• Should have good experience in Python/PySpark
• Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
• Should have experience/knowledge of orchestration and scheduling tools such as Airflow
• Should have a good understanding of ETL/ELT processes and ETL tools

Read more
IndArka Energy Pvt Ltd

at IndArka Energy Pvt Ltd

3 recruiters
Mita Hemant
Posted by Mita Hemant
Bengaluru (Bangalore)
3 - 4 yrs
₹18L - ₹20L / yr
Python
Django
Data Structures
Algorithms

About us

Arka Energy is focused on changing the paradigm on energy, creating innovative renewable energy solutions for residential customers. With its custom product design and an innovative approach to marketing the product solution, Arka aims to be a leading provider of energy solutions in the residential solar segment. Arka designs and develops end-to-end renewable energy solutions with teams in Bangalore and in the Bay Area.

The product is 3D simulation software used to replicate rooftops/commercial sites, place solar panels, and estimate solar energy generation.

What are we looking for?

·        As a backend developer you will be responsible for developing solutions that will enable Arka solutions to be easily adopted by customers.

·        Attention to detail and willingness to learn is a big part of this position.

·        Commitment to problem solving, and innovative design approaches are important.

Role and responsibilities

●       Develop cloud-based Python Django software products

●       Working closely with UX and Front-end Developers

●       Participating in architectural, design and product discussions

●       Designing and creating RESTful APIs for internal and partner consumption

●       Working in an agile environment with an excellent team of engineers

●       Own/maintain code everything from development to fixing bugs/issues.

●       Deliver clean, reusable high-quality code

●       Facilitate problem diagnosis and resolution for issues reported by Customers

●       Deliver to schedule and timelines based on an Agile/Scrum-based approach

●       Develop new features and ideas to make the product better and more user-centric.

●       Must be able to independently write code and test major features, as well as work jointly with other team members to deliver complex changes

●       Create algorithms from scratch and implement them in the software.

●       Code Review, End to End Unit Testing.

●       Guiding and monitoring Junior Engineers.



SKILL REQUIREMENTS

●       Solid database skills in a relational database (e.g., PostgreSQL, MySQL)

●       Knowledge of how to build and use RESTful APIs

●        Strong knowledge of version control (e.g., Git, SVN)

●        Experience deploying Python applications into production

●        Azure or Google cloud infrastructure knowledge is a plus

●       Strong drive to learn new technologies

●       Ability to learn new technologies quickly

●       Continuous look-out for new and creative solutions to implement new features or improve old ones

●       Data Structures, Algorithms, Django and Python


Good To have

·        Knowledge on GenAI Applications.


Key Benefits

·        Competitive development environment

·        Engagement into full scale systems development

·        Competitive Salary

·        Flexible working environment

·        Equity in an early-stage start-up

·        Patent Filing Bonuses

·        Health Insurance for Employee + Family


Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Hyderabad, Bengaluru (Bangalore), Pune
6 - 11 yrs
₹8L - ₹26L / yr
Data Science
Python
Large Language Models (LLM)
Natural Language Processing (NLP)

POSITION / TITLE: Data Science Lead

Location: Offshore – Hyderabad/Bangalore/Pune

Who are we looking for?

Individuals with 8+ years of experience implementing and managing data science projects. Excellent working knowledge of traditional machine learning and LLM techniques. 

The candidate must demonstrate the ability to navigate and advise on complex ML ecosystems from a model building and evaluation perspective. Experience in the NLP and chatbot domains is preferred.

We acknowledge the job market is blurring the line between data roles: while software skills are necessary, the emphasis of this position is on data science skills, not on data-, ML- nor software-engineering.

Responsibilities:

· Lead data science and machine learning projects, contributing to model development, optimization and evaluation. 

· Perform data cleaning, feature engineering, and exploratory data analysis.  

· Translate business requirements into technical solutions, document and communicate project progress, manage non-technical stakeholders.

· Collaborate with other DS and engineers to deliver projects.

Technical Skills – Must have:

· Experience in and understanding of the natural language processing (NLP) and large language model (LLM) landscape.

· Proficiency with Python for data analysis, supervised & unsupervised learning ML tasks.

· Ability to translate complex machine learning problem statements into specific deliverables and requirements.

· Should have worked with major cloud platforms such as AWS, Azure or GCP.

· Working knowledge of SQL and no-SQL databases.

· Ability to create data and ML pipelines for more efficient and repeatable data science projects using MLOps principles.

· Keep abreast of new tools, algorithms, and techniques in machine learning and work to implement them in the organization.

· Strong understanding of evaluation and monitoring metrics for machine learning projects.

Technical Skills – Good to have:

· Track record of getting ML models into production

· Experience building chatbots.

· Experience with closed and open source LLMs.

· Experience with frameworks and technologies like scikit-learn, BERT, langchain, autogen…

· Certifications or courses in data science.

Education:

· Master’s/Bachelors/PhD Degree in Computer Science, Engineering, Data Science, or a related field. 

Process Skills:

· Understanding of Agile and Scrum methodologies.

· Ability to follow SDLC processes and contribute to technical documentation.

Behavioral Skills :

· Self-motivated and capable of working independently with minimal management supervision.

· Well-developed design, analytical & problem-solving skills

· Excellent communication and interpersonal skills.  

· Excellent team player, able to work with virtual teams in several time zones.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Bhavya M
Posted by Bhavya M
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Chef
Python

Key Responsibilities:

· Lead the design and implementation of scalable infrastructure using IaC principles.

· Develop and manage configuration management tools primarily with Chef.

· Write and maintain automation scripts in Python to streamline infrastructure tasks (see the sketch after this list).

· Build, manage, and version infrastructure using Terraform.

· Collaborate with cloud architects and DevOps teams to ensure highly available, secure, and scalable systems.

· Provide guidance and mentorship to junior engineers.

· Monitor infrastructure performance and provide optimization recommendations.

· Ensure compliance with best practices for security, governance, and automation.

· Maintain and improve CI/CD pipelines with infrastructure integration.

· Support incident management, troubleshooting, and root cause analysis for infrastructure issues.
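
As one example of the Python automation bullet, the sketch below audits EC2 instances for a required tag using boto3; the tag key and region are hypothetical, and credentials resolve through the usual AWS configuration chain.

# Infrastructure-automation sketch: flag EC2 instances missing a required tag.
# The region and tag key are hypothetical; credentials resolve via the normal
# AWS configuration chain (env vars, profile, instance role).
import boto3

REQUIRED_TAG = "owner"  # hypothetical governance rule

def untagged_instances(region: str = "us-east-1") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(instance["InstanceId"])
    return missing

if __name__ == "__main__":
    for instance_id in untagged_instances():
        print(f"missing '{REQUIRED_TAG}' tag: {instance_id}")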


Required Skills & Experience:

· Strong hands-on experience in:

o Chef (Cookbooks, Recipes, Automation)

o Python (Scripting, automation tasks, REST APIs)

o Terraform (Modules, state management, deployments)

· Experience in AWS services (EC2, VPC, IAM, S3, etc.)

· Familiarity with Windows administration and automation.

· Solid understanding of CI/CD processes, infrastructure lifecycle, and Git-based workflow

Read more
Moative

at Moative

3 candid answers
Eman Khan
Posted by Eman Khan
Chennai, Bengaluru (Bangalore)
1 - 6 yrs
₹15L - ₹30L / yr
MLOps
MLFlow
kubeflow
Windows Azure
Machine Learning (ML)
+4 more

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Role

We seek experienced ML/AI professionals with strong backgrounds in computer science, software engineering, or related fields to join our Azure-focused MLOps team. If you’re passionate about deploying complex machine learning models in real-world settings, bridging the gap between research and production, and working on high-impact projects, this role is for you.


Work you’ll do

As an operations engineer, you’ll oversee the entire ML lifecycle on Azure—spanning initial proofs-of-concept to large-scale production deployments. You’ll build and maintain automated training, validation, and deployment pipelines using Azure DevOps, Azure ML, and related services, ensuring models are continuously monitored, optimized for performance, and cost-effective. By integrating MLOps practices such as MLflow and CI/CD, you’ll drive rapid iteration and experimentation. In close collaboration with senior ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade ML solutions that directly impact business outcomes.
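
To give a flavour of the MLflow practice mentioned above, here is a minimal tracking sketch that logs parameters, a metric, and the model artifact for one run; the toy data, model, and metric are illustrative only.

# Minimal MLflow tracking sketch: log params, a metric, and the model
# artifact for one training run. The toy data, model, and metric are
# illustrative, not part of the role description.
import mlflow
import mlflow.sklearn
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

X = [[1.0], [2.0], [3.0], [4.0]]
y = [1.1, 1.9, 3.2, 3.9]

with mlflow.start_run(run_name="ridge-baseline"):
    alpha = 0.5
    model = Ridge(alpha=alpha).fit(X, y)

    mlflow.log_param("alpha", alpha)             # hyperparameters
    mse = mean_squared_error(y, model.predict(X))
    mlflow.log_metric("train_mse", mse)          # evaluation metric
    mlflow.sklearn.log_model(model, "model")     # versioned model artifact

    # In CI/CD, a gate here could compare train_mse against the current
    # production run before promoting the model.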


Responsibilities

  • ML-focused DevOps: Set up robust CI/CD pipelines with a strong emphasis on model versioning, automated testing, and advanced deployment strategies on Azure.
  • Monitoring & Maintenance: Track and optimize the performance of deployed models through live metrics, alerts, and iterative improvements.
  • Automation: Eliminate repetitive tasks around data preparation, model retraining, and inference by leveraging scripting and infrastructure as code (e.g., Terraform, ARM templates).
  • Security & Reliability: Implement best practices for securing ML workflows on Azure, including identity/access management, container security, and data encryption.
  • Collaboration: Work closely with the data science teams to ensure model performance is within agreed SLAs, both for training and inference.


Skills & Requirements

  • 2+ years of hands-on programming experience with Python (PySpark or Scala optional).
  • Solid knowledge of Azure cloud services (Azure ML, Azure DevOps, ACI/AKS).
  • Practical experience with DevOps concepts: CI/CD, containerization (Docker, Kubernetes), infrastructure as code (Terraform, ARM templates).
  • Fundamental understanding of MLOps: MLflow or similar frameworks for tracking and versioning.
  • Familiarity with machine learning frameworks (TensorFlow, PyTorch, XGBoost) and how to operationalize them in production.
  • Broad understanding of data structures and data engineering.


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term, while acting with urgency. Our ethos is rooted in innovation, efficiency and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless.


Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps on purpose, unless we constantly question it. We are deliberate about committing to rituals that take time away from the actual work. We truly believe that a meeting that could be an email, should be an email and you don’t need a person with the highest title to say that loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work is of interest to you, please apply here. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Read more
Potentiam
karishma raj
Posted by karishma raj
Bengaluru (Bangalore)
6 - 12 yrs
₹22L - ₹30L / yr
Python
Django

About Potentiam

Potentiam helps SME companies build world-class offshore teams. Our model is our locations and your dedicated staff under your control. Potentiam have offices in Iasi in Romania, Bangalore and Cape Town, home to large liquid pools of offshore talent working for international companies. Potentiam's management team have had over 15 years' experience in building offshore teams, and have specialist functional expertise to support the transition offshore in technology, finance, operations, engineering, digital marketing, technology and analytics. For decades corporations' scale has enabled them to benefit from the cost and skills advantage of offshore operations. Now SME companies can enjoy a similar benefit through Potentiam without any upfront investment.


Location : Bangalore ( Hybrid)


Experience - 6+ Years



Professional Experience:

  • Experience using a Python backend web framework (like Django, Flask or FastAPI)
  • In particular, experience building performant and reliable APIs and integrations
  • Competency using SQL and ORMs
  • Some experience with frontend web development would be a bonus using a JavaScript framework (such as Vue.js or React)
  • Understanding of some of the following: Django Rest Framework, PostgreSQL, Celery, Docker, nginx, AWS

Benefits and Perks

  • Health Insurance
  • Referral Bonus
  • Performance Bonus
  • Flexible Working options


Job Types: Full-time, Permanent


Read more
Potentiam
Dipanjan Das
Posted by Dipanjan Das
Bengaluru (Bangalore)
5 - 10 yrs
₹25L - ₹35L / yr
Python
Machine learning models
NumPy
Docker

● Proven experience in training, evaluating and deploying machine learning models

● Solid understanding of data science and machine learning concepts

● Experience with machine learning / data engineering tech in Python (such as NumPy, PyTorch, pandas/polars, Airflow, etc.)

● Experience developing data products using large language models, prompt engineering, and model evaluation.

● Experience with web services and programming (such as Python, docker, databases etc.)  

● Understanding of some of the following: FastAPI, PostgreSQL, Celery, Docker, AWS, Modal, git, continuous integration. 

Read more
Codemonk

at Codemonk

4 candid answers
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
2yrs+
Upto ₹12L / yr (Varies)
Python
Django
FastAPI
SQL
NOSQL Databases
+3 more

About Role

We are seeking a skilled Backend Engineer with 2+ years of experience to join our dynamic team, focusing on building scalable web applications using Python frameworks (Django/FastAPI) and cloud technologies. You'll be instrumental in developing and maintaining our cloud-native backend services.


Responsibilities:

  1. Design and develop scalable backend services using Django and FastAPI
  2. Create and maintain RESTful APIs
  3. Implement efficient database schemas and optimize queries
  4. Implement containerisation using Docker and container orchestration
  5. Design and implement cloud-native solutions using microservices architecture
  6. Participate in technical design discussions, code reviews and maintain coding standards
  7. Document technical specifications and APIs
  8. Collaborate with cross-functional teams to gather requirements, prioritise tasks, and contribute to project completion.

Requirements:

  1. Experience with Django and/or Fast-API (2+ years)
  2. Proficiency in SQL and ORM frameworks
  3. Docker containerisation and orchestration
  4. Proficiency in shell scripting (Bash/PowerShell)
  5. Understanding of microservices architecture
  6. Experience building serverless backends
  7. Knowledge of deployment and debugging on cloud platforms (AWS/Azure)
Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
Mirorin

at Mirorin

2 candid answers
Indrani Dutta
Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 8 yrs
₹6L - ₹15L / yr
SQL
Python
Data Analytics
Business Intelligence (BI)

Role Overview

We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.

You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.

 

Key Responsibilities

·        Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).

·        Build dashboards and reports to track KPIs, user behaviour, and marketing performance.

·        Collaborate with product, marketing, and customer teams to uncover actionable insights.

·        Support experiments, A/B testing, and cohort analysis to drive growth and retention (see the sketch after this list).

·        Assist in documentation and communication of findings to technical and non-technical teams.

·        Work with the data team to enhance personalization and AI features (optional).
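
As a small illustration of the A/B-testing bullet, the sketch below runs a two-proportion z-test on conversion counts using only the Python standard library; the counts are made up.

# A/B-test sketch: two-proportion z-test on conversion counts, stdlib only.
# The counts below are made up for illustration.
from math import erfc, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                      # two-sided p-value
    return z, p_value

if __name__ == "__main__":
    z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
    print(f"z={z:.2f}, p={p:.4f}")  # p < 0.05 would suggest variant B differs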

 

Required Qualifications

·        Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.

·        2 – 4 years of experience in data analysis or business intelligence.

·        Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).

·        Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.)

·        Ability to translate complex data into simple visual stories and clear recommendations.

·        Strong attention to detail and a mindset for experimentation.

 

Preferred (Not Mandatory)

·        Exposure to GenAI, LLMs (e.g., OpenAI, HuggingFace), or NLP concepts.

·        Experience working with healthcare, wellness, or e-commerce datasets.

·        Familiarity with REST APIs, JSON structures, or chatbot systems.

·        Interest in building tools that impact women’s health and wellness. 


Why Join Us?

·        Be part of a high-growth startup tackling a real need in women’s healthcare.

·        Work with a passionate, purpose-driven team.

·        Opportunity to grow into GenAI/ML-focused roles as we scale.

·        Competitive salary and career progression


Best Regards,

Indrani Dutta

MIROR THERAPEUTICS PRIVATE LIMITED

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Bengaluru (Bangalore)
8 - 12 yrs
Best in industry
Python
Terraform
Chef

Job Summary:

The Lead IaC Engineer will design, implement, automate, and maintain infrastructure across on-premises and cloud environments. The role requires strong hands-on expertise in Chef, Python, and Terraform, plus some AWS and Windows administration knowledge.


8-12 years of experience

Primary Skills – Chef, Python, and Terraform

Secondary – AWS & Windows admin (Cloud is not mandatory)

Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹10L - ₹24L / yr
Python
FastAPI
Flask
API management
RESTful APIs
+8 more

Job Title : Python Developer – API Integration & AWS Deployment

Experience : 5+ Years

Location : Bangalore

Work Mode : Onsite


Job Overview :

We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.

The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.


Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.


Key Responsibilities :

Python Development & API Integration :

  • Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E (see the sketch after this section).
  • Automate simulations and workflows using the PSS®E Python API (psspy).
  • Implement robust bulk case processing, result extraction, and automated reporting systems.
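
As a rough sketch of the responsibilities above, the snippet below wraps a power-flow run behind a FastAPI endpoint. The psspy calls are indicative only (exact function names and arguments depend on the installed PSS®E version), and the case path is hypothetical.

# Sketch of a FastAPI wrapper around a PSS(R)E power-flow run. The psspy
# calls are indicative only; exact functions and arguments depend on the
# installed PSS(R)E version. The case path is hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class SimulationRequest(BaseModel):
    case_path: str  # e.g. "/cases/savnw.sav" (hypothetical)

@app.post("/simulations/powerflow")
def run_powerflow(req: SimulationRequest) -> dict:
    try:
        import psspy                 # available only where PSS(R)E is installed
        psspy.psseinit(50)           # initialize for up to 50 buses (assumed call)
        psspy.case(req.case_path)    # load the saved case
        ierr = psspy.fnsl()          # full Newton-Raphson power-flow solution
        if ierr != 0:
            raise RuntimeError(f"fnsl returned error code {ierr}")
    except Exception as exc:
        raise HTTPException(status_code=500, detail=str(exc))
    return {"status": "solved", "case": req.case_path}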


AWS Cloud Deployment :

  • Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
  • Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
  • Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.


Required Skills :

  • 5+ Years of professional experience in Python development.
  • Hands-on experience with RESTful API development (FastAPI/Flask).
  • Solid experience working with PSS®E and its psspy Python API.
  • Strong understanding of AWS services, deployment, and best practices.
  • Proficiency in automation, scripting, and report generation.
  • Knowledge of cloud security and monitoring tools like IAM and CloudWatch.

Good to Have :

  • Experience in power system simulation and electrical engineering concepts.
  • Familiarity with CI/CD tools for AWS deployments.
Read more