
50+ Python Jobs in Chennai | Python Job openings in Chennai

Apply to 50+ Python Jobs in Chennai on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

BOSCLE

Posted by Rhea Fernandes
Chennai
2 - 3 yrs
₹4L - ₹6L / yr
Laravel
Tailwind CSS
HTML/CSS
JavaScript
React.js
+9 more

Must-Have Skills


Frontend

  • HTML, CSS, JavaScript
  • React.js + Redux Toolkit
  • Tailwind CSS

Backend (Primary)

  • JavaScript with Express.js or Koa or Fastify

Database

  • PostgreSQL (or another SQL database)

Good-to-Have Skills

  • Frontend: Material UI, Shadcn UI, Zustand
  • Chart libraries: Recharts, Chart.js
  • PDF libraries: jsPDF, React PDF

Alternate Backend Option (Secondary)

Python with Django, Flask, or FastAPI

(JavaScript backend experience is preferred; Python is optional.)


Who You Are

  • You can take a Figma or wireframe and turn it into a flawless, functional product.
  • You write maintainable, modular code that others can understand.
  • You love building features that look good and just work.
  • You’re self-driven, proactive, and okay with working in a startup environment where speed matters.

You Should Have:

  • 2+ years of hands-on experience in Laravel (PHP) / Node.js / Python
  • Strong command over React.js, REST APIs, and GitHub version control
  • Ability to write clean, modular, and maintainable code
  • Experience turning Figma/wireframes into production-ready products
  • Problem-solving mindset — thinking like a founder, not just a developer
  • Familiarity with Tailwind CSS (must-have)

Bonus: Node.js, DevOps setups, or Angular


Bonus If:

  • Experience in SaaS product development
  • Knowledge of CI/CD pipelines, Docker, or basic DevOps
Resulticks
Posted by Sagadevan Ramamoorthy
Chennai
3 - 5 yrs
₹8L - ₹13L / yr
Python
pandas
NumPy
Flask

What you’ll do here:


  • Develop and maintain software applications using Python
  • Collaborate with cross-functional teams to define software requirements and design specifications
  • Conduct code reviews and provide constructive feedback to team members
  • Troubleshoot and debug software issues, identify root causes, and implement effective solutions
  • Contribute to the design and architecture of software systems
  • Perform unit testing and integration testing to ensure software quality and reliability
  • Keep up-to-date with the latest trends and best practices in software development
  • Create and maintain detailed technical documentation for system designs, processes, and applications
  • Mentor junior developers and provide technical guidance to ensure the delivery of high-quality solutions.
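The unit-testing responsibility above can be sketched with a minimal, pytest-style example. The function under test, `normalize_scores`, and its values are hypothetical, chosen only to show the shape of such tests:

```python
def normalize_scores(scores):
    """Scale a list of numeric scores into the 0-1 range."""
    if not scores:
        return []
    lo, hi = min(scores), max(scores)
    if lo == hi:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

# pytest-style unit tests: each function asserts one behaviour
def test_scales_to_unit_range():
    assert normalize_scores([10, 20, 30]) == [0.0, 0.5, 1.0]

def test_empty_input_returns_empty_list():
    assert normalize_scores([]) == []

def test_constant_input_maps_to_zero():
    assert normalize_scores([5, 5]) == [0.0, 0.0]

# run the checks directly when no test runner is available
test_scales_to_unit_range()
test_empty_input_returns_empty_list()
test_constant_input_maps_to_zero()
```

Under pytest, each `test_*` function would be discovered and run automatically; the direct calls at the end just make the sketch self-checking.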


What you will need to thrive:


  • Bachelor's degree in Computer Science or a related field
  • 3+ years of proven experience as a Python Engineer or in a similar role
  • Strong understanding of relational databases such as MySQL, as well as NoSQL data stores
  • Experience with software development methodologies and best practices
  • Solid knowledge of relational databases and SQL
  • Exposure to front-end technologies such as JavaScript and React
  • Flexibility to adapt to changing priorities and handle multiple tasks simultaneously
  • Proven experience in mentoring junior developers and fostering a culture of continuous learning.
  • Attention to detail and a commitment to delivering high-quality software solutions
Pluginlive

Posted by Harsha Saggi
Chennai, Mumbai
4 - 6 yrs
₹10L - ₹20L / yr
Python
SQL
NoSQL Databases
Data architecture
Data modeling
+7 more

Role Overview:

We are seeking a talented and experienced Data Architect with strong data visualization capabilities to join our dynamic team in Mumbai. As a Data Architect, you will be responsible for designing, building, and managing our data infrastructure, ensuring its reliability, scalability, and performance. You will also play a crucial role in transforming complex data into insightful visualizations that drive business decisions. This role requires a deep understanding of data modeling, database technologies (particularly Oracle Cloud), data warehousing principles, and proficiency in data manipulation and visualization tools, including Python and SQL.


Responsibilities:

  • Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
  • Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
  • Define data governance policies and procedures to ensure data quality, security, and compliance.
  • Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
  • Develop and execute data migration strategies to Oracle Cloud.
  • Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
  • Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
  • Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
  • Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
  • Ensure the performance and reliability of data visualization dashboards and reports.
  • Stay up-to-date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
  • Troubleshoot data-related issues and provide timely resolutions.
  • Document data architectures, data flows, and data visualization solutions.
  • Participate in the evaluation and selection of new data technologies and tools.
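A cleansing-and-transformation step of the kind described above might look like the following pandas sketch. The dataset, column names, and cleaning rules are invented for illustration:

```python
import pandas as pd

# Hypothetical raw sales extract with messy labels and a bad numeric value
raw = pd.DataFrame({
    "region": ["North", "north ", None, "South"],
    "amount": ["1200", "850", "430", "x"],
})

df = raw.copy()
df["region"] = df["region"].str.strip().str.title()          # normalize labels
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # bad values -> NaN
df = df.dropna()                                             # drop incomplete rows

# Aggregate for a downstream visualization or report
summary = df.groupby("region", as_index=False)["amount"].sum()
print(summary)
```

In a real pipeline, the same pattern scales up: `read_sql`/`read_parquet` in, vectorized cleaning, then an aggregate handed to the visualization layer.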


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
  • Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role. 

  • Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
  • Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
  • Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
  • Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
  • Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
  • Excellent analytical and problem-solving skills with the ability to interpret complex data and translate it into actionable insights. 
  • Strong communication and presentation skills, with the ability to effectively communicate technical concepts to non-technical audiences. 
  • Experience with data governance and data quality principles.
  • Familiarity with agile development methodologies.
  • Ability to work independently and collaboratively within a team environment.

Application Link- https://forms.gle/km7n2WipJhC2Lj2r5

Moative

Posted by Eman Khan
Chennai
1 - 3 yrs
Up to ₹20L / yr (varies)
Python
Scala
Machine Learning (ML)
Artificial Intelligence (AI)
Generative AI
+13 more

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Work you’ll do

As an AI Engineer at Moative, you will be at the forefront of applying cutting-edge AI to solve real-world problems. You will be instrumental in designing and developing intelligent software solutions, leveraging the power of foundation models to automate and optimize critical workflows. Collaborating closely with domain experts, data scientists, and ML engineers, you will integrate advanced ML and AI technologies into both existing and new systems. This role offers a unique opportunity to explore innovative ideas, experiment with the latest foundation models, and build impactful products that directly enhance the lives of citizens by transforming how government services are delivered. You'll be working on challenging and impactful projects that move the needle on traditionally difficult-to-automate processes.


Responsibilities

  • Utilize and adapt foundation models, particularly in vision and data extraction, as the core building blocks for developing impactful products aimed at improving government service delivery. This includes prompt engineering, fine-tuning, and evaluating model performance
  • Architect, build, and deploy intelligent AI agent-driven workflows that automate and optimize key processes within government service delivery. This encompasses the full lifecycle from conceptualization and design to implementation and monitoring
  • Contribute directly to enhancing our model evaluation and monitoring methodologies to ensure robust and reliable system performance. Proactively identify areas for improvement and implement solutions to optimize model accuracy and efficiency
  • Continuously learn and adapt to the rapidly evolving landscape of AI and foundation models, exploring new techniques and technologies to enhance our capabilities and solutions
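The model-evaluation work described in these responsibilities can be illustrated with a tiny exact-match harness over extraction outputs. The field names, records, and the choice of exact match as the metric are illustrative assumptions, not Moative's actual methodology:

```python
def exact_match_accuracy(predictions, references, fields):
    """Fraction of records whose extracted fields all match the reference."""
    hits = 0
    for pred, ref in zip(predictions, references):
        if all(pred.get(f) == ref.get(f) for f in fields):
            hits += 1
    return hits / len(references) if references else 0.0

# Hypothetical data-extraction outputs from a vision/LLM pipeline
preds = [{"name": "A. Kumar", "id": "TN-104"}, {"name": "R. Devi", "id": "TN-205"}]
refs  = [{"name": "A. Kumar", "id": "TN-104"}, {"name": "R. Devi", "id": "TN-209"}]

score = exact_match_accuracy(preds, refs, fields=["name", "id"])
print(f"exact-match accuracy: {score:.2f}")
```

Real evaluation suites layer per-field metrics, fuzzy matching, and regression tracking on top of this basic loop.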


Who you are

You are a passionate and results-oriented engineer who is driven by the potential of AI/ML to revolutionize processes, enhance products, and ultimately improve user experiences. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration. 


You are adept at working with real-world, often imperfect data, and have a proven ability to develop, refine, and deploy AI/ML models into production in a cost-effective and scalable manner. You are excited by the prospect of directly impacting government services and making a positive difference in the lives of citizens.


Skills & Requirements

  • 3+ years of experience in programming languages such as Python or Scala
  • Proficient knowledge of cloud platforms (e.g., AWS, Azure, GCP), containerization, and DevOps tooling (Docker, Kubernetes)
  • Experience tuning and deploying foundation models, particularly for vision tasks and data extraction
  • Excellent analytical and problem-solving skills with the ability to break down complex challenges into actionable steps
  • Strong written and verbal communication skills, with the ability to effectively articulate technical concepts to both technical and non-technical audiences


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless.


Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creep happens unless we constantly question it. We are deliberate about which rituals we commit to, because rituals take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don’t need the person with the highest title to say that out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work are of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers. 


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

VyTCDC
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Mumbai, Pune, Noida
4 - 6 yrs
₹3L - ₹21L / yr
AWS Data Engineer
Amazon Web Services (AWS)
Python
PySpark
databricks
+1 more

 Key Responsibilities

  • Design and implement ETL/ELT pipelines using Databricks, PySpark, and AWS Glue
  • Develop and maintain scalable data architectures on AWS (S3, EMR, Lambda, Redshift, RDS)
  • Perform data wrangling, cleansing, and transformation using Python and SQL
  • Collaborate with data scientists to integrate Generative AI models into analytics workflows
  • Build dashboards and reports to visualize insights using tools like Power BI or Tableau
  • Ensure data quality, governance, and security across all data assets
  • Optimize performance of data pipelines and troubleshoot bottlenecks
  • Work closely with stakeholders to understand data requirements and deliver actionable insights
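The ETL/ELT shape described in these responsibilities can be sketched in plain Python. A production pipeline would use PySpark or AWS Glue against S3 and Redshift; the record layout and stage bodies here are hypothetical stand-ins:

```python
def extract(rows):
    """Extract: in production this would read from S3 or a Glue catalog."""
    return list(rows)

def transform(rows):
    """Transform: cleanse and reshape records, dropping incomplete ones."""
    out = []
    for r in rows:
        if r.get("amount") is None:
            continue  # data-quality rule: skip records missing an amount
        out.append({"city": r["city"].strip().title(), "amount": float(r["amount"])})
    return out

def load(rows, sink):
    """Load: in production this would write to Redshift or a data lake."""
    sink.extend(rows)
    return len(rows)

source = [{"city": " chennai", "amount": "42.5"}, {"city": "Pune", "amount": None}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

The same extract-transform-load decomposition carries over directly to a PySpark job, where each stage becomes a DataFrame operation.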

🧪 Required Skills

Skill Area → Tools & Technologies:

  • Cloud Platforms: AWS (S3, Lambda, Glue, EMR, Redshift)
  • Big Data: Databricks, Apache Spark, PySpark
  • Programming: Python, SQL
  • Data Engineering: ETL/ELT, Data Lakes, Data Warehousing
  • Analytics: Data Modeling, Visualization, BI Reporting
  • Gen AI Integration: OpenAI, Hugging Face, LangChain (preferred)
  • DevOps (Bonus): Git, Jenkins, Terraform, Docker

📚 Qualifications

  • Bachelor's or Master’s degree in Computer Science, Data Science, or related field
  • 3+ years of experience in data engineering or data analytics
  • Hands-on experience with Databricks, PySpark, and AWS
  • Familiarity with Generative AI tools and frameworks is a strong plus
  • Strong problem-solving and communication skills

🌟 Preferred Traits

  • Analytical mindset with attention to detail
  • Passion for data and emerging technologies
  • Ability to work independently and in cross-functional teams
  • Eagerness to learn and adapt in a fast-paced environment


datamark bpo service Pvt Ltd
Chennai
9 - 16 yrs
₹16L - ₹22L / yr
C#
SQL
DevOps
React.js
ASP.NET MVC
+15 more

Technical Lead

The ideal candidate should possess the following qualifications:

  • Education: Bachelor's degree in Computer Science, Software Engineering, or a related field.
  • Experience: 9+ years in software development with a proven track record of delivering scalable applications.
  • Leadership Skills: 4+ years of experience in a technical leadership role, demonstrating strong mentoring abilities.
  • Lead and mentor a team of software developers and validation engineers.
  • Technical Skills: Proficiency in programming languages and technologies such as C#, React.js, SQL, MySQL, JavaScript, Web API, .NET, or Python, along with the frameworks and tools used in software development.
  • General working knowledge of Selenium to support current business automation tools and future automation requirements.
  • General working knowledge of PHP is desired to support current legacy applications, which are on the roadmap for future modernization.
  • Strong understanding of the software development lifecycle (SDLC).
  • Experience with agile methodologies (Scrum/Kanban or similar).
  • Knowledge of version control systems (Git or similar).
  • Development Methodologies: Experience with Agile development methodologies and experience with CI/CD pipelines.
  • Problem-Solving Skills: Strong analytical and problem-solving abilities that enable the identification of complex technical issues.
  • Collaboration: Excellent communication and collaboration skills, with the ability to work effectively within a team environment.
  • Innovation: A passion for technology and innovation, with a keen interest in exploring new technologies to find the best solutions.


Intain Technologies

Posted by Nikita Sinha
Chennai
3 - 7 yrs
Up to ₹16L / yr (varies)
Python
Artificial Intelligence (AI)
React.js
Angular (2+)
MongoDB
+1 more

What You’ll Do

  • Build & tune models: embeddings, transformers, retrieval pipelines, evaluation frameworks.
  • Architect Python services (FastAPI/Flask) to embed ML/LLM workflows end-to-end.
  • Translate AI research into production features for data extraction, document reasoning, and risk analytics.
  • Own the full user flow: back-end → front-end (React/TS) → CI/CD on Azure & Docker.
  • Leverage AI coding tools (Copilot, Cursor, Jules) to meet our 1 dev = 4 devs productivity bar.
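The retrieval side of the pipelines above can be sketched with cosine similarity over toy vectors. In a real system the embeddings would come from a Hugging Face model and the index from a vector database; the document names and vectors here are invented:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy document embeddings (a real pipeline would use an embedding model)
index = {
    "loan_tape.pdf":  [0.9, 0.1, 0.0],
    "waterfall.docx": [0.1, 0.8, 0.2],
}

def retrieve(query_vec, index, top_k=1):
    """Rank indexed documents by similarity to the query embedding."""
    ranked = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [doc for doc, _ in ranked[:top_k]]

best = retrieve([0.85, 0.2, 0.0], index)
```

A RAG workflow then feeds the retrieved documents into the LLM prompt as grounding context.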


Core Tech Stack:

  • Primary:

Python · FastAPI/Flask · Pandas · SQL/NoSQL · Hugging Face · LangChain/RAG · REST/GraphQL · Azure · Docker

  • Bonus:

React.js · Vector Databases · Kubernetes


You Bring:

  • Proven track record shipping Python features and training/serving ML or LLM models.
  • Comfort reading research papers/blogs, prototyping ideas, and measuring model performance.
  • 360° product mindset: tests, reviews, secure code, quick iterations.
  • Strong ownership and output focus — impact beats years of experience.

Why Join Intain?

  • Small, expert team where your code and models hit production fast.
  • Work on real-world AI problems powering billions in structured-finance transactions.
  • Compensation & ESOPs tied directly to the value you ship.

📍 About Us

Intain is transforming structured finance using AI — from data ingestion to risk analytics. Our platform, powered by IntainAI and Ida, helps institutions manage and scale transactions seamlessly.

Talent Pro
Posted by Mayank choudhary
Chennai
5 - 8 yrs
₹20L - ₹30L / yr
Python
Java
Immediate joiner

Basic Qualifications :

● Experience: 4+ years.

● Hands-on development experience with a broad mix of languages such as Java, Python, JavaScript, etc.

● Server-side development experience, mainly in Java (Python and Node.js are also considered).

● UI development experience in ReactJS, AngularJS, PolymerJS, EmberJS, jQuery, etc., is good to have.

● Passion for software engineering and following best coding practices.

● Strong problem-solving and communication skills.

Talent Pro
Posted by Mayank choudhary
Chennai
4 - 8 yrs
₹12L - ₹20L / yr
Python
Java
Immediate joiner

Basic Qualifications :

● Experience: 4+ years.

● Hands-on development experience with a broad mix of languages such as Java, Python, JavaScript, etc.

● Server-side development experience, mainly in Java (Python and Node.js are also considered).

● UI development experience in ReactJS, AngularJS, PolymerJS, EmberJS, jQuery, etc., is good to have.

● Passion for software engineering and following best coding practices.

● Strong problem-solving and communication skills.


Nice to have Qualifications :

● Product and customer-centric mindset.

● Great OO skills, including design patterns.

● Experience with devops, continuous integration & deployment.

● Exposure to big data technologies, Machine Learning and NLP will be a plus.

Umanist India
Chennai
7 - 8 yrs
₹21L - ₹22L / yr
Google Cloud Platform (GCP)
Machine Learning (ML)
Python

Job Title: Software Engineer Consultant/Expert 34192 

Location: Chennai

Work Type: Onsite

Notice Period: Immediate joiners only, or candidates serving notice of up to 30 days.


Position Description:

  • Candidate with strong Python experience.
  • Full-stack development in GCP with end-to-end deployment and MLOps; hands-on experience across front end, back end, and MLOps.
  • This is a Tech Anchor role.

Experience Required:

  • 7+ years
Umanist India
Posted by Prince Tiwari
Chennai
5 - 6 yrs
₹20L - ₹21L / yr
AngularJS (1.x)
React.js
Python
Java
Spring Boot

Key Responsibilities: 34249 

  • Feature Development: Design, develop, and maintain new features and enhancements across the stack.
  • Front-End: Build intuitive, responsive UIs using Angular or React.
  • Back-End: Develop scalable APIs and services using Python (preferred), Java/Spring, or Node.js.
  • Cloud Deployment: Deploy and manage applications on Google Cloud Platform (GCP) — familiarity with services like App Engine, Cloud Functions, Kubernetes is expected.
  • Performance Tuning: Identify and optimize performance bottlenecks.
  • Code Quality: Participate in code reviews and maintain high standards through unit testing and automation.
  • DevOps & CI/CD: Collaborate on deployment pipelines using Tekton, Terraform, and other DevOps tools.
  • Cross-Functional Collaboration: Work closely with Product Managers, UI/UX Designers, and fellow Engineers in an agile environment.

Must-Have Skills:

  • Strong development expertise in Python (preferred), Angular, and GCP
  • Understanding of DevOps practices
  • Experience with SDLC, agile methodologies, and unit testing

Good to Have (Nice-to-Haves):

  • Hands-on experience with:
  • Tekton, Terraform, CI/CD pipelines
  • Large Language Models (LLMs) integration
  • AWS/Azure (in addition to GCP)
  • Contributions to open-source projects
  • Familiarity with API design and microservices architecture

Educational Qualification:

  • Required: Bachelor’s Degree in Computer Science, Engineering, or related discipline




Deqode

Posted by Sneha Jain
Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Indore, Jaipur, Kolkata, Chennai, Bengaluru (Bangalore)
3.5 - 7 yrs
₹8L - ₹13L / yr
AWS Lambda
Python
Microservices
Amazon EC2

We are seeking a highly skilled and motivated Python Developer with hands-on experience in AWS cloud services (Lambda, API Gateway, EC2), microservices architecture, PostgreSQL, and Docker. The ideal candidate will be responsible for designing, developing, deploying, and maintaining scalable backend services and APIs, with a strong emphasis on cloud-native solutions and containerized environments.


Key Responsibilities:

  • Develop and maintain scalable backend services using Python (Flask, FastAPI, or Django).
  • Design and deploy serverless applications using AWS Lambda and API Gateway.
  • Build and manage RESTful APIs and microservices.
  • Implement CI/CD pipelines for efficient and secure deployments.
  • Work with Docker to containerize applications and manage container lifecycles.
  • Develop and manage infrastructure on AWS (including EC2, IAM, S3, and other related services).
  • Design efficient database schemas and write optimized SQL queries for PostgreSQL.
  • Collaborate with DevOps, front-end developers, and product managers for end-to-end delivery.
  • Write unit, integration, and performance tests to ensure code reliability and robustness.
  • Monitor, troubleshoot, and optimize application performance in production environments.
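The serverless pattern above (Lambda behind API Gateway) can be sketched as a minimal handler. The event follows the API Gateway proxy-integration shape; the payload field `name` is a hypothetical example:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler behind an API Gateway proxy integration."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Invoked locally the same way API Gateway would call it
resp = lambda_handler({"body": json.dumps({"name": "Chennai"})}, None)
```

Calling the handler directly like this is also how unit tests for Lambda functions are typically written, with the deployed function wired up via API Gateway.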


Required Skills:

  • Strong proficiency in Python and Python-based web frameworks.
  • Experience with AWS services: Lambda, API Gateway, EC2, S3, CloudWatch.
  • Sound knowledge of microservices architecture and asynchronous programming.
  • Proficiency with PostgreSQL, including schema design and query optimization.
  • Hands-on experience with Docker and containerized deployments.
  • Understanding of CI/CD practices and tools like GitHub Actions, Jenkins, or CodePipeline.
  • Familiarity with API documentation tools (Swagger/OpenAPI).
  • Version control with Git.


Greatify
Posted by Ciline Sanjanyaa
Chennai
2 - 5 yrs
₹4L - ₹10L / yr
Playwright
Python

ABOUT THE JOB: 

Job Title: QA Automation Specialist 

Location: Teynampet, Chennai 

Job Type: Full-time 

Company: Gigadesk Technologies Pvt. Ltd. [Greatify.ai] 


COMPANY DESCRIPTION:

At Greatify.ai, we are transforming educational institutions with cutting-edge AI-powered solutions. Our platform acts as a smart operating system for colleges, schools, and universities—enhancing learning, streamlining operations, and maximizing efficiency.

With 100+ institutions served, 100,000+ students impacted globally, and 1,000+ educators empowered, we are redefining the future of education. 


COMPANY WEBSITE: https://www.greatify.ai/


JOB DESCRIPTION:

As a QA Automation Specialist at Greatify, you will be responsible for designing, building, and maintaining robust automated test frameworks and suites covering UI, API, integration, regression, and performance tests for our ed‑tech platforms. As part of an Agile, cross‑functional team, you’ll integrate automation into our CI/CD pipelines to speed up release cycles while ensuring high product quality and reliability. Your role ensures consistent quality, provides actionable insights, and champions automation best practices across the QA function. 


KEY RESPONSIBILITIES:


1. Quality Assurance Strategy:

  • Develop and own QA strategy for EdTech product suites.
  • Work with Product and Engineering teams to define quality benchmarks and release criteria.
  • Ensure quality is embedded early in the software development lifecycle.

2. Test Planning & Execution:

  • Design, write, and execute test cases and scenarios—manual and automated.
  • Manage regression, integration, and exploratory testing.
  • Monitor test outcomes, identify risks, and mitigate issues.

3. Automation Framework Development

  • Develop scalable, maintainable automation frameworks using Playwright and Selenium, structured with Cucumber (BDD) for readable test specifications.
  • Write automation scripts in Python and Java, following best practices like modular design and Page Object Model
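The Page Object Model practice above can be sketched as follows. A stubbed page object stands in for a real Playwright `Page` or Selenium driver so the example runs without a browser; the selectors and credentials are hypothetical:

```python
class LoginPage:
    """Page Object: encapsulates locators and actions for one screen."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, page):
        self.page = page  # a Playwright Page (or Selenium driver) in a real suite

    def login(self, user, password):
        self.page.fill(self.USERNAME, user)
        self.page.fill(self.PASSWORD, password)
        self.page.click(self.SUBMIT)

class FakePage:
    """Stand-in that records actions, so the example runs without Playwright."""
    def __init__(self):
        self.actions = []
    def fill(self, selector, value):
        self.actions.append(("fill", selector, value))
    def click(self, selector):
        self.actions.append(("click", selector))

page = FakePage()
LoginPage(page).login("teacher01", "s3cret")
```

Because locators live in one class, a UI change means updating `LoginPage` once instead of every test that touches the login screen.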

4. Bug Tracking and Reporting:

  • Log, triage, and track bugs using tools like Jira.
  • Generate insightful quality reports for stakeholders.

5. Usability and Functional Testing:

  • Evaluate UX across web/mobile platforms.
  • Support UX teams with accessibility and user satisfaction testing.

6. Collaboration and Mentoring:

  • Foster a strong QA culture with best practices and collaboration.


QUALIFICATIONS:

  1. Bachelor’s degree in Computer Science, Information Technology, or related field.
  2. 2+ years of QA experience with at least 2 years in automation testing.
  3. Proficiency in writing automation scripts using mainstream tools
  4. Experience in education tech systems.
  5. Hands-on knowledge of Agile/Scrum processes.
  6. Familiarity with programming languages Python and Java, using Playwright and Selenium for automation scripting, and employing JMeter or k6 with Grafana for performance testing.
  7. Experience setting up CI/CD pipelines via GitHub Actions and Jenkins, and managing test cases and execution tracking in ClickUp
  8. Experience with cross-browser and mobile automation is a plus.
  9. Strong problem-solving skills and attention to detail.
  10. Excellent communication and team collaboration skills.


Greatify
Posted by Ciline Sanjanyaa
Chennai
2 - 5 yrs
₹3L - ₹9L / yr
Playwright
Selenium
Python
Java
Automation
+4 more

ABOUT THE JOB:

Job Title: QA Automation Specialist

Location: Teynampet, Chennai

Job Type: Full-time

Company: Gigadesk Technologies Pvt. Ltd. [Greatify.ai]


COMPANY DESCRIPTION:

At Greatify.ai, we are transforming educational institutions with cutting-edge AI-powered solutions. Our platform acts as a smart operating system for colleges, schools, and universities—enhancing learning, streamlining operations, and maximizing efficiency.

With 100+ institutions served, 100,000+ students impacted globally, and 1,000+ educators empowered, we are redefining the future of education.


COMPANY WEBSITE: https://www.greatify.ai/


JOB DESCRIPTION: 

As a QA Automation Specialist at Greatify, you will be responsible for designing, building, and maintaining robust automated test frameworks and suites covering UI, API, integration, regression, and performance tests for our ed‑tech platforms. As part of an Agile, cross‑functional team, you’ll integrate automation into our CI/CD pipelines to speed up release cycles while ensuring high product quality and reliability. Your role ensures consistent quality, provides actionable insights, and champions automation best practices across the QA function.


KEY RESPONSIBILITIES:

1. Quality Assurance Strategy:

  • Develop and own QA strategy for EdTech product suites.
  • Work with Product and Engineering teams to define quality benchmarks and release criteria.
  • Ensure quality is embedded early in the software development lifecycle.

2. Test Planning & Execution:

  • Design, write, and execute test cases and scenarios—manual and automated.
  • Manage regression, integration, and exploratory testing.
  • Monitor test outcomes, identify risks, and mitigate issues.

3. Automation Framework Development

  • Develop scalable, maintainable automation frameworks using Playwright and Selenium, structured with Cucumber (BDD) for readable test specifications.
  • Write automation scripts in Python and Java, following best practices like modular design and Page Object Model

4. Bug Tracking and Reporting:

  • Log, triage, and track bugs using tools like Jira.
  • Generate insightful quality reports for stakeholders.

5. Usability and Functional Testing:

  • Evaluate UX across web/mobile platforms.
  • Support UX teams with accessibility and user satisfaction testing.

6. Collaboration and Mentoring:

  • Foster a strong QA culture with best practices and collaboration. 


QUALIFICATIONS:

1. Bachelor’s degree in Computer Science, Information Technology, or related field.

2. 2+ years of QA experience, with at least 2 years in automation testing.

3. Proficiency in writing automation scripts using mainstream tools.

4. Experience in education tech systems.

5. Hands-on knowledge of Agile/Scrum processes.

6. Familiarity with programming languages Python and Java, using Playwright and Selenium for automation scripting, and employing JMeter or k6 with Grafana for performance testing.

7. Experience setting up CI/CD pipelines via GitHub Actions and Jenkins, and managing test cases and execution tracking in ClickUp.

8. Experience with cross-browser and mobile automation is a plus.

9. Strong problem-solving skills and attention to detail.

10. Excellent communication and team collaboration skills. 

Read more
Moative

at Moative

3 candid answers
Eman Khan
Posted by Eman Khan
Chennai
3 - 5 yrs
₹10L - ₹25L / yr
skill iconPython
NumPy
pandas
Scikit-Learn
Natural Language Toolkit (NLTK)
+4 more

About Moative

Moative, an Applied AI company, designs and builds transformational AI solutions for traditional industries in energy, utilities, healthcare & life sciences, and more. Through Moative Labs, we build AI micro-products and launch AI startups with partners in vertical markets that align with our theses.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Our Team: Our team of 20+ employees consists of data scientists, AI/ML engineers, and mathematicians, including Ph.D.s from top engineering and research institutes such as the IITs, CERN, IISc, and UZH. Our team includes academicians, IBM Research Fellows, and former founders.


Work you’ll do

As a Data Scientist at Moative, you’ll play a crucial role in extracting valuable insights from data to drive informed decision-making. You’ll work closely with cross-functional teams to build predictive models and develop solutions to complex business problems. You will also be involved in conducting experiments, building POCs and prototypes.


Responsibilities

  • Support end-to-end development and deployment of ML/ AI models - from data preparation, data analysis and feature engineering to model development, validation and deployment
  • Gather, prepare and analyze data, write code to develop and validate models, and continuously monitor and update them as needed.
  • Collaborate with domain experts, engineers, and stakeholders in translating business problems into data-driven solutions
  • Document methodologies and results, present findings and communicate insights to non-technical audiences


Skills & Requirements

  • Proficiency in Python and familiarity with core Python libraries for data analysis and ML algorithms (such as NumPy, Pandas, scikit-learn, NLTK).
  • Strong understanding and experience with data analysis, statistical and mathematical concepts and ML algorithms 
  • Working knowledge of cloud platforms (e.g., AWS, Azure, GCP).
  • Broad understanding of data structures and data engineering.
  • Strong communication skills
  • Strong collaboration skills, a continuous-learning attitude, and a problem-solving mindset
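To make the expected skill set concrete, here is a minimal sketch of the model-fit-and-validate loop using NumPy alone. The data is synthetic and every number is illustrative; real project work would add proper train/test splits and a richer model.

```python
import numpy as np

# Fit a line y = a*x + b by ordinary least squares on noisy synthetic data,
# then validate the fit with a residual check - a toy version of the
# develop/validate cycle described above.
rng = np.random.default_rng(0)
x = np.arange(20, dtype=float)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=x.shape)

# Design matrix [x, 1]; least-squares solve for slope and intercept.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

residuals = y - (a * x + b)
rmse = float(np.sqrt(np.mean(residuals ** 2)))
print(f"slope={a:.2f} intercept={b:.2f} rmse={rmse:.3f}")
```

The same shape scales up directly: swap the hand-built design matrix for `sklearn.linear_model.LinearRegression` and the residual check for cross-validation.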


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term, while acting with urgency. Our ethos is rooted in innovation, efficiency and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps in unless we constantly question it. We are deliberate about which rituals we commit to, because rituals take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don’t need the person with the highest title to say it out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes rather than hiring someone who has less to do. We don’t like the squatting and hoarding that come in the form of hiring for growth. High revenue per person comes from high-quality work from everyone. We demand it.


If this role and our work are of interest to you, please apply here. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied at top-notch institutions, won intellectually demanding competitions, built something of your own, or been rated an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to be present in the city. We intend to move to a hybrid model in a few months’ time.

Read more
Us healthcare company

Us healthcare company

Agency job
via People Impact by Ranjita Shrivastava
Hyderabad, Chennai
4 - 8 yrs
₹20L - ₹30L / yr
ai/ml
TensorFlow
skill iconPython
Google Cloud Platform (GCP)
Vertex

  • Design, develop, and implement AI/ML models and algorithms.
  • Focus on building Proof of Concept (POC) applications to demonstrate the feasibility and value of AI solutions.
  • Write clean, efficient, and well-documented code.
  • Collaborate with data engineers to ensure data quality and availability for model training and evaluation.
  • Work closely with senior team members to understand project requirements and contribute to technical solutions.
  • Troubleshoot and debug AI/ML models and applications.
  • Stay up-to-date with the latest advancements in AI/ML.
  • Utilize machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) to develop and deploy models.
  • Develop and deploy AI solutions on Google Cloud Platform (GCP).
  • Implement data preprocessing and feature engineering techniques using libraries like Pandas and NumPy.
  • Utilize Vertex AI for model training, deployment, and management.
  • Integrate and leverage Google Gemini for specific AI functionalities.
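As a concrete example of the preprocessing step listed above, z-score standardization can be sketched with the standard library alone; in practice this is a Pandas/NumPy one-liner such as `(s - s.mean()) / s.std()`, and the feature values below are invented.

```python
from statistics import mean, pstdev

def standardize(values):
    """Z-score a feature column: zero mean, unit (population) std."""
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return [0.0 for _ in values]  # a constant column carries no signal
    return [(v - mu) / sigma for v in values]

ages = [22, 35, 58, 41, 29]
z = standardize(ages)
print([round(v, 2) for v in z])  # [-1.22, -0.16, 1.71, 0.33, -0.65]
```

Standardizing puts features on a common scale so that distance-based models and gradient descent are not dominated by whichever raw column happens to have the largest units.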

Qualifications:

  • Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
  • 3+ years of experience in developing and implementing AI/ML models.
  • Strong programming skills in Python.
  • Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
  • Good understanding of machine learning concepts and techniques.
  • Ability to work independently and as part of a team.
  • Strong problem-solving skills.
  • Good communication skills.
  • Experience with Google Cloud Platform (GCP) is preferred.
  • Familiarity with Vertex AI is a plus.


Read more
Klenty

at Klenty

2 recruiters
Klenty Ramya
Posted by Klenty Ramya
Chennai
3 - 5 yrs
₹10L - ₹16L / yr
skill iconMongoDB
skill iconExpress
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconAmazon Web Services (AWS)
+5 more
  • Work with a team to provide end-to-end solutions including coding, unit testing and defect fixes.
  • Work to build scalable solutions and work with quality assurance and control teams to analyze and fix issues 
  • Develop and maintain APIs and Services in Node.js/Python 
  • Develop and maintain web-based UI’s using front-end frameworks 
  • Participate in code reviews, unit testing and integration testing 
  • Participate in the full software development lifecycle, from concept and design to implementation and support 
  • Ensure application performance, scalability, and security through best practices in coding, testing and deployment 
  • Collaborate with DevOps team for troubleshooting deployment issues 
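This posting centers on Node.js/Express, but the API handler pattern it describes is language-neutral. As a rough sketch, the same request/response shape in Python's standard library looks like the following; the route and payload are illustrative.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
import urllib.request

# Minimal JSON API endpoint using only the standard library - the same
# request/response shape an Express.js or FastAPI route handler produces.
USERS = {"1": {"id": "1", "name": "Asha"}}

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/users/"):
            user = USERS.get(self.path.rsplit("/", 1)[-1])
            status, body = (200, user) if user else (404, {"error": "not found"})
        else:
            status, body = (404, {"error": "unknown route"})
        payload = json.dumps(body).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ApiHandler)  # port 0 = pick a free port
Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/users/1") as resp:
    data = json.loads(resp.read())
print(data)  # {'id': '1', 'name': 'Asha'}
server.shutdown()
```

The Express equivalent is `app.get("/users/:id", …)` returning `res.json(user)`; frameworks mainly remove the routing and header boilerplate shown here.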

 

Qualification 

● 1-5 years of experience as a Software Engineer or similar, focusing on software development and system integration 

● Proficiency in Node.js, Typescript, React, Express framework 

● In-depth knowledge of databases such as MongoDB 

● Proficient in HTML5, CSS3, and responsive UI design 

● Proficiency in any Python development framework is a plus 

● Strong direct experience in functional and object-oriented programming using JavaScript 

● Experience with cloud platforms (Azure preferred) 

● Microservices architecture and containerization 

● Expertise in performance monitoring, tuning, and optimization 

● Understanding of DevOps practices for automated deployments 

● Understanding of software design patterns and best practices 

● Practical experience working in Agile developments (scrum) 

● Excellent critical thinking skills and the ability to mentor junior team members 

● Effectively communicate and collaborate with cross-functional teams 

● Strong capability to work independently and deliver results within tight deadlines 

● Strong problem-solving abilities and attention to detail

Read more
Chennai based

Chennai based

Agency job
via Girmiti Software by Deric John
Chennai
5 - 6 yrs
₹7L - ₹14L / yr
skill iconGo Programming (Golang)
skill iconPython
skill iconJava

Proficient in Golang, Python, Java, C++, or Ruby (at least one)

Strong grasp of system design, data structures, and algorithms

Experience with RESTful APIs, relational and NoSQL databases

Proven ability to mentor developers and drive quality delivery

Track record of building high-performance, scalable systems

Excellent communication and problem-solving skills

Experience in consulting or contractor roles is a plus

Read more
HappyFox

at HappyFox

1 video
6 products
Sharon Samuel
Posted by Sharon Samuel
Chennai
2 - 5 yrs
₹9L - ₹15L / yr
Test Automation (QA)
Manual testing
skill iconPython
skill iconJavascript
skill iconJava

We're seeking a Software Development Engineer in Test (SDET) to ensure product feature quality through meticulous test design, automation, and result analysis. Collaborate closely with developers to optimize test coverage, resolve bugs, and streamline project delivery.


Responsibilities:

Ensure the quality of product feature development.

Test Design: Understand the necessary functionalities and implementation strategies for straightforward feature development. Inspect code changes, identify key test scenarios and impact areas, and create a thorough test plan.

Test Automation: Work with developers to build reusable test scripts. Review unit/functional test scripts, and aim to maximize test coverage to minimize manual testing, using Python.

Test Execution and Analysis: Monitor test results and identify areas lacking in test coverage. Address these areas by creating additional test scripts and deliver transparent test metrics to the team.
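One way to picture "reusable test scripts plus transparent test metrics" is a small data-driven suite: one script, many scenarios, and a pass/fail summary of the kind a team report would surface. The function under test and the cases below are purely illustrative.

```python
# Data-driven testing sketch: one reusable script runs many scenarios and
# reports a simple pass/fail metric. All names here are illustrative.

def normalize_email(raw):
    """Function under test: trim whitespace and lowercase an email address."""
    return raw.strip().lower()

CASES = [
    ("  User@Example.COM ", "user@example.com"),
    ("qa@test.io", "qa@test.io"),
    ("MIXED@Case.Org", "mixed@case.org"),
]

def run_suite():
    results = [(raw, normalize_email(raw) == expected) for raw, expected in CASES]
    passed = sum(ok for _, ok in results)
    return passed, len(results)

passed, total = run_suite()
print(f"{passed}/{total} scenarios passed")  # 3/3 scenarios passed
```

In a real suite the same idea is expressed with `pytest.mark.parametrize`, which gives each scenario its own reported result for free.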

Support & Bug Fixes: Handle issues reported by customers and aid in bug resolution.

Collaboration: Participate in project planning and execution with the team for efficient project delivery.


Requirements:

A Bachelor's degree in computer science, IT, engineering, or a related field, with a genuine interest in software quality assurance, issue detection, and analysis.

2-5 years of solid experience in software testing, with a focus on automation. Proficiency in using a defect tracking system, Code repositories & IDEs.

A good grasp of programming languages like Python, Java, or JavaScript. Must be able to understand and write code.

Familiarity with testing frameworks (e.g., Selenium, Appium, JUnit).

Good team player with a proactive approach to continuous learning.

Sound understanding of the Agile software development methodology.

Experience in a SaaS-based product company or a fast-paced startup environment is a plus.

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai
4 - 8 yrs
₹7L - ₹26L / yr
SRE
Reliability engineering
skill iconAmazon Web Services (AWS)
skill iconPython

Job Title: Site Reliability Engineer (SRE)

Experience: 4+ Years

Work Location: Bangalore / Chennai / Pune / Gurgaon

Work Mode: Hybrid or Onsite (based on project need)

Domain Preference: Candidates with past experience working in shoe/footwear retail brands (e.g., Nike, Adidas, Puma) are highly preferred.


🛠️ Key Responsibilities

  • Design, implement, and manage scalable, reliable, and secure infrastructure on AWS.
  • Develop and maintain Python-based automation scripts for deployment, monitoring, and alerting.
  • Monitor system performance, uptime, and overall health using tools like Prometheus, Grafana, or Datadog.
  • Handle incident response, root cause analysis, and ensure proactive remediation of production issues.
  • Define and implement Service Level Objectives (SLOs) and Error Budgets in alignment with business requirements.
  • Build tools to improve system reliability, automate manual tasks, and enforce infrastructure consistency.
  • Collaborate with development and DevOps teams to ensure robust CI/CD pipelines and safe deployments.
  • Conduct chaos testing and participate in on-call rotations to maintain 24/7 application availability.
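The SLO and error-budget work described above comes down to simple arithmetic. A sketch follows; the 99.9% target and 30-day window are examples, not any team's actual SLOs.

```python
# Error-budget arithmetic behind an availability SLO.

def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Minutes of allowed downtime for an availability SLO over a window."""
    total_minutes = window_days * 24 * 60
    return (1.0 - slo) * total_minutes

def budget_remaining(slo: float, downtime_minutes: float, window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative means blown)."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget

print(round(error_budget_minutes(0.999), 1))    # 43.2 minutes per 30 days
print(round(budget_remaining(0.999, 10.0), 3))  # 0.769 of the budget left
```

When `budget_remaining` trends toward zero, an SRE team typically freezes risky deploys and spends the remaining window on reliability work instead of features.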


Must-Have Skills

  • 4+ years of experience in Site Reliability Engineering or DevOps with a focus on reliability, monitoring, and automation.
  • Strong programming skills in Python (mandatory).
  • Hands-on experience with AWS cloud services (EC2, S3, Lambda, ECS/EKS, CloudWatch, etc.).
  • Expertise in monitoring and alerting tools like Prometheus, Grafana, Datadog, CloudWatch, etc.
  • Strong background in Linux-based systems and shell scripting.
  • Experience implementing infrastructure as code using tools like Terraform or CloudFormation.
  • Deep understanding of incident management, SLOs/SLIs, and postmortem practices.
  • Prior working experience in footwear/retail brands such as Nike or similar is highly preferred.


Read more
Us healthcare company

Us healthcare company

Agency job
via People Impact by Ranjita Shrivastava
Hyderabad, Chennai
11 - 20 yrs
₹50L - ₹60L / yr
Generative AI
skill iconPython
TensorFlow
Google Cloud Platform (GCP)
POC

Job Title: AI Solutioning Architect – Healthcare IT

Role Summary:

The AI Solutioning Architect leads the design and implementation of AI-driven solutions across the organization, ensuring alignment with business goals and healthcare IT standards. This role defines the AI/ML architecture, guides technical execution, and fosters innovation using platforms like Google Cloud (GCP).

Key Responsibilities:

  • Architect scalable AI solutions from data ingestion to deployment.
  • Align AI initiatives with business objectives and regulatory requirements (HIPAA).
  • Collaborate with cross-functional teams to deliver AI projects.
  • Lead POCs, evaluate AI tools/platforms, and promote GCP adoption.
  • Mentor technical teams and ensure best practices in MLOps.
  • Communicate complex concepts to diverse stakeholders.

Qualifications:

  • Bachelor’s/Master’s in Computer Science or related field.
  • 12+ years in software development/architecture with strong AI/ML focus.
  • Experience in healthcare IT and compliance (HIPAA).
  • Proficient in Python/Java and ML frameworks (TensorFlow, PyTorch).
  • Hands-on with GCP (preferred) or other cloud platforms.
  • Strong leadership, problem-solving, and communication skills.


Read more
Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Pune, Chennai
0 - 1 yrs
₹10L - ₹20L / yr
skill iconPython
Object Oriented Programming (OOPs)
skill iconJavascript
skill iconJava
Data Structures
+1 more


About NxtWave


NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.

Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.

Know more:

🌐 NxtWave | NIAT

About the Role

As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.


Key Responsibilities

  • Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
  • Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
  • Mentor students in academic, career, and project development goals.
  • Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
  • Drive research-led content development, and contribute to innovation in teaching methodologies.
  • Support capstone projects, hackathons, and collaborative research opportunities with industry.
  • Foster a high-performance learning environment in classes of 70–100 students.
  • Collaborate with cross-functional teams for continuous student development and program quality.
  • Actively participate in faculty training, peer reviews, and academic audits.


Eligibility & Requirements

  • Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
  • Strong academic and research orientation, preferably with publications or project contributions.
  • Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
  • A deep commitment to education, student success, and continuous improvement.

Must-Have Skills

  • Expertise in Python, Java, JavaScript, and advanced programming paradigms.
  • Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
  • Excellent communication, classroom delivery, and presentation skills.
  • Familiarity with academic content tools like Google Slides, Sheets, Docs.
  • Passion for educating, mentoring, and shaping future developers.

Good to Have

  • Industry experience or consulting background in software development or research-based roles.
  • Proficiency in version control systems (e.g., Git) and agile methodologies.
  • Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
  • A drive to innovate in teaching, curriculum design, and student engagement.

Why Join Us?

  • Be at the forefront of shaping India’s tech education revolution.
  • Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
  • Competitive compensation with strong growth potential.
  • Create impact at scale by mentoring hundreds of future-ready tech leaders.


Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
skill iconPython
AIML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.
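As an illustration of integrating a model behind an API, here is the framework-agnostic core of what a Flask or FastAPI `/predict` route would do. The "model" is a stub scoring rule and every field name is invented; in production it would be a loaded scikit-learn or TensorFlow model.

```python
import json

# Sketch of serving a model prediction as JSON - the body a Flask/FastAPI
# route would return. The weights and feature names are illustrative stubs.

def predict(features):
    """Stub model: weighted sum clamped to a 0-1 score."""
    weights = {"tenure": 0.02, "usage": 0.05}
    score = sum(weights.get(k, 0.0) * v for k, v in features.items())
    return max(0.0, min(1.0, score))

def predict_handler(request_body: str) -> str:
    """What a POST /predict route does: parse, score, serialize."""
    features = json.loads(request_body)
    return json.dumps({"score": round(predict(features), 3)})

resp = predict_handler('{"tenure": 12, "usage": 8.5}')
print(resp)  # {"score": 0.665}
```

Wrapping this in Flask is one decorator (`@app.post("/predict")`); keeping the parse/score/serialize core separate from the framework makes the model logic unit-testable on its own.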

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


Read more
Genspark India

Genspark India

Agency job
via Genspark by S Priyadharshini
Chennai
1 - 3 yrs
₹3L - ₹6L / yr
Embedded C
skill iconC++
skill iconC
dbms
DSA
+2 more

Genspark is hiring professionals for C development for their premium client.

Work Location- Chennai 

Entry Criteria 

Graduate from any engineering background / B.Sc / M.Sc / MCA with specialization in Computer Science, Electronics, or IT

Minimum 1 year experience in Industry 

Working knowledge of C, Embedded C, C++, and DSA

Programming Aptitude (Any Language) 

Basic understanding of programming constructs: variables, loops, conditionals, functions 

Logical thinking and algorithmic approach 

Computer Science Fundamentals: 

Data structures basics: arrays, stacks, queues, linked lists 

Operating System basics: what is a process/thread, memory, file system, etc. 

Basic understanding of compilation, runtime, networking, sockets, etc.

Problem Solving & Logical Reasoning 

Ability to trace logic, find errors, and reason through pseudocode 

Analytical and debugging capabilities 

Learning Attitude & Communication 

Demonstrated interest in low-level or systems programming (even if no experience) 

Willingness to learn C and work close to the OS level 

Clarity of thought and ability to explain what they do know 

Soft Skills : 

Able to explain and communicate the thoughts clearly in English 

Confident in solving new problems independently or with guidance 

Willingness to take feedback and iterate 

Evaluation Process 

Candidates will be assigned an online test, followed by a technical screening.

Shortlisted candidates will have to appear for a face-to-face (F2F) interview with the client in Chennai.

 

Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
skill iconGo Programming (Golang)
skill iconAmazon Web Services (AWS)
skill iconPython

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Chennai
8 - 12 yrs
₹10L - ₹26L / yr
skill iconPython
skill iconMachine Learning (ML)
Scikit-Learn
TensorFlow
PyTorch
+10 more

Job Title : Senior Machine Learning Engineer

Experience : 8+ Years

Location : Chennai

Notice Period : Immediate Joiners Only

Work Mode : Hybrid


Job Summary :

We are seeking an experienced Machine Learning Engineer with a strong background in Python, ML algorithms, and data-driven development.

The ideal candidate should have hands-on experience with popular ML frameworks and tools, solid understanding of clustering and classification techniques, and be comfortable working in Unix-based environments with Agile teams.


Mandatory Skills :

  • Programming Languages : Python
  • Machine Learning : Strong experience with ML algorithms, models, and libraries such as Scikit-learn, TensorFlow, and PyTorch
  • ML Concepts : Proficiency in supervised and unsupervised learning, including techniques such as K-Means, DBSCAN, and Fuzzy Clustering
  • Operating Systems : RHEL or any Unix-based OS
  • Databases : Oracle or any relational database
  • Version Control : Git
  • Development Methodologies : Agile
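To ground the clustering techniques named above, here is a toy pure-Python K-Means (Lloyd's algorithm) on 1-D data. Production work would use scikit-learn's `KMeans`; the points and initial centers below are invented.

```python
# Toy K-Means (k=2, 1-D) illustrating the unsupervised techniques above.

def kmeans_1d(points, centers, iters=10):
    """Lloyd's algorithm on 1-D data with fixed initial centers."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to its cluster's mean.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

points = [1.0, 1.2, 0.8, 9.0, 9.5, 8.7]
centers, clusters = kmeans_1d(points, centers=[0.0, 10.0])
print([round(c, 2) for c in centers])  # [1.0, 9.07]
```

DBSCAN and fuzzy clustering replace the hard nearest-center assignment with density-based reachability and fractional memberships respectively, but the iterate-until-stable skeleton is the same family of ideas.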

Desired Skills :

  • Experience with issue tracking tools such as Azure DevOps or JIRA.
  • Understanding of data science concepts.
  • Familiarity with Big Data algorithms, models, and libraries.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
skill iconPython
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes
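The filter-derive-aggregate shape of a typical Glue transformation can be sketched in plain Python. In an actual Glue job these steps would be PySpark DataFrame operations (`filter`, `withColumn`, `groupBy().agg()`); the records and field names here are invented.

```python
# ETL transformation shape of a typical pipeline: filter bad rows,
# cast/derive fields, aggregate. All records and field names are illustrative.

raw_orders = [
    {"order_id": 1, "region": "south", "amount": "120.50", "status": "paid"},
    {"order_id": 2, "region": "north", "amount": "80.00",  "status": "cancelled"},
    {"order_id": 3, "region": "south", "amount": "15.25",  "status": "paid"},
]

def transform(records):
    """Keep paid orders, cast amount to float, total revenue per region."""
    totals = {}
    for rec in records:
        if rec["status"] != "paid":
            continue  # filter step
        amount = float(rec["amount"])  # cast/derive step
        totals[rec["region"]] = totals.get(rec["region"], 0.0) + amount  # aggregate
    return totals

print(transform(raw_orders))  # {'south': 135.75}
```

Keeping the transform a pure function of its input records, as here, is what makes pipeline logic easy to unit-test before it is lifted into Glue/PySpark.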

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Bengaluru (Bangalore), Gurugram, Chennai, Mumbai
5 - 7 yrs
₹6L - ₹20L / yr
skill iconAmazon Web Services (AWS)
Amazon Redshift
AWS Glue
skill iconPython
PySpark

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Bengaluru (Bangalore), Pune, Mumbai, Chennai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconPython
PySpark
skill iconAmazon Web Services (AWS)
aws
Amazon Redshift
+1 more

Position: AWS Data Engineer

Experience: 5 to 7 Years

Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram

Work Mode: Hybrid (3 days work from office per week)

Employment Type: Full-time

About the Role:

We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
  • Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
  • Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
  • Optimize data models and storage for cost-efficiency and performance.
  • Write advanced SQL queries to support complex data analysis and reporting requirements.
  • Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
  • Ensure high data quality and integrity across platforms and processes.
  • Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.

Required Skills & Experience:

  • Strong hands-on experience with Python or PySpark for data processing.
  • Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
  • Proficiency in writing complex SQL queries and optimizing them for performance.
  • Familiarity with serverless architectures and AWS best practices.
  • Experience in designing and maintaining robust data architectures and data lakes.
  • Ability to troubleshoot and resolve data pipeline issues efficiently.
  • Strong communication and stakeholder management skills.


Read more
Bengaluru and chennai based tech startup

Bengaluru and chennai based tech startup

Agency job
via Recruit Square by Priyanka choudhary
Bengaluru (Bangalore), Chennai
6 - 12 yrs
₹19L - ₹35L / yr
Linux/Unix
TCP/IP
Windows Azure
skill iconAmazon Web Services (AWS)
SaaS
+2 more

Has substantial hands-on expertise in Linux OS, HTTPS, proxies, and Perl/Python scripting.

Is responsible for the identification and selection of appropriate network solutions to design and deploy in environments based on business objectives and requirements.

Is skilled in developing, deploying, and troubleshooting network deployments, with deep technical knowledge, especially around bootstrapping, Squid proxy, HTTPS, and scripting. Will further align the network to meet the Company’s objectives through continuous development, improvement, and automation.

Preferably 10+ years of experience in network design and delivery of technology centric, customer-focused services.

Preferably 3+ years in modern software-defined network and preferably, in cloud-based environments.

Diploma or Bachelor’s degree in Engineering, Computer Science/Information Technology, or its equivalent.

Preferably possess a valid RHCE (Red Hat Certified Engineer) certification

Preferably possess a vendor proxy certification (Forcepoint / Websense / Blue Coat / equivalent)

Must possess advanced knowledge of TCP/IP concepts and fundamentals. Good understanding and working knowledge of Squid proxy, the HTTPS protocol, and certificate management.

Fundamental understanding of proxies and PAC files.

Integration experience and knowledge between modern networks and cloud service providers such as AWS, Azure and GCP will be advantageous.

Knowledge in SaaS, IaaS, PaaS, and virtualization will be advantageous.

Coding skills such as Perl, Python, Shell scripting will be advantageous.

Excellent technical knowledge, troubleshooting, problem analysis, and outside-the-box thinking.

Excellent communication skills – oral, written and presentation, across various types of target audiences.

Strong sense of personal ownership and responsibility in accomplishing the organization’s goals and objectives. Exudes confidence, able to cope under pressure and will roll-up his/her sleeves to drive a project to success in a challenging environment.

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
SQL
redshift

Profile: AWS Data Engineer

Mode: Hybrid

Experience: 5 to 7 years

Locations: Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
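The Lambda and serverless items above boil down to plain Python handlers; a minimal sketch follows (the event shape is invented for illustration; a real function would typically be triggered by S3 or Kinesis and write results back out with boto3):

```python
import json

def lambda_handler(event, context):
    """Minimal Lambda-style handler: validate incoming records, count good vs bad."""
    records = event.get("records", [])
    good, bad = [], []
    for rec in records:
        # Hypothetical rule: a record is valid if it carries a non-negative number.
        if isinstance(rec.get("value"), (int, float)) and rec["value"] >= 0:
            good.append(rec)
        else:
            bad.append(rec)
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(good), "rejected": len(bad)}),
    }

# Local invocation -- no AWS needed to unit-test the handler.
result = lambda_handler({"records": [{"value": 3}, {"value": -1}, {"value": "x"}]}, None)
```

Keeping the handler free of AWS-client calls at its core, as here, is what makes this kind of function easy to test before it is wired to a trigger.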
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
PySpark
Glue semantics
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.
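SQL optimization habits transfer across engines; as a small self-contained illustration using stdlib SQLite rather than Redshift (where you would reach for sort and distribution keys instead), matching an index to a selective predicate is often the first tuning step:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click" if i % 2 else "buy", i * 0.5) for i in range(1000)],
)

# Without an index this filter is a full table scan; the composite index turns
# it into a seek. The broader principle -- lay data out to match the predicates
# you query most -- carries over to Redshift/Athena.
conn.execute("CREATE INDEX idx_events_user ON events (user_id, event_type)")

total = conn.execute(
    "SELECT SUM(amount) FROM events WHERE user_id = ? AND event_type = ?",
    (7, "click"),
).fetchone()[0]
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7"
).fetchall()
```

The query plan output confirms the index is actually used, which is the habit worth keeping: verify the plan rather than assume the optimizer picked up your change.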

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Read more
Linarc Inc

at Linarc Inc

3 recruiters
jhansi peter
Posted by jhansi peter
Chennai
4 - 9 yrs
₹15L - ₹35L / yr
skill iconPython
skill iconDjango

What We’re Looking For

  • 4+ years of backend development experience in scalable web applications.
  • Strong expertise in Python, Django ORM, and RESTful API design.
  • Familiarity with relational databases such as PostgreSQL and MySQL
  • Comfortable working in a startup environment with multiple priorities.
  • Understanding of cloud-native architectures and SaaS models.
  • Strong ownership mindset and ability to work with minimal supervision.
  • Excellent communication and teamwork skills.


Read more
ZeMoSo Technologies

at ZeMoSo Technologies

11 recruiters
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
skill iconPython
SQL
Data Warehouse (DWH)
skill iconAmazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience with Databricks and setting up and managing data pipelines and data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note: the salary bracket will vary according to the candidate's experience -

- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA

- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA

- Experience more than 8 yrs - Salary upto 40 LPA

Read more
NeoGenCode Technologies Pvt Ltd
Bengaluru (Bangalore), Pune, Chennai
3 - 6 yrs
₹2L - ₹12L / yr
Test Automation (QA)
Automation
Software Testing (QA)
Generative AI
Selenium
+7 more

Job Title : Automation Quality Engineer (Gen AI)

Experience : 3 to 5+ Years

Location : Bangalore / Chennai / Pune


Role Overview :

We’re hiring a Quality Engineer to lead QA efforts for AI models, applications, and infrastructure.

You'll collaborate with cross-functional teams to design test strategies, implement automation, ensure model accuracy, and maintain high product quality.


Key Responsibilities :

  • Develop and maintain test strategies for AI models, APIs, and user interfaces.
  • Build automation frameworks and integrate into CI/CD pipelines.
  • Validate model accuracy, robustness, and monitor model drift.
  • Perform regression, performance, load, and security testing.
  • Log and track issues; collaborate with developers to resolve them.
  • Ensure compliance with data privacy and ethical AI standards.
  • Document QA processes and testing outcomes.

Mandatory Skills :

  • Test Automation : Selenium, Playwright, or Deep Eval
  • Programming/Scripting : Python, JavaScript
  • API Testing : Postman, REST Assured
  • Cloud & DevOps : Azure, Azure Kubernetes, CI/CD pipelines
  • Performance Testing : JMeter
  • Bug Tracking : Azure DevOps
  • Methodologies : Agile delivery experience
  • Soft Skills : Strong communication and problem-solving abilities
Read more
BigRio
Disha Bhardwaj
Posted by Disha Bhardwaj
Chennai
8 - 12 yrs
₹25L - ₹30L / yr
Natural Language Processing (NLP)
Large Language Models (LLM) tuning
Artificial Intelligence (AI)
skill iconPython

Job Title: AI Engineer - NLP/LLM Data Product Engineer

Location: Chennai, TN (Hybrid)

Duration: Full time


Job Summary:

About the Role:

We are growing our Data Science and Data Engineering team and are looking for an experienced AI Engineer specializing in creating GenAI LLM solutions. This position involves collaborating with clients and their teams, discovering gaps for automation using AI, designing customized AI solutions, and implementing technologies to streamline data entry processes within the healthcare sector.


Responsibilities:

·        Conduct detailed consultations with clients' functional teams to understand client requirements; one use case relates to handwritten medical records.

·        Analyze existing data entry workflows and propose automation opportunities.

Design:

·        Design tailored AI-driven solutions for the extraction and digitization of information from handwritten medical records.

·        Collaborate with clients to define project scopes and objectives.

Technology Selection:

·        Evaluate and recommend AI technologies, focusing on NLP, LLM and machine learning.

·        Ensure seamless integration with existing systems and workflows.

Prototyping and Testing:

·        Develop prototypes and proof-of-concept models to demonstrate the feasibility of proposed solutions.

·        Conduct rigorous testing to ensure accuracy and reliability.
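A proof-of-concept for record digitization often starts with a rule-based baseline to benchmark model output against. As a hedged sketch (the field patterns below are hypothetical, and real input would first pass through OCR or an LLM):

```python
import re

# Hypothetical patterns for a transcribed lab report; a real system would
# pair an OCR/LLM extraction step with validation rules like these.
FIELD_PATTERNS = {
    "patient_id": re.compile(r"Patient ID:\s*([A-Z0-9-]+)"),
    "glucose_mg_dl": re.compile(r"Glucose:\s*(\d+(?:\.\d+)?)\s*mg/dL"),
    "date": re.compile(r"Date:\s*(\d{4}-\d{2}-\d{2})"),
}

def extract_fields(text: str) -> dict:
    """Return whichever fields match; missing fields come back as None for human review."""
    out = {}
    for name, pattern in FIELD_PATTERNS.items():
        m = pattern.search(text)
        out[name] = m.group(1) if m else None
    return out

sample = "Date: 2024-01-15\nPatient ID: MR-00123\nGlucose: 98 mg/dL"
fields = extract_fields(sample)
```

A baseline like this gives the rigorous-testing step a concrete yardstick: model extractions can be compared field by field against rule-based output on a labeled sample.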

Implementation and Integration:

·        Work closely with clients and IT teams to integrate AI solutions effectively.

·        Provide technical support during the implementation phase.

Training and Documentation:

·        Develop training materials for end-users and support staff.

·        Create comprehensive documentation for implemented solutions.

Continuous Improvement:


·        Monitor and optimize the performance of deployed solutions.

·        Identify opportunities for further automation and improvement.

Qualifications:

·        Advanced degree in Computer Science, Artificial Intelligence, or a related field (Master's or PhD required).

·        Proven experience in developing and implementing AI solutions for data entry automation.

·        Expertise in NLP, LLMs, and other machine-learning techniques.

·        Strong programming skills, especially in Python.

·        Familiarity with healthcare data privacy and regulatory requirements.


Additional Qualifications( great to have):

An ideal candidate will have expertise in the most current LLM/NLP models, particularly in the extraction of data from clinical reports, lab reports, and radiology reports. The ideal candidate should have a deep understanding of EMR/EHR applications and patient-related data.

Read more
Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Chennai, Bhopal, Jaipur
10 - 15 yrs
₹30L - ₹40L / yr
Spark
Google Cloud Platform (GCP)
skill iconPython
Apache Airflow
PySpark
+1 more

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.


  • Shift: 2 PM to 11 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or those with a notice period of up to 30 days


Key Responsibilities:

  • Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
  • Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers.
  • Ensure data integrity, consistency, and availability across all systems.
  • Collaborate with data engineers, analysts, and stakeholders to optimize performance.
  • Document standards and best practices for data engineering workflows.

Required Experience:


  • 7-8 years of experience in data engineering, architecture, and pipeline development.
  • Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
  • Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
  • Understanding of Data Lake table formats (Delta, Iceberg, etc.).
  • Proficiency in Python for scripting and automation.
  • Strong problem-solving skills and collaborative mindset.


⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

Read more
OnActive
Mansi Gupta
Posted by Mansi Gupta
Gurugram, Pune, Bengaluru (Bangalore), Chennai, Bhopal, Hyderabad, Jaipur
5 - 8 yrs
₹6L - ₹12L / yr
skill iconPython
Spark
SQL
AWS CloudFormation
skill iconMachine Learning (ML)
+3 more

Level of skills and experience:


5 years of hands-on experience using Python, Spark, and SQL.

Experienced in AWS Cloud usage and management.

Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).

Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.

Experience with orchestrators such as Airflow and Kubeflow.

Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).

Fundamental understanding of Parquet, Delta Lake and other data file formats.

Proficiency on an IaC tool such as Terraform, CDK or CloudFormation.

Strong written and verbal English communication skills, and proficiency in communicating with non-technical stakeholders.

Read more
All time design
Prem kumar
Posted by Prem kumar
Chennai
1 - 3 yrs
₹2L - ₹4L / yr
skill iconFlask
skill iconPython
skill iconMongoDB
RESTful APIs
Payment gateways

Job description


We are looking for an experienced Python developer to join our engineering team and help us create dynamic software applications for our clients. In this role, you will be responsible for writing and testing scalable code, developing back-end components, and integrating user-facing elements in collaboration with front-end developers.


Responsibilities:


  • Coordinating with development teams to determine application requirements.
  • Writing scalable code using the Python programming language.
  • Testing and debugging applications.
  • Developing back-end components.
  • Integrating user-facing elements using server-side logic.
  • Assessing and prioritizing client feature requests.
  • Integrating data storage solutions.
  • Coordinating with front-end developers.
  • Reprogramming existing databases to improve functionality.
  • Developing digital tools to monitor online traffic.


Requirements:


  • Bachelor's degree in Computer Science, Computer Engineering, or a related field.
  • 2-7 years of experience as a Python Developer.
  • Expert knowledge of Python, the Flask framework, and FastAPI.
  • Solid experience with MongoDB and Elasticsearch.
  • Work experience with RESTful APIs.
  • A deep understanding of multi-process architecture and the threading limitations of Python.
  • Ability to integrate multiple data sources into a single system.
  • Familiarity with testing tools.
  • Ability to collaborate on projects and work independently when required.
  • Excellent troubleshooting skills.
  • Good project management skills.


SKILLS:


  • PYTHON
  • MONGODB
  • FLASK
  • REST API DEVELOPMENT
  • TWILIO


Job Type: Full-time


Pay: ₹10,000.00 - ₹30,000.00 per month


Benefits:

  • Flexible schedule
  • Paid time off


Schedule:

  • Day shift


Supplemental Pay:

  • Overtime pay


Ability to commute/relocate:

  • Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Required)


Experience:

  • Python: 1 year (Required)


Work Location: In person

Read more
Xebia IT Architects

at Xebia IT Architects

2 recruiters
Vijay S
Posted by Vijay S
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Gurugram, Bhopal, Jaipur
5 - 15 yrs
₹20L - ₹35L / yr
Spark
ETL
Data Transformation Tool (DBT)
skill iconPython
Apache Airflow
+2 more

We are seeking a highly skilled and experienced Offshore Data Engineer. The role involves designing, implementing, and testing data pipelines and products.


Qualifications & Experience:


Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.


5+ years of experience in data engineering, with expertise in data architecture and pipeline development.


☁️ Proven experience with GCP, BigQuery, Databricks, Airflow, Spark, DBT, and GCP services.


Hands-on experience with ETL processes, SQL, PostgreSQL, MySQL, MongoDB, and Cassandra.


Strong proficiency in Python and data modelling.


Experience in testing and validation of data pipelines.


Preferred: Experience with eCommerce systems, data visualization tools (Tableau, Looker), and cloud certifications.


If you meet the above criteria and are interested, please share your updated CV along with the following details:


Total Experience:


Current CTC:


Expected CTC:


Current Location:


Preferred Location:


Notice Period / Last Working Day (if serving notice):


⚠️ Kindly share your details only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!

Read more
top MNC

top MNC

Agency job
via Vy Systems by thirega thanasekaran
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Ghaziabad, Faridabad, Pune, Hyderabad, Chennai
6 - 14 yrs
₹6L - ₹25L / yr
skill iconPython
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
Generative AI

Key Responsibilities:

  • Develop and maintain scalable Python applications for AI/ML projects.
  • Design, train, and evaluate machine learning models for classification, regression, NLP, computer vision, or recommendation systems.
  • Collaborate with data scientists, ML engineers, and software developers to integrate models into production systems.
  • Optimize model performance and ensure low-latency inference in real-time environments.
  • Work with large datasets to perform data cleaning, feature engineering, and data transformation.
  • Stay current with new developments in machine learning frameworks and Python libraries.
  • Write clean, testable, and efficient code following best practices.
  • Develop RESTful APIs and deploy ML models via cloud or container-based solutions (e.g., AWS, Docker, Kubernetes).


Share CV to


Thirega@ vysystems dot com - WhatsApp - 91Five0033Five2Three

Read more
VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Chennai, Hyderabad, Bengaluru (Bangalore), Pune, Mumbai, Kolkata, Delhi, Noida
12 - 14 yrs
₹11L - ₹27L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+4 more

Responsibilities

  • Develop and maintain robust APIs to support various applications and services.
  • Design and implement scalable solutions using AWS cloud services.
  • Utilize Python frameworks such as Flask and Django to build efficient and high-performance applications.
  • Collaborate with cross-functional teams to gather and analyze requirements for new features and enhancements.
  • Ensure the security and integrity of applications by implementing best practices and security measures.
  • Optimize application performance and troubleshoot issues to ensure smooth operation.
  • Provide technical guidance and mentorship to junior team members.
  • Conduct code reviews to ensure adherence to coding standards and best practices.
  • Participate in agile development processes including sprint planning daily stand-ups and retrospectives.
  • Develop and maintain documentation for code processes and procedures.
  • Stay updated with the latest industry trends and technologies to continuously improve skills and knowledge.
  • Contribute to the overall success of the company by delivering high-quality software solutions that meet business needs.
  • Foster a collaborative and inclusive work environment that promotes innovation and continuous improvement.

 

Qualifications

  • Possess strong expertise in developing and maintaining APIs.
  • Demonstrate proficiency in AWS cloud services and their application in scalable solutions.
  • Have extensive experience with Python frameworks such as Flask and Django.
  • Exhibit strong analytical and problem-solving skills to address complex technical challenges.
  • Show ability to collaborate effectively with cross-functional teams and stakeholders.
  • Display excellent communication skills to convey technical concepts clearly.
  • Have a background in the Consumer Lending domain is a plus.
  • Demonstrate commitment to continuous learning and staying updated with industry trends.
  • Possess a strong understanding of agile development methodologies.
  • Show experience in mentoring and guiding junior team members.
  • Exhibit attention to detail and a commitment to delivering high-quality software solutions.
  • Demonstrate ability to work effectively in a hybrid work model.
  • Show a proactive approach to identifying and addressing potential issues before they become problems.
Read more
Rigel Networks Pvt Ltd
Minakshi Soni
Posted by Minakshi Soni
Bengaluru (Bangalore), Pune, Mumbai, Chennai
8 - 12 yrs
₹8L - ₹10L / yr
skill iconAmazon Web Services (AWS)
Terraform
Amazon Redshift
Redshift
Snowflake
+16 more

Dear Candidate,


We are urgently Hiring AWS Cloud Engineer for Bangalore Location.

Position: AWS Cloud Engineer

Location: Bangalore

Experience: 8-11 yrs

Skills: Aws Cloud

Salary: Best in Industry (20-25% Hike on the current ctc)

Note:

only Immediate to 15 days Joiners will be preferred.

Candidates from Tier 1 companies will only be shortlisted and selected

Candidates' NP more than 30 days will get rejected while screening.

Offer shoppers will be rejected.


Job description:

 

Description:

 

Title: AWS Cloud Engineer

Prefer BLR / HYD – else any location is fine

Work Mode: Hybrid – based on HR rule (currently 1 day per month)


Shift Timings: 24x7 (work in shifts on a rotational basis)

Total Experience: 8+ years, of which 5 years of relevant experience is required.

Must have- AWS platform, Terraform, Redshift / Snowflake, Python / Shell Scripting



Experience and Skills Requirements:


Experience:

8 years of experience in a technical role working with AWS


Mandatory

Technical troubleshooting and problem solving

AWS management of large-scale IaaS and PaaS solutions

Cloud networking and security fundamentals

Experience using containerization in AWS

Working data warehouse knowledge; Redshift and Snowflake preferred

Working with IaC: Terraform and CloudFormation

Working understanding of scripting languages including Python and Shell

Collaboration and communication skills

Highly adaptable to changes in a technical environment

 

Optional

Experience using monitoring and observability toolsets, including Splunk and Datadog

Experience using GitHub Actions

Experience using AWS RDS/SQL-based solutions

Experience working with streaming technologies, including Kafka and Apache Flink

Experience working with ETL environments

Experience working with the Confluent Cloud platform


Certifications:


Minimum

AWS Certified SysOps Administrator – Associate

AWS Certified DevOps Engineer - Professional



Preferred


AWS Certified Solutions Architect – Associate


Responsibilities:


Responsible for technical delivery of managed services across the NTT Data customer account base. Working as part of a team providing a Shared Managed Service.


The following is a list of expected responsibilities:


To manage and support a customer’s AWS platform

To be technical hands on

Provide Incident and Problem management on the AWS IaaS and PaaS Platform

Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner

Actively monitor an AWS platform for technical issues

To be involved in the resolution of technical incident tickets

Assist in the root cause analysis of incidents

Assist with improving efficiency and processes within the team

Examining traces and logs

Working with third party suppliers and AWS to jointly resolve incidents


Good to have:


Confluent Cloud

Snowflake




Best Regards,

Minakshi Soni

Executive - Talent Acquisition (L2)

Rigel Networks

Worldwide Locations: USA | HK | IN 

Read more
Coinfantasy
Indira Priyadharshini
Posted by Indira Priyadharshini
Remote, Chennai
3 - 10 yrs
₹10L - ₹40L / yr
skill iconPython
PyTorch
Large Language Models (LLM) tuning
Large Language Models (LLM)
Generative AI
+2 more

CoinFantasy is looking for an experienced Senior AI Architect to lead both the decentralised protocol development and the design of AI-driven applications on this network. As a visionary in AI and distributed computing, you will play a central role in shaping the protocol’s technical direction, enabling efficient task distribution, and scaling AI use cases across a heterogeneous, decentralised infrastructure.

Job Responsibilities

  • Architect and oversee the protocol’s development, focusing on dynamic node orchestration, layer-wise model sharding, and secure, P2P network communication.
  • Drive the end-to-end creation of AI applications, ensuring they are optimised for decentralised deployment and include use cases with autonomous agent workflows.
  • Architect AI systems capable of running on decentralised networks, ensuring they balance speed, scalability, and resource usage.
  • Design data pipelines and governance strategies for securely handling large-scale, decentralised datasets.
  • Implement and refine strategies for swarm intelligence-based task distribution and resource allocation across nodes. Identify and incorporate trends in decentralised AI, such as federated learning and swarm intelligence, relevant to various industry applications.
  • Lead cross-functional teams in delivering full-precision computing and building a secure, robust decentralised network.
  • Represent the organisation’s technical direction, serving as the face of the company at industry events and client meetings.
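Layer-wise model sharding, as mentioned above, can be illustrated with a toy greedy assignment (node names, capacities, and the proportional heuristic are all simplifications; real placement must also weigh latency, trust, and bandwidth):

```python
def shard_layers(layer_costs, node_capacities):
    """Greedily assign consecutive model layers to nodes so each node's share
    roughly tracks its relative capacity. Toy illustration only."""
    total_cost = sum(layer_costs)
    total_cap = sum(node_capacities.values())
    nodes = list(node_capacities.items())
    assignment = {name: [] for name, _ in nodes}
    i = 0
    for k, (name, cap) in enumerate(nodes):
        if k == len(nodes) - 1:
            # Last node absorbs whatever remains so every layer is placed.
            assignment[name] = list(range(i, len(layer_costs)))
            break
        budget = total_cost * cap / total_cap  # proportional share of compute
        used = 0.0
        while i < len(layer_costs) and used + layer_costs[i] <= budget:
            assignment[name].append(i)
            used += layer_costs[i]
            i += 1
    return assignment

# Two nodes, one twice as capable: six equal-cost layers split 2 vs 4.
plan = shard_layers([1, 1, 1, 1, 1, 1], {"node_a": 1, "node_b": 2})
```

Keeping shards contiguous, as here, matters in practice because each cut between consecutive layers is an activation hand-off across the network.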

Requirements

  • Bachelor’s/Master’s/Ph.D. in Computer Science, AI, or related field.
  • 12+ years of experience in AI/ML, with a track record of building distributed systems and AI solutions at scale.
  • Strong proficiency in Python, Golang, and machine learning frameworks (e.g., TensorFlow, PyTorch).
  • Expertise in decentralised architecture, P2P networking, and heterogeneous computing environments.
  • Excellent leadership skills, with experience in cross-functional team management and strategic decision-making.
  • Strong communication skills, adept at presenting complex technical solutions to diverse audiences.

About Us

CoinFantasy is a Play to Invest platform that brings the world of investment to users through engaging games. With multiple categories of games, it aims to make investing fun, intuitive, and enjoyable for users. It features a sandbox environment in which users are exposed to the end-to-end investment journey without risking financial losses.

Building on this foundation, we are now developing a groundbreaking decentralised protocol that will transform the AI landscape.

Website:

Benefits

  • Competitive Salary
  • An opportunity to be part of the Core team in a fast-growing company
  • A fulfilling, challenging and flexible work experience
  • Practically unlimited professional and career growth opportunities


Read more
Koolioai
Swarna M
Posted by Swarna M
Chennai
0 - 1 yrs
₹15000 - ₹20000 / mo
skill iconPython
skill iconFlask

About koolio.ai


Website: www.koolio.ai


Koolio Inc. is a cutting-edge Silicon Valley startup dedicated to transforming how stories are told through audio. Our mission is to democratize audio content creation by empowering individuals and businesses to effortlessly produce high-quality, professional-grade content. Leveraging AI and intuitive web-based tools, koolio.ai enables creators to craft, edit, and distribute audio content—from storytelling to educational materials, brand marketing, and beyond. We are passionate about helping people and organizations share their voices, fostering creativity, collaboration, and engaging storytelling for a wide range of use cases.


About the Internship Position

We are looking for a motivated Backend Development Intern to join our innovative team. As an intern at koolio.ai, you’ll have the opportunity to work on a next-gen AI-powered platform and gain hands-on experience developing and optimizing backend systems that power our platform. This internship is ideal for students or recent graduates who are passionate about backend technologies and eager to learn in a dynamic, fast-paced startup environment.


Key Responsibilities:

  • Assist in the development and maintenance of backend systems and APIs.
  • Write reusable, testable, and efficient code to support scalable web applications.
  • Work with cloud services and server-side technologies to manage data and optimize performance.
  • Troubleshoot and debug existing backend systems, ensuring reliability and performance.
  • Collaborate with cross-functional teams to integrate frontend features with backend logic.


Requirements and Skills:

  • Education: Currently pursuing or recently completed a degree in Computer Science, Engineering, or a related field.
  • Technical Skills:
  • Good understanding of server-side technologies like Python
  • Familiarity with REST APIs and database systems (e.g., MySQL, PostgreSQL, or NoSQL databases).
  • Exposure to cloud platforms like AWS, Google Cloud, or Azure is a plus.
  • Knowledge of version control systems such as Git.
  • Soft Skills:
  • Eagerness to learn and adapt in a fast-paced environment.
  • Strong problem-solving and critical-thinking skills.
  • Effective communication and teamwork capabilities.
  • Other Skills: Familiarity with CI/CD pipelines and basic knowledge of containerization (e.g., Docker) is a bonus.


Why Join Us?

  • Gain real-world experience working on a cutting-edge platform.
  • Work alongside a talented and passionate team committed to innovation.
  • Receive mentorship and guidance from industry experts.
  • Opportunity to transition to a full-time role based on performance and company needs.


This internship is an excellent opportunity to kickstart your career in backend development, build critical skills, and contribute to a product that has a real-world impact.

Read more
Koolioai
Swarna M
Posted by Swarna M
Remote, Chennai
5 - 7 yrs
₹20L - ₹30L / yr
skill iconPython
skill iconReact.js
skill iconFlask
Google Cloud Platform (GCP)

About koolio.ai

Website: www.koolio.ai

koolio Inc. is a cutting-edge Silicon Valley startup dedicated to transforming how stories are told through audio. Our mission is to democratize audio content creation by empowering individuals and businesses to effortlessly produce high-quality, professional-grade content. Leveraging AI and intuitive web-based tools, koolio.ai enables creators to craft, edit, and distribute audio content—from storytelling to educational materials, brand marketing, and beyond—easily. We are passionate about helping people and organizations share their voices, fostering creativity, collaboration, and engaging storytelling for a wide range of use cases.

About the Full-Time Position

We are seeking experienced Full Stack Developers to join our innovative team on a full-time, hybrid basis. As part of koolio.ai, you will work on a next-gen AI-powered platform, shaping the future of audio content creation. You’ll collaborate with cross-functional teams to deliver scalable, high-performance web applications, handling client- and server-side development. This role offers a unique opportunity to contribute to a rapidly growing platform with a global reach and thrive in a fast-moving, self-learning startup environment where adaptability and innovation are key.

Key Responsibilities:

  • Collaborate with teams to implement new features, improve current systems, and troubleshoot issues as we scale
  • Design and build efficient, secure, and modular client-side and server-side architecture
  • Develop high-performance web applications with reusable and maintainable code
  • Work with audio/video processing libraries for JavaScript to enhance multimedia content creation
  • Integrate RESTful APIs with Google Cloud Services to build robust cloud-based applications
  • Develop and optimize Cloud Functions to meet specific project requirements and enhance overall platform performance

Requirements and Skills:

  • Education: Degree in Computer Science or a related field
  • Work Experience: 6+ years of proven experience as a Full Stack Developer or in a similar role, with demonstrable expertise in building web applications at scale
  • Technical Skills:
  • Proficiency in front-end languages such as HTML, CSS, JavaScript, jQuery, and ReactJS
  • Strong experience with server-side technologies, particularly REST APIs, Python, Google Cloud Functions, and Google Cloud services
  • Familiarity with NoSQL and PostgreSQL databases
  • Experience working with audio/video processing libraries is a strong plus
  • Soft Skills:
  • Strong problem-solving skills and the ability to think critically about issues and solutions
  • Excellent collaboration and communication skills, with the ability to work effectively in a remote, diverse, and distributed team environment
  • Proactive, self-motivated, and able to work independently, balancing multiple tasks with minimal supervision
  • Keen attention to detail and a passion for delivering high-quality, scalable solutions
  • Other Skills: Familiarity with GitHub, CI/CD pipelines, and best practices in version control and continuous deployment

Compensation and Benefits:

  • Total Yearly Compensation: ₹25 LPA based on skills and experience
  • Health Insurance: Comprehensive health coverage provided by the company
  • ESOPs: An opportunity for wealth creation and to grow alongside a fantastic team

Why Join Us?

  • Be a part of a passionate and visionary team at the forefront of audio content creation
  • Work on an exciting, evolving product that is reshaping the way audio content is created and consumed
  • Thrive in a fast-moving, self-learning startup environment that values innovation, adaptability, and continuous improvement
  • Enjoy the flexibility of a full-time hybrid position with opportunities to grow professionally and expand your skills
  • Collaborate with talented professionals from around the world, contributing to a product that has a real-world impact


Read more
Saptang Labs

at Saptang Labs

2 candid answers
Kamaleshm B
Posted by Kamaleshm B
Chennai
1 - 2 yrs
₹4L - ₹7L / yr
Engineering Management
skill iconJava
skill iconNodeJS (Node.js)
skill iconPython
skill iconAndroid Development
+4 more

Responsibilities:

• Analyze and understand business requirements and translate them into efficient, scalable business logic.

• Develop, test, and maintain software that meets new requirements and integrates well with existing systems.

• Troubleshoot and debug software issues and provide solutions.

• Collaborate with cross-functional teams, including product managers, designers, and developers, to deliver high-quality products.

• Write clean, maintainable, and efficient code.

• Participate in code reviews and provide constructive feedback to peers.

• Communicate effectively with team members and stakeholders to understand requirements and provide updates.


Required Skills:

• Strong problem-solving skills with the ability to analyze complex issues and provide solutions.

• Ability to quickly understand new problem statements and translate them into functional business logic.

• Proficiency in at least one programming language such as Java, Node.js, or C/C++.

• Strong understanding of software development life cycle (SDLC).

• Excellent communication skills, both verbal and written.

• Team player with the ability to collaborate effectively with different teams.


Preferred Qualifications:

• Experience with Java, Golang, or Rust is a plus.

• Familiarity with cloud platforms, microservices architecture, and API development.

• Prior experience working in an agile environment.

• Strong debugging and optimization skills.


Educational Qualifications:

• Bachelor's degree in Computer Science, Engineering, related field, or equivalent work experience.

Read more
Smartan.ai

at Smartan.ai

2 candid answers
Aadharsh M
Posted by Aadharsh M
Chennai
4 - 8 yrs
₹5L - ₹15L / yr
skill iconPython
NumPy
TensorFlow
PyTorch
Google Cloud Platform (GCP)
+4 more

Role Overview:

We are seeking a highly skilled and motivated Data Scientist to join our growing team. The ideal candidate will be responsible for developing and deploying machine learning models from scratch to production level, focusing on building robust data-driven products. You will work closely with software engineers, product managers, and other stakeholders to ensure our AI-driven solutions meet the needs of our users and align with the company's strategic goals.


Key Responsibilities:

  • Develop, implement, and optimize machine learning models and algorithms to support product development.
  • Work on the end-to-end lifecycle of data science projects, including data collection, preprocessing, model training, evaluation, and deployment.
  • Collaborate with cross-functional teams to define data requirements and product taxonomy.
  • Design and build scalable data pipelines and systems to support real-time data processing and analysis.
  • Ensure the accuracy and quality of data used for modeling and analytics.
  • Monitor and evaluate the performance of deployed models, making necessary adjustments to maintain optimal results.
  • Implement best practices for data governance, privacy, and security.
  • Document processes, methodologies, and technical solutions to maintain transparency and reproducibility.
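The end-to-end lifecycle above (data collection, preprocessing, training, evaluation) can be sketched in miniature. This is a deliberately toy example on synthetic data, using a closed-form least-squares fit in place of a real training loop:

```python
import numpy as np

# "Collection": synthetic data from a known linear relationship y = 3x + 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=100)

# "Preprocessing": append a bias column so the intercept can be learned.
Xb = np.column_stack([X[:, 0], np.ones(100)])

# "Training": closed-form least-squares fit of the weights.
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# "Evaluation": mean squared error of the fitted model.
mse = float(np.mean((Xb @ w - y) ** 2))
print(w)    # approximately [3.0, 1.0]
print(mse)  # small, on the order of the noise variance
```

In production, each of these stages becomes its own pipeline step with monitoring, which is what the deployment and model-monitoring responsibilities above refer to.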


Qualifications:

  • Bachelor's or Master's degree in Data Science, Computer Science, Engineering, or a related field.
  • 5+ years of experience in data science, machine learning, or a related field, with a track record of developing and deploying products from scratch to production.
  • Strong programming skills in Python and experience with data analysis and machine learning libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch).
  • Experience with cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker).
  • Proficiency in building and optimizing data pipelines, ETL processes, and data storage solutions.
  • Hands-on experience with data visualization tools and techniques.
  • Strong understanding of statistics, data analysis, and machine learning concepts.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work collaboratively in a fast-paced, dynamic environment.


Preferred Qualifications:

  • Knowledge of microservices architecture and RESTful APIs.
  • Familiarity with Agile development methodologies.
  • Experience in building taxonomy for data products.
  • Strong communication skills and the ability to explain complex technical concepts to non-technical stakeholders.
Read more
Optimum

Optimum

Agency job
via Pluginlive by Harsha Saggi
Chennai, Bengaluru (Bangalore)
3 - 14 yrs
₹15L - ₹26L / yr
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconPython
SQL

Company: Optimum Solutions

About the company: Optimum Solutions is a leader in the sheet metal industry, providing sheet metal solutions to fabricators with a proven track record of reliable product delivery. From tools to software and machines, we are a one-stop shop for all your technology needs.

Role Overview:

  • Creating and managing database schemas that represent and support business processes; hands-on experience with SQL queries and managing database server deployments
  • Implementing automated testing platforms, unit tests, and CI/CD pipelines
  • Proficient understanding of code versioning tools such as GitHub, Bitbucket, and Azure DevOps (ADO)
  • Understanding of container platforms such as Docker

Job Description

  • We are looking for a strong Python developer with knowledge of machine learning and deep learning frameworks.
  • Your primary focus will be working with the product and use-case delivery team on prompting for different Gen-AI use cases
  • You will be responsible for prompting and building use-case pipelines
  • Perform evaluation of all Gen-AI features and use-case pipelines

Position: AI ML Engineer

Location: Chennai (Preference) and Bangalore

Minimum Qualification:  Bachelor's degree in Computer Science, Software Engineering, Data Science, or a related field.

Experience:  4-6 years

CTC: 16.5 - 17 LPA

Employment Type:  Full Time

Key Responsibilities:

  • Own the entire prompt lifecycle: prompt design, prompt template creation, and prompt tuning/optimization for various Gen-AI base models
  • Design and develop prompts suiting project needs
  • Lead and manage a team of prompt engineers
  • Stakeholder management across business and domains as required for the projects
  • Evaluate base models and benchmark performance
  • Implement prompt guardrails to prevent attacks such as prompt injection, jailbreaking, and prompt leaking
  • Develop, deploy, and maintain auto-prompt solutions
  • Design and implement minimum design standards for every use case involving prompt engineering
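As an illustration of the guardrail responsibility above, here is a deliberately simplified input filter that screens for common prompt-injection phrasings before a request reaches the model. Production guardrails typically combine such heuristics with classifier models; the patterns here are illustrative only:

```python
import re

# Hypothetical, intentionally simple denylist of prompt-injection phrasings.
INJECTION_PATTERNS = [
    r"ignore (all|any|previous) instructions",
    r"reveal (your|the) system prompt",
    r"you are now",
]

def is_suspicious(user_input: str) -> bool:
    # Flag input if any known injection pattern appears (case-insensitive).
    text = user_input.lower()
    return any(re.search(p, text) for p in INJECTION_PATTERNS)

print(is_suspicious("Summarize this article for me"))          # False
print(is_suspicious("Ignore previous instructions and ..."))   # True
```

A flagged input would be rejected or routed to a stricter review path rather than forwarded to the base model.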

Skills and Qualifications

  • Strong proficiency with Python, the Django framework, and regular expressions
  • Good understanding of the machine learning frameworks PyTorch and TensorFlow
  • Knowledge of Generative AI and RAG (retrieval-augmented generation) pipelines
  • Strong grasp of microservice design patterns and building scalable applications
  • Ability to build and consume REST APIs
  • Ability to fine-tune and optimize code for better performance
  • Strong understanding of OOP and design thinking
  • Understanding of the nature of asynchronous programming and its quirks and workarounds
  • Good understanding of server-side templating languages
  • Understanding of accessibility and security compliance, and of user authentication and authorization across multiple systems, servers, and environments
  • Integration of APIs, multiple data sources, and databases into one system
  • Good knowledge of API gateways and proxies such as WSO2, Kong, nginx, and Apache HTTP Server
  • Understanding of the fundamental design principles behind scalable, distributed applications
  • Good working knowledge of microservices architecture: behaviour, dependencies, scalability, etc.
  • Experience deploying on cloud platforms such as Azure or AWS
  • Familiarity and working experience with DevOps tools such as Azure DevOps, Ansible, Jenkins, and Terraform
Read more