
50+ Python Jobs in India

Apply to 50+ Python Jobs on CutShort.io. Find your next job, effortlessly. Browse Python Jobs and apply today!

B2B Automation Platform

Agency job
via AccioJob by AccioJob Hiring Board
Noida
0 - 1 yrs
₹4L - ₹5L / yr
DSA
Python
Django
Flask

AccioJob is conducting an offline hiring drive with B2B Automation Platform for the position of SDE Trainee Python.


Link for registration: https://go.acciojob.com/6kT7Ea


Position: SDE Trainee Python – DSA, Python, Django/Flask


Eligibility Criteria:

  • Degree: B.Tech / BE / MCA
  • Branch: CS / IT
  • Work Location: Noida

Compensation:

  • CTC: ₹4 - ₹5 LPA
  • Service Agreement: 2-year commitment

Note:

Candidates must be available for face-to-face interviews in Noida and should be ready to join immediately.


Evaluation Process:

Round 1: Assessment at AccioJob Noida Skill Centre

Further Rounds (for shortlisted candidates):

  • Technical Interview 1
  • Technical Interview 2
  • Tech + Managerial Round (Face-to-Face)

Important:

Please bring your laptop for the assessment.


Link for registration: https://go.acciojob.com/6kT7Ea

Deqode

Posted by Apoorva Jain
Indore
0 - 2 yrs
₹6L - ₹12L / yr
Blockchain
ETL
Artificial Intelligence (AI)
Generative AI
Python
+3 more

About Us

Alfred Capital is a next-generation on-chain proprietary quantitative trading technology provider, pioneering fully autonomous algorithmic systems that reshape trading and capital allocation in decentralized finance.


As a sister company of Deqode — a 400+ person blockchain innovation powerhouse — we operate at the cutting edge of quant research, distributed infrastructure, and high-frequency execution.


What We Build

  • Alpha Discovery via On‑Chain Intelligence — Developing trading signals using blockchain data, CEX/DEX markets, and protocol mechanics.
  • DeFi-Native Execution Agents — Automated systems that execute trades across decentralized platforms.
  • ML-Augmented Infrastructure — Machine learning pipelines for real-time prediction, execution heuristics, and anomaly detection.
  • High-Throughput Systems — Resilient, low-latency engines that operate 24/7 across EVM and non-EVM chains, tuned for high-frequency trading (HFT) and real-time response.
  • Data-Driven MEV Analysis & Strategy — We analyze mempools, order flow, and validator behaviors to identify and capture MEV opportunities ethically—powering strategies that interact deeply with the mechanics of block production and inclusion.


Evaluation Process

  • HR Discussion – A brief conversation to understand your motivation and alignment with the role.
  • Initial Technical Interview – A quick round focused on fundamentals and problem-solving approach.
  • Take-Home Assignment – Assesses research ability, learning agility, and structured thinking.
  • Assignment Presentation – Deep-dive into your solution, design choices, and technical reasoning.
  • Final Interview – A concluding round to explore your background, interests, and team fit in depth.
  • Optional Interview – In specific cases, an additional round may be scheduled to clarify certain aspects or conduct further assessment before making a final decision.


Blockchain Data & ML Engineer


As a Blockchain Data & ML Engineer, you’ll work on ingesting and modeling on-chain behavior, building scalable data pipelines, and designing systems that support intelligent, autonomous market interaction.


What You’ll Work On

  • Build and maintain ETL pipelines for ingesting and processing blockchain data.
  • Assist in designing, training, and validating machine learning models for prediction and anomaly detection (a minimal sketch follows this list).
  • Evaluate model performance, tune hyperparameters, and document experimental results.
  • Develop monitoring tools to track model accuracy, data drift, and system health.
  • Collaborate with infrastructure and execution teams to integrate ML components into production systems.
  • Design and maintain databases and storage systems to efficiently manage large-scale datasets.
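
For a flavor of the anomaly-detection work above, here is a minimal, illustrative sketch on synthetic transfer data using pandas and scikit-learn (both named under "Bonus Points For"). The column names, distributions, and contamination rate are assumptions for illustration, not the actual pipeline.

```python
# Illustrative only: flag anomalous transfers with an Isolation Forest.
# Synthetic data; a real pipeline would ingest decoded on-chain events.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
transfers = pd.DataFrame({
    "value_eth": rng.lognormal(mean=0.0, sigma=1.0, size=1000),
    "gas_price_gwei": rng.normal(loc=30, scale=5, size=1000).clip(min=1),
})

model = IsolationForest(contamination=0.01, random_state=42)
transfers["anomaly"] = model.fit_predict(transfers)

# -1 marks outliers, e.g. unusually large transfers at odd gas prices.
print(transfers[transfers["anomaly"] == -1].head())
```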


Ideal Traits

  • Strong in data structures, algorithms, and core CS fundamentals.
  • Proficiency in any programming language.
  • Curiosity about how blockchain systems and crypto markets work under the hood.
  • Self-motivated, eager to experiment and learn in a dynamic environment.


Bonus Points For

  • Hands-on experience with pandas, numpy, scikit-learn, or PyTorch.
  • Side projects involving automated ML workflows, ETL pipelines, or crypto protocols.
  • Participation in hackathons or open-source contributions.


What You’ll Gain

  • Cutting-Edge Tech Stack: You'll work on modern infrastructure and stay up to date with the latest trends in technology.
  • Idea-Driven Culture: We welcome and encourage fresh ideas. Your input is valued, and you're empowered to make an impact from day one.
  • Ownership & Autonomy: You’ll have end-to-end ownership of projects. We trust our team and give them the freedom to make meaningful decisions.
  • Impact-Focused: Your work won’t be buried under bureaucracy. You’ll see it go live and make a difference in days, not quarters.


What We Value:

  • Craftsmanship over shortcuts: We appreciate engineers who take the time to understand the problem deeply and build durable solutions—not just quick fixes.
  • Depth over haste: If you're the kind of person who enjoys going one level deeper to really "get" how something works, you'll thrive here.
  • Invested mindset: We're looking for people who don't just punch tickets, but care about the long-term success of the systems they build.
  • Curiosity with follow-through: We admire those who take the time to explore and validate new ideas, not just skim the surface.


Compensation:

  • INR 6 - 12 LPA
  • Performance Bonuses: Linked to contribution, delivery, and impact.
EaseMyTrip.com

Posted by Madhu Sharma
Gurugram
5 - 6 yrs
₹10L - ₹15L / yr
Python
Generative AI
React.js
  • Strong hands-on experience in Generative AI / LLMs / NLP (OpenAI, LangChain, Hugging Face, etc.).
  • Proficiency in Python for AI/ML model development and backend integration.
  • Experience with React JS for building frontend applications.
  • Familiarity with REST APIs, CI/CD, and agile environments.
  • Solid understanding of data structures, algorithms, and system design.
Client based in Pune.
Remote only
5 - 10 yrs
₹15L - ₹20L / yr
BigID
SME
Subject-matter expert
BigID Developer
Python
+4 more

Job Title: Senior BigID Developer & Operations Specialist (with NLP Expertise)

Location: Remote

Experience Level: 5+ Years


Full-time contract role; initial 6-month contract.


Job Summary: We are seeking a highly skilled and experienced BigID Subject Matter Expert (SME) to join our Data Privacy and Governance team. This role is crucial for both the strategic deployment and ongoing operational management of our BigID platform. The ideal candidate will possess deep technical expertise in BigID, strong development skills in Python and Regular Expressions (RegEx), and hands-on experience with Natural Language Processing (NLP) using libraries like SpaCy. You will be instrumental in designing, implementing, and optimizing BigID solutions, ensuring data classification accuracy, and supporting day-to-day operations to meet our stringent data privacy and compliance objectives.

Key Responsibilities:

  • BigID Platform Expertise:
  • Act as the primary SME for BigID, providing expert guidance on its capabilities, best practices, and limitations.
  • Lead the design, implementation, and configuration of BigID solutions across various data sources (structured, unstructured, cloud, on-premise).
  • Develop and manage BigID policies, classification rules, and sensitive data discovery patterns.
  • Configure and optimize BigID scans and data source integrations.
  • Development & Automation (RegEx & Python):
  • Develop and optimize complex Regular Expression (RegEx) patterns for accurate and efficient identification of sensitive data, PII, and custom data types within BigID (see the sketch after this list).
  • Write robust Python scripts for BigID API integration, data manipulation, automation of operational tasks, and custom workflow development.
  • Build custom connectors and data pipelines to integrate BigID with other enterprise systems and data sources.
  • Natural Language Processing (NLP):
  • Leverage SpaCy and other NLP libraries/techniques to enhance BigID's data classification capabilities, particularly for unstructured data.
  • Develop and fine-tune custom NLP models for advanced entity recognition, sentiment analysis, and intelligent sensitive data detection.
  • Apply NLP to improve the accuracy and context of data classifications within BigID.
  • Operational Management & Support:
  • Provide expert day-to-day operational support for the BigID platform, including monitoring system performance, troubleshooting issues, and implementing timely resolutions.
  • Manage BigID job performance, identify bottlenecks, and optimize configurations for scalability and efficiency.
  • Perform regular health checks, maintenance activities, and upgrades for the BigID environment.
  • Develop and maintain comprehensive documentation for BigID configurations, customizations, operational procedures, and incident response.
  • Collaboration & Compliance:
  • Collaborate closely with Data Governance, Security, Privacy, and Legal teams to align BigID capabilities with evolving business requirements and regulatory frameworks (e.g., GDPR, CCPA, HIPAA).
  • Translate privacy and security requirements into technical specifications for BigID implementation.
  • Support data privacy assessments, data subject access requests (DSARs), and other compliance-related activities using BigID.
  • Stay abreast of BigID product updates, industry best practices, and emerging privacy regulations.
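
To make the RegEx and SpaCy expectations above concrete, here is a minimal, hedged sketch of layered sensitive-data detection: a regex pass for pattern-based identifiers plus a spaCy pass for named entities. The patterns and model are simplified illustrations, not BigID internals (BigID classifiers are configured within the platform itself).

```python
# Illustrative sketch: combine RegEx and spaCy NER for sensitive-data detection.
# Patterns are simplified examples, not production-grade PII rules.
import re
import spacy

EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b")
PAN_RE = re.compile(r"\b[A-Z]{5}[0-9]{4}[A-Z]\b")  # Indian PAN format (illustrative)

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def scan(text: str) -> dict:
    """Return pattern-based and entity-based findings for a text snippet."""
    doc = nlp(text)
    return {
        "emails": EMAIL_RE.findall(text),
        "pan_numbers": PAN_RE.findall(text),
        "entities": [(ent.text, ent.label_) for ent in doc.ents],  # e.g. PERSON, ORG
    }

print(scan("Contact Priya Sharma at priya@example.com, PAN ABCDE1234F."))
```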

Required Skills & Experience:

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
  • At least 5 years of professional experience in data engineering, data governance, privacy engineering, or a related field.
  • Strong hands-on experience (2+ years) with BigID implementation, configuration, policy management, and data source integration.
  • Expert-level proficiency in Regular Expressions (RegEx) for advanced pattern matching and sensitive data discovery.
  • Advanced Python scripting skills, including experience with data manipulation, API integration, and automation frameworks.
  • Demonstrable experience with Natural Language Processing (NLP) libraries and techniques, particularly SpaCy, for text analysis, entity recognition, and data classification.
  • Familiarity with REST APIs, JSON, and data ingestion pipelines.
  • Experience working with structured and unstructured data across various platforms (e.g., cloud platforms like AWS S3, Azure Blob, Google Cloud Platform; SQL/NoSQL databases; file systems).
  • Solid understanding of data privacy regulations (GDPR, CCPA, HIPAA, etc.) and data governance principles.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work independently and as part of a collaborative team in a fast-paced environment.

Nice to Have:

  • BigID Certified Professional certification.
  • Experience with BigID App Framework or BigID Studio for building custom connectors or workflows.
  • Exposure to AI/ML-driven data classification or custom NLP model training.
  • Cloud platform certifications (AWS, Azure, GCP).
  • Working knowledge of Identity and Access Management (IAM) tools and data security policies.
  • Experience with other data governance or privacy platforms.


Engineering and technology company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
12 - 15 yrs
₹20L - ₹25L / yr
DevOps
Android.mk
Gradle
CI/CD
Jenkins
+5 more

Job Overview

  • 8–12 years of experience in DevOps/system debugging.
  • Develop and maintain automated continuous integration and deployment (CI/CD) infrastructure.
  • Experience creating automated CI/CD pipelines using tools like GitLab.
  • Demonstrated capability with CI/CD tools such as Jenkins, Git/Gerrit, and JFrog (Artifactory, Xray, Pipelines).
  • Strong development expertise in Python and Linux scripting languages.
  • Strong knowledge of UNIX and Linux.
  • Knowledge of the Android build system (Android.mk, Android.bp, Gradle).
  • Unit testing/integration testing and code-coverage tools.
  • Knowledge of deploying containers using containerization tools like Docker.
  • Excellent problem-solving and debugging skills; able to take ownership of the CI/CD configuration.
  • Eliminate variation by working with global engineering teams to define and implement common processes and configurations that work for all projects.
  • Maintain and update current scripts/tools to support evolving software.
  • Good team player who follows agile development methodologies and ASPICE practices as part of the software development lifecycle.
  • Good understanding of quality control and test automation in an Agile-based continuous integration environment.
Remote only
0 - 1 yrs
₹5000 - ₹5500 / mo
Python
DBMS
Amazon Web Services (AWS)

Job Description:

Company: Springer Capital

Type: Internship (Remote, Part-Time/Full-Time)

Duration: 3–6 months

Start Date: Rolling

Compensation:


About the role:

We’re building high-performance backend systems that power our financial and ESG intelligence platforms, and we want you on the team. As a Backend Engineering Intern, you’ll help us develop scalable APIs, automate data pipelines, and deploy secure cloud infrastructure. This is your chance to work alongside experienced engineers, contribute to real products, and see your code go live.


What You'll Work On:

As a Backend Engineering Intern, you’ll be shaping the systems that power financial insights.


Engineering scalable backend services in Python, Node.js, or Go


Designing and integrating RESTful APIs and microservices


Working with PostgreSQL, MongoDB, or Redis for data persistence


Deploying on AWS/GCP, using Docker, and learning Kubernetes on the fly


Automating infrastructure and shipping faster with CI/CD pipelines


Collaborating with a product-focused team that values fast iteration


What We’re Looking For:


A builder mindset – you like writing clean, efficient code that works


Strong grasp of backend languages (Python, Java, Node, etc.)


Understanding of cloud platforms and containerization basics


Basic knowledge of databases and version control


Students or self-taught engineers actively learning and building


Preferred skills:


Experience with serverless or event-driven architectures


Familiarity with DevOps tools or monitoring systems


A curious mind for AI/ML, fintech, or real-time analytics


What You’ll Get:


Real-world experience solving core backend problems


Autonomy and ownership of live features


Mentorship from engineers who’ve built at top-tier startups


A chance to grow into a full-time offer

Hypersonix Inc

Posted by Reshika Mendiratta
Remote only
9+ yrs
Upto ₹30L / yr (Varies)
Data Analytics
SQL
MS-Excel
Python
R Programming
+5 more

Role overview

As a Data Analyst, you should be able to propose creative solutions to develop or solve a business problem, and to recommend, design, and develop state-of-the-art data-driven analyses, applying an understanding of statistics and advanced analytics methodologies to solve business problems and recommend insights. Form hypotheses and run experiments to gain empirical insights and validate them. Identify and eliminate possible obstacles and propose alternative creative solutions.


Roles and Responsibilities: -

  • Identify opportunities and partner with key stakeholders to set priorities, manage expectations, facilitate the change required to activate insights, and measure the impact.
  • Deconstruct problems and goals to form a clear picture for hypothesis generation, and use best practices around decision science approaches and technology to solve business challenges.
  • Guide the team to integrate custom analytical solutions (e.g., predictive modeling, segmentation, issue tree frameworks) to support data-driven decision-making.
  • Monitor and manage the project baseline (scope, budget, and schedule) to ensure activities occur as planned, and manage variances.
  • Anticipate problems before they occur; define the problem or risk; identify possible causes; work with the team to identify solutions; select and implement the most appropriate solution.
  • Identify potential points of contention for missed deliverables; create and implement strategies to mitigate shortfalls in timeline and budget.
  • Develop and manage plans to address project strengths, weaknesses, opportunities, and threats.
  • Translate and communicate results, recommendations, and opportunities to improve data solutions to internal and external leadership with easily consumable reports and presentations.
  • Act independently to deliver projects to schedule, budget, and scope; self-driven and motivated, with support provided as required and requested.
  • Manage multiple clients, lead technical client calls, and act as a bridge between product teams and clients.


Experience Required:

  • 9+ years of experience.
  • Experience in design and review of new solution concepts and leading the delivery of high-impact analytics solutions and programs for global clients
  • Should be able to apply domain knowledge to functional areas like market size estimation, business growth strategy, strategic revenue management, marketing effectiveness
  • Have business acumen to manage revenues profitably and meet financial goals consistently. Able to quantify business value for clients and create win-win commercial propositions.
  • Must have the ability to adapt to changing business priorities in a fast-paced business environment
  • Should have the ability to handle structured /unstructured data and have prior experience in loading, validating, and cleaning various types of data
  • Should have a very good understanding of data structures and algorithms
  • This is a Remote (work from home) position.
  • Experience leading and working independently on projects in a fast-paced environment
  • Management skills to handle more than one large, complex project simultaneously.
  • Strong communication and interpersonal skills (includes negotiation)
  • Excellent written and verbal communication skills


Must have technical skills: -

  • IT background with experience across the systems development life cycle and all project phases – plan, initiate, elaborate, design, build, test, implement.
  • Working knowledge of market-leading data analytics tools such as Spotfire, Tableau, Power BI, and SAP HANA is desired.
  • Domain experience in retail/e-commerce is a plus.
  • Well versed in advanced SQL and Excel.
  • Good with scripting and data extraction in Python, R, etc.
  • Working knowledge of project management methodology, tools and templates (includes program/project planning, schedule development, scope management and cost management)
QAgile Services

Posted by Radhika Chotai
Gurugram
5 - 10 yrs
₹15L - ₹22L / yr
Large Language Models (LLM)
Generative AI
Python
LangChain
Windows Azure
+3 more


Role Title: Senior LLM Engineer - GenAI / ML (Python, LangChain)

Role Overview

We are seeking highly skilled and experienced Senior LLM Engineers with a strong background in Machine Learning and Software Engineering who have transitioned into Generative AI (GenAI) and Large Language Models (LLMs) over the past 3-4 years. This is a hands-on engineering role focused on designing, building, and deploying GenAI-based systems using state-of-the-art frameworks and tools.

The role involves active participation in architectural design, model fine-tuning, and cross-functional collaboration with business stakeholders, data teams, and engineering leaders to deliver enterprise-grade GenAI solutions.

Key Responsibilities

  • GenAI System Design: Architect and develop GenAI/LLM-based systems using frameworks like LangChain and Retrieval-Augmented Generation (RAG) pipelines (a minimal RAG sketch follows this list).
  • AI Solution Delivery: Translate complex business requirements into scalable, production-ready AI solutions.
  • Cross-functional Collaboration: Work closely with business SMEs, product owners, and data engineering teams to align AI models with real-world use cases.
  • System Optimization: Contribute to code reviews, system architecture discussions, and performance tuning of deployed models.
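
As a minimal illustration of the RAG pattern named above (deliberately framework-free rather than a production LangChain pipeline), the sketch below retrieves the best-matching snippet by similarity over toy embeddings and grounds a prompt with it; the embed() function is a stand-in assumption for a real embedding model.

```python
# Minimal RAG sketch: retrieve the closest document, then build a grounded prompt.
# embed() is a toy stand-in; real systems call an embedding model instead.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy embedding: normalized character-frequency vector."""
    vec = np.zeros(26)
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

docs = [
    "Refunds are processed within 5 business days.",
    "Premium support is available 24/7 for enterprise plans.",
]
doc_vecs = np.stack([embed(d) for d in docs])

query = "How long do refunds take?"
scores = doc_vecs @ embed(query)        # cosine similarity (vectors are normalized)
context = docs[int(np.argmax(scores))]  # retrieval step

prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # in a real pipeline this prompt is sent to an LLM
```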

Required Skills

  • 7-12 years of total experience in ML/Software Engineering, with 3-4 years of recent experience in LLMs and Generative AI.
  • Strong proficiency in Python, LangChain, and SQL.
  • Experience with agent frameworks.
  • Experience working with cloud platforms such as AWS, Azure, or GCP.
  • Solid understanding of ML pipelines, deployment strategies, and GenAI use cases.
  • Ability to work independently and collaboratively in fast-paced, cross-functional environments.
  • Strong verbal and written communication skills; ability to engage effectively with technical and non-technical stakeholders.

Preferred Qualifications

  • Minimum 1+ years of hands-on experience specifically in LLM/GenAI-focused implementations.
  • Experience delivering ML/AI products from prototyping through to production.
  • Familiarity with MLOps, CI/CD, containerization, and scalable AI model deployment.



Aeries Technology

Posted by Nikita Sinha
Bengaluru (Bangalore)
7 - 12 yrs
Upto ₹42L / yr (Varies)
DevOps
Java
Python
Groovy
C#

This role is part of the Quickbase Center of Excellence, a global initiative operated in partnership with Aeries, and offers an exciting opportunity to work on cutting-edge DevOps technologies with strong collaboration across teams in the US, Bulgaria, and India.

Key Responsibilities

  • Build and manage CI/CD pipelines across environments
  • Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC)
  • Develop internal tools and scripts to boost developer productivity
  • Set up and maintain monitoring, alerting, and performance dashboards
  • Collaborate with cross-functional engineering teams to ensure infrastructure scalability and security
  • Contribute to the DevOps Community of Practice by sharing best practices and tools
  • Continuously evaluate and integrate new technologies and DevOps trends

Skills & Experience Required

  • Strong scripting experience: Bash, PowerShell, Python, or Groovy
  • Hands-on with containerization tools like Docker and Kubernetes
  • Proficiency in Infrastructure as Code: Terraform, CloudFormation, or Azure Templates
  • Experience with CI/CD tools such as Jenkins, TeamCity, GitHub Actions, or CircleCI
  • Exposure to Serverless computing (AWS Lambda or Google App Engine)
  • Cloud experience with AWS, GCP, or Azure
  • Solid understanding of networking concepts: DNS, DHCP, SSL, subnets
  • Experience with monitoring tools and alerting platforms
  • Basic understanding of security principles and best practices
  • Prior experience working directly with software engineering teams

Preferred Qualifications

  • Bachelor’s degree in Computer Science or related discipline
  • Strong communication skills (verbal & written)
  • Ability to work effectively in a distributed, high-performance team
  • Passion for DevOps best practices and a continuous learning mindset
  • Customer-obsessed and committed to improving engineering efficiency

Why Join Us?

  • Quickbase Center of Excellence: Purpose-built team delivering excellence from Bangalore
  • Fast-Growing Environment: Be part of a growing company with strong career advancement
  • Innovative Tech Stack: Exposure to cutting-edge tech in cloud, AI, and DevOps tooling
  • Inclusive Culture: ERGs and leadership development programs to support growth
  • Global Collaboration: Work closely with teams across the US, Bulgaria, and India

About Quickbase

Quickbase is a leading no-code platform that empowers organizations to create enterprise applications without writing code. Founded in 1999 and trusted by over 6,000 customers, Quickbase helps companies connect data, streamline workflows, and achieve real-time insights.

Learn more: https://www.quickbase.com

Cognida

Posted by Srilatha Swarnam
Hyderabad
12 - 20 yrs
₹30L - ₹60L / yr
Architecture
Python
Fullstack Developer
User Interface (UI) Design
React.js
+1 more

About Cognida.ai:


Our Purpose is to boost your competitive advantage using AI and Analytics.

We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.

We Are technologists with keen business acumen - Forever curious, always on the front lines of technological advancements. Applying our latest learnings and tools to solve your everyday business challenges.

We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector deserves the opportunity to harness the power of AI to make better decisions and drive business value.

We See a world where our AI and Analytics solutions democratise decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.


Position: Python Fullstack Architect

Location: Hyderabad

Job Summary

We’re seeking a seasoned Python Fullstack Architect with 15+ years of experience to lead solution design, mentor teams, and drive technical excellence across projects. You'll work closely with stakeholders, contribute to architecture governance, and integrate modern technologies across the stack.

Key Responsibilities

  • Design and review Python-based fullstack solution architectures.
  • Guide development teams on best practices, modern frameworks, and cloud-native patterns.
  • Engage with clients to translate business needs into scalable technical solutions.
  • Stay current with tech trends and contribute to internal innovation initiatives.

Required Skills

  • Strong expertise in Python (Django/Flask/FastAPI) and frontend frameworks (React, Angular, etc.).
  • Cloud experience (AWS, Azure, or GCP) and DevOps/CI-CD setup.
  • Familiarity with enterprise tools: RabbitMQ, Kafka, OAuth2, PostgreSQL, MongoDB.
  • Solid understanding of microservices, API design, batch/stream processing.
  • Strong leadership, mentoring, and architectural problem-solving skills.


Product company for financial operations automation platform

Agency job
via Esteem leadership by Suma Raju
Hyderabad
4 - 6 yrs
₹20L - ₹25L / yr
Python
Java
Kubernetes
Google Cloud Platform (GCP)

Mandatory Criteria

  • Strong hands-on experience with Kubernetes, with at least 2 years in production environments.
  • Expertise in at least one public cloud platform (GCP preferred; AWS, Azure, or OCI).
  • Proficiency in backend programming with Python, Java, or Kotlin (at least one is required).
  • Strong backend experience.
  • Hands-on experience with BigQuery or Snowflake for data analytics and integration.


About the Role


We are looking for a highly skilled and motivated Cloud Backend Engineer with 4–7 years of experience, who has worked extensively on at least one major cloud platform (GCP, AWS, Azure, or OCI). Experience with multiple cloud providers is a strong plus. As a Senior Development Engineer, you will play a key role in designing, building, and scaling backend services and infrastructure on cloud-native platforms.

Note: Experience with Kubernetes is mandatory.

 

Key Responsibilities

  • Design and develop scalable, reliable backend services and cloud-native applications.
  • Build and manage RESTful APIs, microservices, and asynchronous data processing systems.
  • Deploy and operate workloads on Kubernetes with best practices in availability, monitoring, and cost-efficiency.
  • Implement and manage CI/CD pipelines and infrastructure automation.
  • Collaborate with frontend, DevOps, and product teams in an agile environment.
  • Ensure high code quality through testing, reviews, and documentation.

 

Required Skills

  • Strong hands-on experience with Kubernetes, with at least 2 years in production environments (mandatory).
  • Expertise in at least one public cloud platform (GCP preferred; AWS, Azure, or OCI).
  • Proficient in backend programming with Python, Java, or Kotlin (at least one is required).
  • Solid understanding of distributed systems, microservices, and cloud-native architecture.
  • Experience with containerization using Docker and Kubernetes-native deployment workflows.
  • Working knowledge of SQL and relational databases.

  

Preferred Qualifications

  • Experience working across multiple cloud platforms.
  • Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.
  • Exposure to monitoring, logging, and observability stacks (e.g., Prometheus, Grafana, Cloud Monitoring).
  • Hands-on experience with BigQuery or Snowflake for data analytics and integration (a short sketch follows below).
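
As a short, hedged sketch of the BigQuery work mentioned above (the dataset and table names are invented; it assumes the google-cloud-bigquery package is installed and application-default credentials are configured):

```python
# Hedged sketch: run an analytics query against BigQuery from a backend service.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

query = """
    SELECT status, COUNT(*) AS txn_count
    FROM `my_project.payments.transactions`  -- hypothetical table
    WHERE created_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    GROUP BY status
"""

for row in client.query(query).result():  # blocks until the job finishes
    print(row["status"], row["txn_count"])
```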

  

Nice to Have

  • Knowledge of NoSQL databases or event-driven/message-based architectures.
  • Experience with serverless services, managed data pipelines, or data lake platforms.
KJBN labs

Posted by Sakthi Ganesh
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Hadoop
Apache Kafka
Spark
Redshift
Python
+9 more

Senior Data Engineer Job Description

Overview

The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.

Key Responsibilities

  • Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data (a small PySpark sketch follows this list).
  • Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
  • Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
  • Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
  • Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
  • Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
  • Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
  • Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
  • Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.
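
As a small, hedged illustration of the pipeline work described above (paths, schema, and column names are invented for the example), here is a PySpark batch job that ingests raw events and writes a cleaned, partitioned table:

```python
# Illustrative PySpark batch job: ingest raw JSON events, clean, write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["event_id"])              # de-duplicate by event id
       .filter(F.col("user_id").isNotNull())      # drop malformed records
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partitioning by date keeps downstream scans cheap.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
spark.stop()
```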

Required Qualifications

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
  • Technical Skills:
      • Proficiency in programming languages such as Python, Java, or Scala.
      • Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
      • Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
      • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
      • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
      • Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
  • Soft Skills:
      • Excellent problem-solving and analytical skills.
      • Strong communication and collaboration abilities.
      • Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
  • Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.

Preferred Qualifications

  • Experience with real-time data processing and streaming architectures.
  • Familiarity with machine learning pipelines and MLOps practices.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
  • Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.

Work Environment

  • Location: Hybrid/Remote/On-site (depending on company policy).
  • Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
  • Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Product company for financial operations automation platform

Agency job
via Esteem leadership by Suma Raju
Hyderabad
4 - 5 yrs
₹20L - ₹25L / yr
Python
Kubernetes
Google Cloud Platform (GCP)
Java
Amazon Web Services (AWS)

Mandatory Criteria :

  • Strong hands-on experience with Kubernetes, with at least 2 years in production environments.
  • Expertise in at least one public cloud platform (GCP preferred; AWS, Azure, or OCI).
  • Proficiency in backend programming with Python, Java, or Kotlin (at least one is required).
  • Strong backend experience.
  • Hands-on experience with BigQuery or Snowflake for data analytics and integration.


About the Role


We are looking for a highly skilled and motivated Cloud Backend Engineer with 4–7 years of experience, who has worked extensively on at least one major cloud platform (GCP, AWS, Azure, or OCI). Experience with multiple cloud providers is a strong plus. As a Senior Development Engineer, you will play a key role in designing, building, and scaling backend services and infrastructure on cloud-native platforms.

Note: Experience with Kubernetes is mandatory.


Key Responsibilities

  • Design and develop scalable, reliable backend services and cloud-native applications.
  • Build and manage RESTful APIs, microservices, and asynchronous data processing systems.
  • Deploy and operate workloads on Kubernetes with best practices in availability, monitoring, and cost-efficiency.
  • Implement and manage CI/CD pipelines and infrastructure automation.
  • Collaborate with frontend, DevOps, and product teams in an agile environment.
  • Ensure high code quality through testing, reviews, and documentation.

 

Required Skills

  • Strong hands-on experience with Kubernetes, with at least 2 years in production environments (mandatory).
  • Expertise in at least one public cloud platform (GCP preferred; AWS, Azure, or OCI).
  • Proficient in backend programming with Python, Java, or Kotlin (at least one is required).
  • Solid understanding of distributed systems, microservices, and cloud-native architecture.
  • Experience with containerization using Docker and Kubernetes-native deployment workflows.
  • Working knowledge of SQL and relational databases.

  

Preferred Qualifications

  • Experience working across multiple cloud platforms.
  • Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.
  • Exposure to monitoring, logging, and observability stacks (e.g., Prometheus, Grafana, Cloud Monitoring).
  • Hands-on experience with BigQuery or Snowflake for data analytics and integration.

 

Nice to Have

  • Knowledge of NoSQL databases or event-driven/message-based architectures.
  • Experience with serverless services, managed data pipelines, or data lake platforms.


HeyCoach
Bengaluru (Bangalore)
0 - 1 yrs
₹1.3L - ₹1.5L / yr
Data Science
Python
Machine Learning (ML)
Natural Language Processing (NLP)
Statistical Analysis
+2 more

About the Role

We are seeking a motivated and knowledgeable Data Science Teaching Assistant Intern to support our academic team in delivering high-quality learning experiences. This role is ideal for someone who enjoys teaching, solving problems, and wants to gain hands-on experience in the EdTech and Data Science domain.


As a Teaching Assistant, you'll help learners understand complex data science topics, resolve doubts, assist during live classes, and contribute to high-quality content development.


Opportunity to receive a Pre-Placement Offer (PPO) based on performance.


Key Responsibilities

  • Assist instructors during live classes by providing support and addressing learners' queries.
  • Conduct doubt-solving sessions to help learners grasp difficult concepts in Data Science, Python, Machine Learning, and related topics.
  • Contribute to content creation and review, including assignments, quizzes, and learning materials.
  • Provide one-on-one academic support and mentoring to learners when needed.
  • Ensure a positive and engaging learning environment during sessions.


Requirements

  • Bachelor's degree in Data Science, CSE, Statistics, or a related field.
  • Strong foundation in Python, Statistics, Machine Learning, and Data Analysis.
  • Excellent communication and interpersonal skills.
  • Ability to break down technical concepts into simple explanations.
  • Prior experience in teaching, mentoring, or assisting is a plus.
  • Passion for education and helping others learn.


Perks

  • Hands-on teaching and mentoring experience.
  • Exposure to real-time learner interaction and feedback.
  • Mentorship from senior instructors and data science professionals.
  • Opportunity to receive a Pre-Placement Offer (PPO) based on performance.

Quanteon Solutions

Posted by DurgaPrasad Sannamuri
Hyderabad
5 - 8 yrs
₹6L - ₹20L / yr
Python
Automation
Manual testing
Functional testing

We’re looking for a strong QA Engineer with 5+ years of hands-on experience in Python to join a fast-paced team and contribute from Day 1.


What you’ll be doing:

🔹 Jump directly into writing Python scripts for web and API automation

🔹 Maintain and extend a Selenium automation framework developed in Python

🔹 Collaborate with developers and product teams to ensure high-quality releases

🔹 Own testing for core modules and APIs (a minimal automation sketch follows below)
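
A minimal sketch of the kind of script you'd write on day one (the URL and expectation are placeholders; it assumes Chrome and Selenium 4 are installed):

```python
# Minimal Selenium + pytest sketch for web automation.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

@pytest.fixture
def driver():
    drv = webdriver.Chrome()  # assumes a local Chrome/driver setup
    yield drv
    drv.quit()

def test_homepage_heading(driver):
    driver.get("https://example.com")            # placeholder app under test
    heading = driver.find_element(By.TAG_NAME, "h1")
    assert "Example" in heading.text             # placeholder expectation
```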


Must-Have Skills:

✅ Strong functional QA background

✅ Proficiency in Python (this is a must)

✅ Hands-on experience with Selenium automation using Python

✅ Ability to work with and manage existing Python-based automation frameworks

✅ Experience in Web and API testing

HeyCoach

Posted by DeepanRaj R
Bengaluru (Bangalore)
0 - 1 yrs
₹3.5L - ₹4L / yr
C++
Java
Python
Data Structures
Problem solving
+7 more

Location: HSR Sector 6, Bengaluru, India

Job Type: Full-Time - WFO

Work Timings: Wednesday to Sunday: 11:00 AM - 8:00 PM

Monday: 11:00 AM - 5:00 PM; Tuesday: Off

Salary: 3.5 - 4 LPA

Experience: 0-1 years


About HeyCoach:

We are an exceptional group of highly skilled individuals, passionate about addressing a fundamental challenge within the education industry. Our team consists of talented geeks who possess a deep understanding of the issues at hand and are dedicated to finding innovative solutions. In our quest for excellence, we are constantly seeking out remarkable individuals who can contribute to our growth and success.

Whether it's developing cutting-edge technologies, designing immersive learning experiences, or implementing groundbreaking teaching methodologies, we consistently strive for excellence.


About the role:

As a Competitive Programming Engineer at HeyCoach, you will play a pivotal role in building the backbone of essential tools that our learners will utilize to excel in interview preparation and competitive programming. This is a full-time position, ideal for individuals who have recently graduated and possess a strong background in Competitive Programming.


Responsibilities:

● Algorithmic Problem Solving: Demonstrate proficiency in solving complex algorithmic problems and challenges.

● Tool Development: Contribute to the design and development of tools that will aid learners in their competitive programming and interview preparation journey.

● Educational Content Support: Collaborate with the content development team to provide technical insights and support in creating educational content related to competitive programming.

● Quality Assurance: Ensure the quality and efficiency of tools and resources developed, with a keen eye for detail and functionality.

● Research and Development: Stay abreast of the latest trends and technologies in competitive programming and problem-solving domains. Contribute to ongoing research initiatives.

● Collaborative Teamwork: Work closely with cross-functional teams, including developers, educators, and content creators, to align tool development with educational objectives.


Qualifications:

● Bachelor's degree in Computer Science/Engineering or relevant field.

● Strong experience or knowledge of data structures, algorithms, and competitive programming principles.

● Proficiency in at least one programming language (e.g., Python, Java, C++).

● Excellent problem-solving skills and the ability to translate concepts into practical solutions.

● Recent graduates or candidates with relevant competitive programming internships are encouraged to apply.


Preferred Skills:

● Familiarity with educational technology tools and platforms.

● Passion for enhancing the learning experience for individuals aspiring to crack interviews.

● Effective communication and teamwork skills.

● Mandatory: regular practice on at least one of the platforms LeetCode, Codeforces, CodeChef, GeeksforGeeks, or TopCoder.

EZSpace Ventures OPC Pvt Ltd
Bhopal
5 - 10 yrs
₹5L - ₹15L / yr
Python
MERN Stack
Artificial Intelligence (AI)
Machine Learning (ML)

Job description


Brief Description

One of our clients is looking for a Lead Engineer in Bhopal with 5-10 years of experience. Candidates must have strong expertise in Python; additional experience in AI/ML, the MERN stack, or full-stack development is a plus.


Job Description

We are seeking a highly skilled and experienced Lead Engineer – Python AI to join our dynamic team. The ideal candidate will have a strong background in AI technologies, MERN stack, and Python full stack development, with a passion for building scalable and intelligent systems. This role involves leading development efforts, mentoring junior engineers, and collaborating with cross-functional teams to deliver cutting-edge AI-driven solutions.


Key Responsibilities:

  • Lead the design, development, and deployment of AI-powered applications using Python and MERN stack.
  • Architect scalable and maintainable full-stack solutions integrating AI models and data pipelines.
  • Collaborate with data scientists and product teams to integrate machine learning models into production systems.
  • Ensure code quality, performance, and security across all layers of the application.
  • Mentor and guide junior developers, fostering a culture of technical excellence.
  • Stay updated with emerging technologies in AI, data engineering, and full-stack development.
  • Participate in code reviews, sprint planning, and technical discussions.


Required Skills:

  • 5+ years of experience in software development with a strong focus on Python full stack and MERN stack.
  • Hands-on experience with AI technologies, machine learning frameworks (e.g., TensorFlow, PyTorch), and data processing tools.
  • Proficiency in MongoDB, Express.js, React.js, Node.js.
  • Strong understanding of RESTful APIs, microservices architecture, and cloud platforms (AWS, Azure, GCP).
  • Experience with CI/CD pipelines, containerization (Docker), and version control (Git).
  • Excellent problem-solving skills and ability to work in a fast-paced environment.


Education Qualification:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Certifications in AI/ML or Full Stack Development are a plus.


TalentLo

Posted by Satyansh A
Remote only
0 - 2 yrs
₹1L - ₹1L / yr
NumPy
pandas
Python
Scikit-Learn

Required Skills:

  • Basic understanding of machine learning concepts and algorithms
  • Proficiency in Python and relevant libraries (NumPy, Pandas, scikit-learn)
  • Familiarity with data preprocessing techniques
  • Knowledge of basic statistical concepts
  • Understanding of model evaluation metrics (a short example follows this list)
  • Basic experience with at least one deep learning framework (TensorFlow, PyTorch)
  • Strong analytical and problem-solving abilities
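
To ground the skills above, here is a small, self-contained example of training a model and computing evaluation metrics with scikit-learn (the dataset and model choice are purely illustrative):

```python
# Small end-to-end example: train/test split, fit a model, evaluate it.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, pred))
print("macro F1:", f1_score(y_test, pred, average="macro"))
```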

 

 

Application Process: Create your profile on our platform and submit your portfolio, GitHub profile, or sample projects.

https://www.talentlo.com/

Wissen Technology

Posted by Shrutika SaileshKumar
Remote, Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Python
SDET
BDD
SQL
Data Warehouse (DWH)
+2 more

Primary skill set: QA Automation, Python, BDD, SQL 

As Senior Data Quality Engineer you will:

  • Evaluate product functionality and create test strategies and test cases to assess product quality.
  • Work closely with the on-shore and offshore teams.
  • Validate multiple reports against the databases by running medium to complex SQL queries (a minimal illustration follows the skills list below).
  • Build a solid understanding of automation objects and integrations across various platforms and applications.
  • Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
  • Integrate with SCM infrastructure to establish a continuous build and test cycle using CI/CD tools.
  • Work comfortably in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
  • Establish processes and a tool set to maintain automation scripts and generate regular test reports.
  • Perform peer reviews to provide feedback and ensure the test scripts are flawless.

Core/Must have skills:

  • Excellent understanding of and hands-on experience in ETL/DWH testing, preferably with Databricks, paired with Python experience.
  • Hands-on experience with SQL (analytical functions and complex queries) along with knowledge of using SQL client utilities effectively.
  • Clear and crisp communication and commitment towards deliverables.
  • Experience in big data testing will be an added advantage.
  • Knowledge of Spark, Scala, Hive/Impala, and Python will be an added advantage.
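
As a minimal, hedged illustration of report-versus-database validation (an in-memory SQLite table stands in for the real warehouse, and the report reader is stubbed):

```python
# Minimal report-vs-database validation sketch; SQLite stands in for the warehouse.
import sqlite3

def fetch_report_total() -> int:
    """Stand-in for reading an aggregate out of a generated report."""
    return 300

def test_report_total_matches_source():
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE orders (id INTEGER, amount INTEGER);
        INSERT INTO orders VALUES (1, 100), (2, 200);
    """)
    (db_total,) = con.execute("SELECT SUM(amount) FROM orders").fetchone()
    assert db_total == fetch_report_total()
```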

Good to have skills:

  • Test automation using BDD/Cucumber or TestNG, combined with strong hands-on experience in Java with Selenium; working experience in WebdriverIO is especially valuable.
  • Ability to effectively articulate technical challenges and solutions.
  • Work experience with qTest, Jira, and WebdriverIO.


Deltek
Remote only
7 - 12 yrs
Best in industry
Python
Java
.NET
React.js
TypeScript
+1 more

Title - Principal Software Engineer

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

Business Summary :

The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

Principal Software Engineer

Position Responsibilities :

  • Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization (a minimal sketch follows this list).
  • Develop scalable, performant APIs for Deltek products
  • Accountability for the successful implementation of the requirements by the team.
  • Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
  • Undertake analysis, design, coding and testing activities of complex modules
  • Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
  • Participate in code reviews and provide mentorship to junior developers.
  • Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React, and suggest optimisations based on them.
  • Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
  • Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
  • Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.
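
As a hedged sketch of the OAuth2 integration work listed above (URLs and credentials are placeholders, and the client-credentials grant shown is one common flow, not necessarily the one a given integration uses):

```python
# Illustrative OAuth2 client-credentials flow; URLs and IDs are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"  # placeholder
API_URL = "https://api.example.com/v1/projects"      # placeholder

def get_token() -> str:
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=("my-client-id", "my-client-secret"),   # placeholder credentials
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def list_projects() -> list:
    headers = {"Authorization": f"Bearer {get_token()}"}
    resp = requests.get(API_URL, headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()
```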

Qualifications :

  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 8-10 years of experience, with sound programming skills in Python, the .NET platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (like PostgreSQL).
  • Experience in backend development and Apache Airflow (or equivalent framework).
  • Build APIs and optimize SQL queries with performance considerations.
  • Experience with Agile Development
  • Experience in writing and maintaining unit tests and using testing frameworks is desirable
  • Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
  • Strong desire to continually improve knowledge and skills through personal development activities and apply their knowledge and skills to continuous software improvement.
  • The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
  • Strong problem-solving and debugging skills.
  • Ability to work in an Agile environment and collaborate with cross-functional teams.
  • Familiarity with version control systems like Git.
  • Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.

MyOperator - VoiceTree Technologies

Posted by Vijay Muthu
Remote only
3 - 5 yrs
₹9L - ₹10L / yr
Python
Django
FastAPI
Microservices
Large Language Models (LLM)
+12 more

About Us:

MyOperator and Heyo are India’s leading conversational platforms empowering 40,000+ businesses with Call and WhatsApp-based engagement. We’re a product-led SaaS company scaling rapidly, and we’re looking for a skilled Software Developer to help build the next generation of scalable backend systems.


Role Overview:

We’re seeking a passionate Python Developer with strong experience in backend development and cloud infrastructure. This role involves building scalable microservices, integrating AI tools like LangChain/LLMs, and optimizing backend performance for high-growth B2B products.


Key Responsibilities:

  • Develop robust backend services using Python, Django, and FastAPI
  • Design and maintain scalable microservices architecture
  • Integrate LangChain/LLMs into AI-powered features (a minimal service sketch follows this list)
  • Write clean, tested, and maintainable code with pytest
  • Manage and optimize databases (MySQL/Postgres)
  • Deploy and monitor services on AWS
  • Collaborate across teams to define APIs, data flows, and system architecture
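
As a minimal, hedged sketch of the kind of service this role builds (the endpoint shape and the stubbed LLM call are illustrative placeholders, not MyOperator's actual API):

```python
# Minimal FastAPI microservice sketch with a stubbed LLM call.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Query(BaseModel):
    text: str

def llm_reply(text: str) -> str:
    """Stand-in for a LangChain/LLM call in the real service."""
    return f"Echo: {text}"

@app.post("/reply")
def reply(query: Query) -> dict:
    return {"reply": llm_reply(query.text)}

# Run locally with: uvicorn app:app --reload  (assumes this file is app.py)
```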


Must-Have Skills:

  • Python and Django
  • MySQL or Postgres
  • Microservices architecture
  • AWS (EC2, RDS, Lambda, etc.)
  • Unit testing using pytest
  • LangChain or Large Language Models (LLM)
  • Strong grasp of Data Structures & Algorithms
  • AI coding assistant tools (e.g., ChatGPT & Gemini)


Good to Have:

  • MongoDB or ElasticSearch
  • Go or PHP
  • FastAPI
  • React, Bootstrap (basic frontend support)
  • ETL pipelines, Jenkins, Terraform


Why Join Us?

  • 100% Remote role with a collaborative team
  • Work on AI-first, high-scale SaaS products
  • Drive real impact in a fast-growing tech company
  • Ownership and growth from day one
Robylon AI

Posted by Listings Robylon
Bengaluru (Bangalore)
0 - 2 yrs
₹5L - ₹6L / yr
Python
Generative AI
Prompt engineering

Role Overview

This is a 20% technical, 80% non-technical role designed for individuals who can blend technical know-how with strong operational and communication skills. You’ll be the bridge between our product and the client’s operations team.


Key Responsibilities


  • Collaborate with clients to co-design SOPs for resolving support queries across channels (chat, ticket, voice)
  • Scope and plan each integration: gather technical and operational requirements and convert them into an executable timeline with measurable success metrics (e.g., coverage %, accuracy, CSAT)
  • Lead integration rollouts and post-launch success loops: monitor performance, debug issues, fine-tune prompts and workflows
  • Conduct quarterly “AI health-checks” and continuously improve system effectiveness
  • Troubleshoot production issues, replicate bugs, ship patches, and write clear root-cause analyses (RCAs)
  • Act as the customer’s voice internally, channel key insights to product and engineering teams


Must-Have Qualifications


  • Engineering degree is a must; Computer Science preferred
  • Past experience in coding and a sound understanding of APIs is preferred
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • Experience working in SaaS, customer success, implementation, or operations roles
  • Analytical mindset with the ability to make data-driven decisions



DAITA

Posted by Reshika Mendiratta
Remote only
3 - 7 yrs
Upto ₹65L / yr (Varies)
NodeJS (Node.js)
Python
Java
Ruby on Rails (ROR)
Go Programming (Golang)
+13 more

Who We Are

DAITA is a German AI start-up. We’re transforming the fashion supply chain with AI-powered agents that automate the mundane, freeing teams to focus on creativity, strategy, and growth.

After a successful research phase spanning 8 countries across 3 continents—gathering insights from Indian cotton fields to German retailers—we’ve secured pre-seed funding and key industry partnerships.

Now, we’re building our MVP to deliver speed, precision, and ease to one of the world’s biggest industries.

We’re set on hypergrowth, aiming to redefine textiles with intelligent, scalable tech—and this is your chance to join the ground floor of something huge.


What You’ll Do

As our Chief Engineer, you’ll lead the technical charge to make our vision real, starting with our MVP in a 3–5 month sprint. You’ll:

  • Design and code an AI-driven agent system (leveraging machine learning and NLP) with integrated workflow automation to streamline and automate tasks in the textile supply chain, owning it from start to finish.
  • Develop backend systems, utilize cutting-edge tools, critically assess manpower needs beyond yourself, oversee a small support team, and drive toward our aggressive launch timeline.
  • Collaborate closely with our founders to align tech with ambitious goals and client input, ensuring automated workflows deliver speed, precision, and ease to textile industry stakeholders.
  • Build an MVP that scales to millions, integrating APIs and data pipelines, using major cloud platforms (AWS, Azure, Google Cloud)—keeping us nimble now and primed for explosive growth later.


What You Bring

  • 2–5 years of experience at high-growth startups or leading tech firms—where you shipped real products, solved complex problems, and moved fast.
  • End-to-end ownership: You've taken tech projects from zero to one—built systems from scratch, made architecture decisions, handled messy edge cases, and delivered under pressure.
  • Team Leadership: 1–3 years leading engineering teams, ideally including recruitment and delivery in India.
  • Technical horsepower: AI Agent Experience, strong across full-stack or backend engineering, ML/NLP integration, cloud architecture, and API/data pipeline development. Experience with workflow automation tools and platforms (e.g., Apache Airflow, UiPath, or similar) to automate processes, ideally in supply chain or textiles. You can code an MVP solo if needed.
  • Resource Clarity: Bring as much technical expertise as possible to build our MVP, and if you can’t own every piece, clearly identify the specific areas where you’ll need team members to deliver on time.
  • Vision Alignment: You think like a builder, taking ownership of the product and team as if it were your own, while partnering closely with the founders to execute their vision with trust and decisiveness.
  • Execution DNA: You ship fast, iterate intelligently, and know when to be scrappy vs. when to be solid.
  • Problem-First Thinking: You’re obsessed with solving real user problems, understanding stakeholder needs beyond just writing beautiful code.
  • High-Energy Leadership: Hands-on, humble, and always ready to jump into the trenches. You lead by doing.
  • Geographical Fit: India-based, ideally with previous exposure to international teams or founders.
  • Values-driven: You live our culture—live in the future, move fast, one team, and character above all.


Why Join Us?

  • Be the technical linchpin of a hypergrowth startup—build the MVP that launches us into the stratosphere.
  • Competitive salary and equity options to negotiate—own a piece of something massive.
  • On-site in our Tiruppur (Tamil Nadu) office for 2 months to sync with the German founders, then remote flexibility long-term.
  • A full-time role demanding full availability—put in the time needed to smash deadlines and reshape the second-biggest industry on Earth with a team that moves fast and rewards hustle.
Wissen Technology
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
Python
PySpark
Django
Flask
RESTful APIs
+3 more

Job title - Python Developer

Exp – 4 to 6 years

Location – Pune/Mumbai/Bengaluru


Please find below the job description.

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark (see the sketch after this list).
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.
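
For illustration, here is a minimal PySpark ETL step of the kind the responsibilities describe; the paths and column names are hypothetical.

```python
# Illustrative PySpark ETL: read raw CSV, clean it, write partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

clean = (
    raw.dropDuplicates(["order_id"])                  # de-duplicate on the key
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)                   # basic data-quality rule
)

clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")
```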


HaystackAnalytics
Posted by Careers Hr
Navi Mumbai
1 - 4 yrs
₹6L - ₹12L / yr
Rust
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Data Science
+2 more

Position – Python Developer

Location – Navi Mumbai


Who are we

Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India in the BIRAC Showcase event in Delhi, 2022.


Objectives of this Role:

  • Design and implement efficient, scalable backend services using Python.
  • Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
  • Build APIs, services, and scripts to support data processing pipelines and front-end applications.
  • Automate recurring tasks and ensure robust integration with cloud services.
  • Maintain high standards of software quality and performance using clean coding principles and testing practices.
  • Collaborate within the team to upskill and unblock each other for faster and better outcomes.





Primary Skills – Python Development

  • Proficient in Python 3 and its ecosystem
  • Frameworks: Flask / Django / FastAPI
  • RESTful API development
  • Understanding of OOPs and SOLID design principles
  • Asynchronous programming (asyncio, aiohttp)
  • Experience with task queues (Celery, RQ); see the sketch after this list
  • Rust programming experience for systems-level or performance-critical components
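
As a small illustration of the task-queue skill above, here is a minimal Celery sketch; the broker URL and task are hypothetical.

```python
# Minimal Celery task queue sketch (broker URL and task are placeholders).
from celery import Celery

app = Celery("pipeline", broker="redis://localhost:6379/0")

@app.task
def annotate_sample(sample_id: str) -> str:
    # Stand-in for a long-running pipeline step.
    return f"annotated {sample_id}"

# Enqueue from application code:
# annotate_sample.delay("SAMPLE-42")
```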

Testing & Automation

  • Unit Testing: PyTest / unittest
  • Automation tools: Ansible / Terraform (good to have)
  • CI/CD pipelines

DevOps & Cloud

  • Docker, Kubernetes (basic knowledge expected)
  • Cloud platforms: AWS / Azure / GCP
  • GIT and GitOps workflows
  • Familiarity with containerized deployment & serverless architecture

Bonus Skills

  • Data handling libraries: Pandas / NumPy
  • Experience with scripting: Bash / PowerShell
  • Functional programming concepts
  • Familiarity with front-end integration (REST API usage, JSON handling)

Other Skills

  • Innovation and thought leadership
  • Interest in learning new tools, languages, workflows
  • Strong communication and collaboration skills
  • Basic understanding of UI/UX principles


To know more about us: https://haystackanalytics.in




Service Based Co
Agency job via Vikash Technologies by Rishika Teja
Remote only
7 - 12 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
Databricks
Lakehouse
Python
Spark SQL

Job Description:


  • 7–10 years of data engineering experience, with 5+ years on Databricks and Apache Spark.
  • Expert-level hands-on experience with Databricks and AWS (S3, Glue, EMR, Kinesis, Lambda, IAM, CloudWatch).
  • Primary language: Python; strong skills in Spark SQL.
  • Deep understanding of Lakehouse architecture, Delta Lake, Parquet, Iceberg (see the sketch below).
  • Strong experience with Databricks Workflows, Unity Catalog, runtime upgrades, and cost optimization.
  • Experience with Databricks native monitoring tools and Datadog integration.
  • Security and compliance expertise across data governance and infrastructure layers.
  • Experience with CI/CD automation using Terraform, CloudFormation, and Git.
  • Hands-on experience with disaster recovery and multi-region architecture.
  • Strong problem-solving, debugging, and documentation skills.
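
As a sketch of the Delta Lake piece, here is a minimal write/read with time travel; the table path is hypothetical, and a Databricks runtime (or a local Spark session configured with delta-spark) is assumed.

```python
# Illustrative Delta Lake usage: write a table, read it back, time-travel to v0.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = spark.range(1000).withColumnRenamed("id", "event_id")

# Write a Delta table; MERGE, OPTIMIZE, etc. build on the same format.
events.write.format("delta").mode("overwrite").save("/mnt/lake/events")

latest = spark.read.format("delta").load("/mnt/lake/events")
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/mnt/lake/events")
print(latest.count(), v0.count())
```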


Wissen Technology
Posted by Poornima Varadarajan
Mumbai
1 - 8 yrs
₹8L - ₹20L / yr
Object Oriented Programming (OOPs)
Data Structures
Algorithms
Python

Experience in Python (backend only), data structures, OOPs, algorithms, Django, NumPy, etc.

• Good understanding of writing Unit Tests using PYTest.

• Good understanding of parsing XMLs and handling files using Python (see the sketch below).

• Good understanding with Databases/SQL, procedures and query tuning.

• Service Design Concepts, OO and Functional Development concepts.

• Agile Development Methodologies.

• Strong oral and written communication skills.

• Excellent interpersonal skills and a professional approach.
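
For illustration, a minimal sketch of XML parsing with the standard library plus a PyTest-style check; the document shape is hypothetical.

```python
# Parse a simple XML document and verify the result with a pytest test.
import xml.etree.ElementTree as ET

def parse_trades(xml_text: str) -> list[dict]:
    root = ET.fromstring(xml_text)
    return [
        {"id": t.get("id"), "qty": int(t.findtext("qty", default="0"))}
        for t in root.findall("trade")
    ]

def test_parse_trades():
    sample = "<trades><trade id='T1'><qty>100</qty></trade></trades>"
    assert parse_trades(sample) == [{"id": "T1", "qty": 100}]
```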


IT Company
Agency job via Jobdost by Saida Jabbar
Bengaluru (Bangalore), Hyderabad, Pune
4 - 8 yrs
₹20L - ₹25L / yr
Big Data
Amazon Web Services (AWS)
IaaS
Platform as a Service (PaaS)
VMS
+8 more

Job description

 

Job Title: Cloud Migration Consultant – (AWS to Azure)

 


Experience: 4+ years in application assessment and migration

 

About the Role

 

We’re looking for a Cloud Migration Consultant with hands-on experience assessing and migrating complex applications to Azure. You'll work closely with Microsoft business units, participating in Intake & Assessment and Planning & Design phases, creating migration artifacts, and leading client interactions. You’ll also support application modernization efforts in Azure, with exposure to AWS as needed.

 

Key Responsibilities

 

  • Assess application readiness and document architecture, dependencies, and migration strategy.
  • Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, Cloudockit, and PowerShell.
  • Create architecture diagrams and migration playbooks, and maintain Azure DevOps boards.
  • Set up applications both on-premises and in cloud environments (primarily Azure).
  • Support proof-of-concepts (PoCs) and advise on migration options.
  • Collaborate with application, database, and infrastructure teams to enable smooth transition to migration factory teams.
  • Track progress, blockers, and risks, reporting timely status to project leadership.


Required Skills

 

  • 4+ years of experience in cloud migration and assessment
  • Strong expertise in Azure IaaS/PaaS (VMs, App Services, ADF, etc.)
  • Familiarity with AWS IaaS/PaaS (EC2, RDS, Glue, S3)
  • Experience with Java (Spring Boot)/C#/.NET/Python, Angular/React.js, REST APIs
  • Working knowledge of Kafka, Docker/Kubernetes, Azure DevOps
  • Network infrastructure understanding (VNets, NSGs, Firewalls, WAFs)
  • IAM knowledge: OAuth, SAML, Okta/SiteMinder
  • Experience with Big Data tools like Databricks, Hadoop, Oracle, DocumentDB


Preferred Qualifications

 

  • Azure or AWS certifications
  •  Prior experience with enterprise cloud migrations (especially in Microsoft ecosystem)
  •  Excellent communication and stakeholder management skills


Educational qualification:

 

B.E/B.Tech/MCA

 

Experience :

 

4+ Years

 



Techno Comp
Posted by shravan c
Pune
6 - 8 yrs
₹5L - ₹9L / yr
ADF
Azure Data Factory
Python
Databricks


Job Title: Developer

Work Location: Pune, MH

Skills Required: Azure Data Factory

Experience Range in Required Skills: 6-8 Years

Job Description: Azure, ADF, Databricks, Python

Essential Skills: Azure, ADF, Databricks, Python

Desirable Skills: Azure, ADF, Databricks, Python

LearnTube.ai
Posted by Vidhi Solanki
Mumbai
2 - 5 yrs
₹8L - ₹18L / yr
Python
FastAPI
Amazon Web Services (AWS)
MongoDB
CI/CD
+5 more

Role Overview:


As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.


What You'll Do:


At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:

  • Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95); see the sketch after this list.
  • Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
  • Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
  • Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
  • Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
  • Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
  • Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
  • Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.
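
As a rough sketch of the kind of service described above, here is an async FastAPI endpoint with a read-through Redis cache; the route, key names, and TTL are hypothetical, not LearnTube's actual code.

```python
# Async FastAPI endpoint with a read-through Redis cache (illustrative sketch).
from fastapi import FastAPI
import redis.asyncio as redis

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

@app.get("/quiz/{quiz_id}/score")
async def quiz_score(quiz_id: str) -> dict:
    cached = await cache.get(f"score:{quiz_id}")
    if cached is not None:
        return {"quiz_id": quiz_id, "score": int(cached), "cache": "hit"}
    score = 42  # placeholder for the real scoring engine
    await cache.set(f"score:{quiz_id}", score, ex=60)  # cache for 60 s
    return {"quiz_id": quiz_id, "score": score, "cache": "miss"}
```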


What makes you a great fit?


Must-Haves:

  • 2+ yrs Python back-end experience (FastAPI)
  • Strong with Docker & container orchestration
  • Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
  • SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals

Nice-to-Haves

  • k8s at scale, Terraform,
  • Experience with AI/ML inference services (LLMs, vector DBs)
  • Go / Rust for high-perf services
  • Observability: Prometheus, Grafana, OpenTelemetry

About Us: 


At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.

Meet the Founders: 


LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? 


At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.


Ignite Solutions
Posted by Eman Khan
Remote only
5 - 10 yrs
Best in industry
Python
Flask
Django
Amazon Web Services (AWS)
Windows Azure
+5 more

We are looking for a hands-on technical expert who has worked with multiple technology stacks and has experience architecting and building scalable cloud solutions with web and mobile frontends.


What will you work on?

  • Interface with clients
  • Recommend tech stacks
  • Define end-to-end logical and cloud-native architectures
  • Define APIs
  • Integrate with 3rd party systems
  • Create architectural solution prototypes
  • Hands-on coding, team lead, code reviews, and problem-solving


What Makes You A Great Fit?

  • 5+ years of software experience
  • Experience with architecture of technology systems having hands-on expertise in backend, and web or mobile frontend
  • Solid expertise and hands-on experience in Python with Flask or Django
  • Expertise on one or more cloud platforms (AWS, Azure, Google App Engine)
  • Expertise with SQL and NoSQL databases (MySQL, Mongo, ElasticSearch, Redis)
  • Knowledge of DevOps practices
  • Chatbot, Machine Learning, Data Science/Big Data experience will be a plus
  • Excellent communication skills, verbal and written

About Us

We offer CTO-as-a-service and Product Development for Startups. We value our employees and provide them an intellectually stimulating environment where everyone’s ideas and contributions are valued.
Tecblic Private Limited
Ahmedabad
4 - 5 yrs
₹8L - ₹12L / yr
Microsoft Windows Azure
SQL
Python
PySpark
ETL
+2 more

🚀 We Are Hiring: Data Engineer | 4+ Years Experience 🚀


Job description

🔍 Job Title: Data Engineer

📍 Location: Ahmedabad

🚀 Work Mode: On-Site Opportunity

📅 Experience: 4+ Years

🕒 Employment Type: Full-Time

⏱️ Availability: Immediate Joiner Preferred


Join Our Team as a Data Engineer

We are seeking a passionate and experienced Data Engineer to be a part of our dynamic and forward-thinking team in Ahmedabad. This is an exciting opportunity for someone who thrives on transforming raw data into powerful insights and building scalable, high-performance data infrastructure.

As a Data Engineer, you will work closely with data scientists, analysts, and cross-functional teams to design robust data pipelines, optimize data systems, and enable data-driven decision-making across the organization.


Your Key Responsibilities

Architect, build, and maintain scalable and reliable data pipelines from diverse data sources.

Design effective data storage, retrieval mechanisms, and data models to support analytics and business needs.

Implement data validation, transformation, and quality monitoring processes.
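
As an illustration of such a validation step, here is a minimal pandas sketch; the column rules are hypothetical.

```python
# Count basic data-quality issues in a DataFrame (illustrative rules).
import pandas as pd

def validate(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "null_customer_id": [df["customer_id"].isna().sum()],
        "negative_amount": [(df["amount"] < 0).sum()],
        "duplicate_rows": [df.duplicated().sum()],
    })

df = pd.DataFrame({"customer_id": [1, None, 3], "amount": [10.0, -5.0, 7.5]})
print(validate(df))
```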

Collaborate with cross-functional teams to deliver impactful, data-driven solutions.

Proactively identify bottlenecks and optimize existing workflows and processes.

Provide guidance and mentorship to junior engineers in the team.


Skills & Expertise We’re Looking For

3+ years of hands-on experience in Data Engineering or related roles.

Strong expertise in Python and data pipeline design.

Experience working with Big Data tools like Hadoop, Spark, Hive.

Proficiency with SQL, NoSQL databases, and data warehousing solutions.

Solid experience in cloud platforms - Azure

Familiar with distributed computing, data modeling, and performance tuning.

Understanding of DevOps, Power Automate, and Microsoft Fabric is a plus.

Strong analytical thinking, collaboration skills, Excellent Communication Skill and the ability to work independently or as part of a team.


Qualifications

Bachelor’s degree in Computer Science, Data Science, or a related field.

Coimbatore
1 - 6 yrs
₹3.4L - ₹6.5L / yr
Javascript
HTML/CSS
Python
MongoDB

We are seeking a dedicated and skilled Full Stack Web Development Trainer to deliver high-quality, hands-on training to students and professionals. The ideal candidate will be passionate about teaching and capable of training learners in both frontend and backend technologies while also contributing to live development projects.

IT Company
Agency job via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
6 - 9 yrs
₹15L - ₹25L / yr
Python
Tableau
Data Analytics
Google Cloud Platform (GCP)
PowerBI
+2 more

Job Overview: Data Analyst



  • Strong proficiency in Python programming.
  • Preferred knowledge of cloud technologies, especially in Google Cloud Platform (GCP).
  • Experience with visualization tools such as Grafana, PowerBI, and Tableau.
  • Good to have knowledge of AI/ML models.
  • Must have extensive knowledge in Python analytics, particularly in exploratory data analysis (EDA); see the sketch below.
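
For illustration, a minimal EDA pass with pandas; the file name and columns are hypothetical.

```python
# First-look EDA on a tabular dataset (illustrative sketch).
import pandas as pd

df = pd.read_csv("metrics.csv")
print(df.shape)                     # rows x columns
print(df.dtypes)                    # column types
print(df.describe())                # summary statistics for numeric columns
print(df.isna().mean())             # fraction of missing values per column
print(df.corr(numeric_only=True))   # pairwise correlations
```
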
IT Company
Agency job via Jobdost by Saida Jabbar
Pune
3 - 6 yrs
₹14L - ₹20L / yr
Data Science
Machine Learning (ML)
Python
Scikit-Learn
XGBoost
+6 more

Job Overview

  • Level 1: previous working experience as a Data Scientist, minimum 5 years
  • Level 2: previous working experience as a Data Scientist, 3 to 5 years
  • In-depth knowledge of Agile process and principles
  • Outstanding communication, presentation, and leadership skills
  • Excellent organizational and time management skills
  • Sharp analytical and problem-solving skills
  • Creative thinker with a vision
  • Flexibility / capacity of adaptation
  • Presentation skills (project reviews with customers and top management)
  • Interest in industrial & automotive topics
  • Fluent in English
  • Ability to work in international teams

  • Engineering degree with a strong background in mathematics and computer science; a PhD in a quantitative field and/or a minimum of 3 years of experience in machine learning is a plus
  • Excellent understanding of traditional machine learning techniques and algorithms, such as k-NN, SVM, Random Forests, etc.
  • Understanding of deep learning techniques
  • Understanding and, ideally, experience with Reinforcement Learning methods
  • Experience using ML/DL frameworks (Scikit-learn, XGBoost, TensorFlow, Keras, MXNet, etc.); see the sketch below
  • Proficiency in at least one programming language (preferably Python)
  • Experience with SQL and NoSQL databases
  • Excellent verbal and written skills in English are mandatory

  Appreciated extra skills:
  • Experience in signal and image processing
  • Experience in forecasting and time series modeling
  • Experience with computer vision libraries like OpenCV
  • Experience using cloud platforms
  • Experience with versioning control systems (git)
  • Interest in IoT and hardware adapted to ML tasks
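
For illustration, a minimal scikit-learn workflow with one of the listed algorithms (Random Forest), on a stock dataset rather than project data.

```python
# Train and evaluate a Random Forest classifier (illustrative sketch).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```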


IT Company
Agency job via Jobdost by Saida Jabbar
Pune
5 - 8 yrs
₹18L - ₹20L / yr
Machine Learning (ML)
Deep Learning
Computer Vision
Artificial Intelligence (AI)
Python
+9 more

Job Overview

  • Min. 5 years of experience with development in Computer Vision, Machine Learning, Deep Learning and the associated implementation of algorithms
  • Knowledge and experience in:
    - Data Science / Data Analysis techniques
    - Hands-on programming in Python, R, and MATLAB or Octave
    - Python frameworks for AI such as TensorFlow, PySpark, Theano, etc., and libraries like PyTorch, Pandas, NumPy, etc.
    - Algorithms such as Regression, SVM, Decision Trees, KNN, and Neural Networks

  Skills & Attributes:
  • Fast learner and problem solver
  • Innovative thinking
  • Excellent communication skills
  • Integrity, accountability and transparency
  • International working mindset


IT Company
Agency job via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
3 - 6 yrs
₹18L - ₹20L / yr
PySpark
Data Science
Python
NumPy
Generative AI
+8 more

Job Overview : Data scientist (AI/ML)


  • 3 to 6 years of experience in AI/ML
  • Programming languages: Python, SQL, NoSQL
  • Frameworks: Spark (PySpark), Scikit-learn, SciPy, NumPy, NLTK
  • DL frameworks: TensorFlow, PyTorch, LLMs (Transformers, DeepSeek, Llama), Hugging Face, LLM deployment and inference
  • Gen AI framework: LangChain
  • Cloud: AWS
  • Tools: Tableau, Grafana
  • Focus areas: LLMs, GenAI, OCR (optical character recognition)
  • Notice Period: Immediate to 15 days
InvestPulse
Posted by Invest Pulse
Remote only
2 - 5 yrs
₹3L - ₹6L / yr
Python
LangChain
CrewAI
React.js
PostgreSQL
+5 more

LendFlow is an AI-powered home loan assessment platform that helps mortgage brokers and lenders save hours by automating document analysis, income validation, and serviceability assessment. We turn complex financial documents into clear insights—fast.

We’re building a smart assistant that ingests client docs (bank statements, payslips, loan summaries) and uses modular AI agents to extract, classify, and summarize financial data in minutes, not hours. Think OCR + AI agents + compliance-ready outputs.


🛠️ What You’ll Be Building

As part of our early technical team, you’ll help us develop and launch our MVP. Key modules include:

  • Document ingestion and OCR processing (Textract, Document AI); see the sketch after this list
  • AI agent workflows using LangChain or CrewAI
  • Serviceability calculators with business rule engines
  • React + Next.js frontend for brokers and analysts
  • FastAPI backend with PostgreSQL
  • Security, encryption, audit logging (privacy-first design)
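
As a sketch of the OCR ingestion step, here is a minimal example with pytesseract (one of several OCR options; the posting also names Textract and Document AI); the file name is hypothetical and a local Tesseract install is assumed.

```python
# Extract raw text from a document image with pytesseract (illustrative sketch).
from PIL import Image
import pytesseract

def extract_text(path: str) -> str:
    return pytesseract.image_to_string(Image.open(path))

print(extract_text("payslip.png"))
```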


🎯 We’re Looking For:

Must-Have Skills:

  • Strong experience with Python (FastAPI, OCR, LLMs, prompt engineering)
  • Familiarity with AI agent frameworks (LangChain, CrewAI, Autogen, or similar)
  • Frontend skills in React.js / Next.js
  • Experience with PostgreSQL and cloud storage (AWS/GCP)
  • Understanding of financial documents and data privacy best practices

Bonus Points:

  • Experience with OCR tools like Amazon Textract, Tesseract, or Document AI
  • Building ML/NLP pipelines in real-world apps
  • Prior work in fintech, lending, or proptech sectors


IT Company
Agency job via Jobdost by Saida Jabbar
Pune, Bengaluru (Bangalore), Hyderabad
7 - 12 yrs
₹25L - ₹30L / yr
NodeJS (Node.js)
Amazon Web Services (AWS)
Migration
Python
AWS services
+3 more

Job Title: Senior Node.js and Python Azure Developer (AWS to Azure Migration expert)

 

Experience: 7-10 Yrs.

 

Primary Skills:

 

Node.js and Python

 

Hands-on experience with Azure, Serverless (Azure Functions)

 

AWS to Azure Cloud Migration (Preferred)

 

 Scope of Work:

 

  • Hands-on experience in migration of Node.js and Python applications from AWS to an Azure environment
  • Analyse source architecture, source code, and AWS service dependencies to identify code remediation scenarios
  • Perform code remediation/refactoring and configuration changes required to deploy the application on Azure, including Azure service dependencies and other application dependency remediations at source code level
  • 7+ years of experience in application development with Node.js and Python
  • Experience in unit testing, application testing support, and troubleshooting on Azure
  • Experience in application deployment scripts/pipelines, App Service, APIM, AKS/microservices/containerized apps, Kubernetes, Helm charts
  • Hands-on experience in developing apps for AWS and Azure (Must Have)
  • Hands-on experience with Azure services for application development (AKS, Azure Functions) and deployments
  • Understanding of Azure infrastructure services required for hosting applications on Azure PaaS or Serverless

  Tech stack details:

  • Confluent Kafka AWS S3 Sync connector
  • Azure Blob Storage
  • AWS Lambda to Azure Functions (Serverless) – Python or Node.js (see the sketch after this list)
  • Node.js REST API
  • S3 to Azure Blob Storage
  • AWS to Azure SDK conversion (Must Have)
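
To make the Lambda-to-Functions item concrete, here is a minimal sketch using the Azure Functions Python v2 programming model; the route and payload are hypothetical.

```python
# AWS Lambda handler rewritten as an Azure Function (illustrative sketch).
import json

import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

# AWS original, for comparison:
# def handler(event, context):
#     return {"statusCode": 200, "body": json.dumps({"ok": True})}

@app.route(route="orders", methods=["GET"])
def get_orders(req: func.HttpRequest) -> func.HttpResponse:
    return func.HttpResponse(
        json.dumps({"ok": True}),
        status_code=200,
        mimetype="application/json",
    )
```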

 

 

Educational qualification:

 

B.E/B.Tech/MCA

 


Coimbatore
0 - 5 yrs
₹2.5L - ₹7L / yr
Python
C++
HTML/CSS
Javascript
Big Data
+2 more

A Computer Scientist/Engineer designs, develops, tests, and integrates computer software and hardware systems. This pivotal role blends deep knowledge of computer architecture with advanced software engineering—driving innovation in platforms spanning from embedded systems and networks to AI and cybersecurity

VyTCDC
Posted by Gobinath Sundaram
Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
4 - 12 yrs
₹3.5L - ₹37L / yr
Python
AI/ML

Job Summary:

We are seeking a skilled Python Developer with a strong foundation in Artificial Intelligence and Machine Learning. You will be responsible for designing, developing, and deploying intelligent systems that leverage large datasets and cutting-edge ML algorithms to solve real-world problems.

Key Responsibilities:

  • Design and implement machine learning models using Python and libraries like TensorFlow, PyTorch, or Scikit-learn.
  • Perform data preprocessing, feature engineering, and exploratory data analysis.
  • Develop APIs and integrate ML models into production systems using frameworks like Flask or FastAPI.
  • Collaborate with data scientists, DevOps engineers, and backend teams to deliver scalable AI solutions.
  • Optimize model performance and ensure robustness in real-time environments.
  • Maintain clear documentation of code, models, and processes.

Required Skills:

  • Proficiency in Python and ML libraries (NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
  • Strong understanding of ML algorithms (classification, regression, clustering, deep learning).
  • Experience with data pipeline tools (e.g., Airflow, Spark) and cloud platforms (AWS, Azure, or GCP).
  • Familiarity with containerization (Docker, Kubernetes) and CI/CD practices.
  • Solid grasp of RESTful API development and integration.

Preferred Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Science, or related field.
  • 2–5 years of experience in Python development with a focus on AI/ML.
  • Exposure to MLOps practices and model monitoring tools.


Certa
Posted by Vibhavari Muppavaram
Remote only
3 - 5 yrs
₹10L - ₹20L / yr
JSON
Java
Javascript
RESTful APIs
Python
+2 more

About Us: Certa is an emerging leader in the fast-growing Enterprise Workflow Automation industry with advanced “no-code” SaaS workflow solutions. Our platform addresses the entire lifecycle for Suppliers/Third-parties/Partners covering onboarding, risk assessment, contract lifecycle management, and ongoing monitoring. Certa offers the most automated and "ridiculously" configurable solutions disrupting the customer/counterparty KYC & AML space. The Certa platform brings business functions like Procurement, Sales, Compliance, Legal, InfoSec, Privacy, etc., together via an easy collaborative workflow, automated risk scoring, and ongoing monitoring for key ‘shifts in circumstance’. Our data-agnostic, open-API platform ensures that Clients can take a best-in-class approach when leveraging any of our 80+ (& growing) existing data and tech partner integrations. As a result, Certa enables clients to onboard Third Parties & KYC customers faster, with less effort, with no swivel chair syndrome and maintains a constantly updated searchable knowledge repository of all records.

Certa’s clients range from the largest & leading global firms in their Industry (Retail, Aerospace, Payments, Consulting, Ridesharing, and Commercial Data) to mid-stage start-ups.


Responsibilities:

As a Solutions Engineer at our technology product company, you will play a critical role in ensuring the successful integration and customisation of our product offerings for clients. Your primary responsibilities will involve configuring our software solutions to meet our client's unique requirements and business use cases. Additionally, you will be heavily involved in API integrations to enable seamless data flow and connectivity between our products and various client systems.

  • Client Requirement Analysis: Collaborate with the sales and client-facing teams to understand client needs, business use cases, and specific requirements for implementing our technology products.
  • Product Configuration: Utilize your technical expertise to configure and customise our software solutions according to the identified client needs and business use cases. This may involve setting up workflows, defining data structures, and enabling specific features or functionalities.
  • API Integration: Work closely with the development and engineering teams to design, implement, and manage API integrations with external systems, ensuring smooth data exchange and interoperability (see the sketch after this list).
  • Solution Design: Participate in solution design discussions with clients and internal stakeholders, providing valuable insights and recommendations based on your understanding of the technology and the business domain.
  • Troubleshooting: Identify and resolve configuration-related issues and challenges that arise during the implementation and integration process, ensuring the smooth functioning of the product.
  • Documentation: Create and maintain detailed documentation of configurations, customisations, and integration processes to facilitate knowledge sharing within the organisation and with clients.
  • Quality Assurance: Conduct thorough unit testing of configurations and integrations to verify that they meet the defined requirements and perform as expected.
  • Client Support: Provide support and guidance to clients during the onboarding and post-implementation phases, assisting them with any questions or concerns related to configuration and integration.
  • Continuous Improvement: Stay up-to-date with the latest product features, industry trends, and best practices in configuration and integration, and proactively suggest improvements to enhance the overall efficiency and effectiveness of the process.
  • Cross-Functional Collaboration: Work closely with different teams, including product management, engineering, marketing, and sales, to align product development with business goals and customer needs.
  • Product Launch and Support: Assist in the product launch by providing technical support, conducting training sessions, and addressing customer inquiries. Collaborate with customer support teams to troubleshoot and resolve complex technical issues.
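
For illustration, a minimal sketch of the kind of third-party REST integration this role involves, with retries; the endpoint and token are hypothetical placeholders.

```python
# POST to a third-party REST API with automatic retries (illustrative sketch).
import requests
from requests.adapters import HTTPAdapter, Retry

session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=Retry(total=3, backoff_factor=1)))

resp = session.post(
    "https://api.example.com/v1/vendors",   # hypothetical endpoint
    headers={"Authorization": "Bearer <token>"},
    json={"name": "Acme Corp", "country": "US"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```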


Requirements :

  • 3 - 5 Years in a similar capacity with a proven track record of Implementation excellence working with Medium to large enterprise customers
  • Strong analytical skills with the ability to grasp complex business use cases and translate them into technical solutions.
  • Bachelor’s Degree Required with a preference for Engineering or equivalent.
  • Practical experience working on ERP integrations, process documentation and requirements-gathering tools like MIRO or VISIO is a plus.
  • Proficiency in API integration and understanding of RESTful APIs and web services.
  • Technical expertise in relevant programming languages and platforms related to the technology product.
  • Exceptional communication skills to interact with clients, understand their requirements, and explain technical concepts clearly and concisely.
  • Results-oriented and inherently curious mindset capable of influencing internal and external partners to drive priorities and outcomes.
  • Independent operator capable of taking limited direction and applying the best action.
  • Excellent communication, presentation, negotiation, and interpersonal skills.
  • Ability to create structure in ambiguous situations and design effective processes.
  • Experience with JSON and SaaS Products is a plus.
  • Location: Hires Remotely Everywhere
  • Job Type: Full Time
  • Experience: 3 - 5 years
  • Languages: Excellent command of the English Language


Wissen Technology
Posted by Rutuja Patil
Mumbai
4 - 10 yrs
Best in industry
Java
J2EE
Hibernate (Java)
Spring Boot
Spring MVC
+2 more

Company Name – Wissen Technology

Group of companies in India – Wissen Technology & Wissen Infotech

Role – Senior Backend Developer – Java (with Python Exposure); Work Location – Mumbai


Experience - 4 to 10 years


Kindly revert over mail if you are interested.


Java Developer – Job Description


We are seeking a Senior Backend Developer with strong expertise in Java (Spring Boot) and working knowledge of Python. In this role, Java will be your primary development language, with Python used for scripting, automation, or selected service modules. You’ll be part of a collaborative backend team building scalable and high-performance systems.


Key Responsibilities


  • Design and develop robust backend services and APIs primarily using Java (Spring Boot)
  • Contribute to Python-based components where needed for automation, scripting, or lightweight services
  • Build, integrate, and optimize RESTful APIs and microservices
  • Work with relational and NoSQL databases
  • Write unit and integration tests (JUnit, PyTest)
  • Collaborate closely with DevOps, QA, and product teams
  • Participate in architecture reviews and design discussions
  • Help maintain code quality, organization, and automation


Required Skills & Qualifications

  • 4 to 10 years of hands-on Java development experience
  • Strong experience with Spring Boot, JPA/Hibernate, and REST APIs
  • At least 1–2 years of hands-on experience with Python (e.g., for scripting, automation, or small services)
  • Familiarity with Python frameworks like Flask or FastAPI is a plus
  • Experience with SQL/NoSQL databases (e.g., PostgreSQL, MongoDB)
  • Good understanding of OOP, design patterns, and software engineering best practices
  • Familiarity with Docker, Git, and CI/CD pipelines


Hypersonix Inc
Posted by Reshika Mendiratta
Remote only
8yrs+
Upto ₹30L / yr (Varies)
Web Scraping
Python
Selenium
HTML/CSS
XPath
+2 more

About the Company

Hypersonix.ai is disrupting the e-commerce space with AI, ML, and advanced decision-making capabilities to drive real-time business insights. Built from the ground up using modern technologies, Hypersonix simplifies data consumption for customers across various industry verticals. We are seeking a well-rounded, hands-on product leader to help manage key capabilities and features in our platform.


Position Overview

We are seeking a highly skilled Web Scraping Architect to join our team. The successful candidate will be responsible for designing, implementing, and maintaining web scraping processes to gather data from various online sources efficiently and accurately. As a Web Scraping Specialist, you will play a crucial role in collecting data for competitor analysis and other business intelligence purposes.


Responsibilities

  • Scalability/Performance: Lead and provide expertise in scraping e-commerce marketplaces at scale.
  • Data Source Identification: Identify relevant websites and online sources from which data needs to be scraped. Collaborate with the team to understand data requirements and objectives.
  • Web Scraping Design: Develop and implement effective web scraping strategies to extract data from targeted websites. This includes selecting appropriate tools, libraries, or frameworks for the task.
  • Data Extraction: Create and maintain web scraping scripts or programs to extract the required data. Ensure the code is optimized, reliable, and can handle changes in the website's structure (see the sketch after this list).
  • Data Cleansing and Validation: Cleanse and validate the collected data to eliminate errors, inconsistencies, and duplicates. Ensure data integrity and accuracy throughout the process.
  • Monitoring and Maintenance: Continuously monitor and maintain the web scraping processes. Address any issues that arise due to website changes, data format modifications, or anti-scraping mechanisms.
  • Scalability and Performance: Optimize web scraping procedures for efficiency and scalability, especially when dealing with a large volume of data or multiple data sources.
  • Compliance and Legal Considerations: Stay up-to-date with legal and ethical considerations related to web scraping, including website terms of service, copyright, and privacy regulations.
  • Documentation: Maintain detailed documentation of web scraping processes, data sources, and methodologies. Create clear and concise instructions for others to follow.
  • Collaboration: Collaborate with other teams such as data analysts, developers, and business stakeholders to understand data requirements and deliver insights effectively.
  • Security: Implement security measures to ensure the confidentiality and protection of sensitive data throughout the scraping process.
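
As a small illustration of the extraction step, here is a minimal requests + BeautifulSoup sketch with a rotating User-Agent header; the URL and selector are hypothetical, and any real scraper must respect the target site's terms of service and robots.txt.

```python
# Fetch a page with a rotated User-Agent and extract elements (illustrative sketch).
import random

import requests
from bs4 import BeautifulSoup

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]

resp = requests.get(
    "https://example.com/products",          # hypothetical URL
    headers={"User-Agent": random.choice(USER_AGENTS)},
    timeout=10,
)
soup = BeautifulSoup(resp.text, "html.parser")
prices = [tag.get_text(strip=True) for tag in soup.select(".price")]
print(prices)
```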


Requirements

  • Proven experience of 7+ years as a Web Scraping Specialist or similar role, with a track record of successful web scraping projects
  • Expertise in handling dynamic content, user-agent rotation, bypassing CAPTCHAs, rate limits, and use of proxy services
  • Knowledge of browser fingerprinting
  • Has leadership experience
  • Proficiency in programming languages commonly used for web scraping, such as Python, BeautifulSoup, Scrapy, or Selenium
  • Strong knowledge of HTML, CSS, XPath, and other web technologies relevant to web scraping and coding
  • Knowledge and experience in best-of-class data storage and retrieval for large volumes of scraped data
  • Understanding of web scraping best practices, including handling dynamic content, user-agent rotation, and IP address management
  • Attention to detail and ability to handle and process large volumes of data accurately
  • Familiarity with data cleansing techniques and data validation processes
  • Good communication skills and ability to collaborate effectively with cross-functional teams
  • Knowledge of web scraping ethics, legal considerations, and compliance with website terms of service
  • Strong problem-solving skills and adaptability to changing web environments


Preferred Qualifications

  • Bachelor’s degree in Computer Science, Data Science, Information Technology, or related fields
  • Experience with cloud-based solutions and distributed web scraping systems
  • Familiarity with APIs and data extraction from non-public sources
  • Knowledge of machine learning techniques for data extraction and natural language processing is desired but not mandatory
  • Prior experience in handling large-scale data projects and working with big data frameworks
  • Understanding of various data formats such as JSON, XML, CSV, etc.
  • Experience with version control systems like Git
DEMAND MEDIA BPM LLP
Posted by Darshana Mate
Pune
1 - 5 yrs
₹2L - ₹6L / yr
SQL
PowerBI
Python

Job Purpose

Responsible for managing end-to-end database operations, ensuring data accuracy, integrity, and security across systems. The position plays a key role in driving data reliability, availability, and compliance with operational standards.


Key Responsibilities:

  • Collate audit reports from the QA team and structure data in accordance with Standard Operating Procedures (SOP).
  • Perform data transformation and validation for accuracy and consistency.
  • Upload processed datasets into SQL Server using SSIS packages.
  • Monitor and optimize database performance, identifying and resolving bottlenecks.
  • Perform regular backups, restorations, and recovery checks to ensure data continuity.
  • Manage user access and implement robust database security policies.
  • Oversee database storage allocation and utilization.
  • Conduct routine maintenance and support incident management, including root cause analysis and resolution.
  • Design and implement scalable database solutions and architecture.
  • Create and maintain stored procedures, views, and other database components.
  • Optimize SQL queries for performance and scalability.
  • Execute ETL processes and support seamless integration of multiple data sources.
  • Maintain data integrity and quality through validation and cleansing routines.
  • Collaborate with cross-functional teams on data solutions and project deliverables.

 

Educational Qualification: Any Graduate

Required Skills & Qualifications:

  • Proven experience with SQL Server or similar relational database platforms.
  • Strong expertise in SSIS, ETL processes, and data warehousing.
  • Proficiency in SQL/T-SQL, including scripting, performance tuning, and query optimization.
  • Experience in database security, user role management, and access control.
  • Familiarity with backup/recovery strategies and database maintenance best practices.
  • Strong analytical skills with experience working with large and complex datasets.
  • Solid understanding of data modeling, normalization, and schema design.
  • Knowledge of incident and change management processes.
  • Excellent communication and collaboration skills.
  • Experience with Python for data manipulation and automation is a strong plus.


kanhasoft
Posted by Shreya Mehta
Ahmedabad
5 - 12 yrs
₹5L - ₹12L / yr
Python
Django
Flask

Job Description:

  • 5+ years of experience needed
  • Python, Django framework
  • AJAX, jQuery, web services knowledge
  • AngularJS would be an added advantage but is not compulsory
  • Good knowledge of MySQL, MongoDB, or PostgreSQL
  • Experience working with AngularJS, scraping scripts, etc. is an added advantage
  • Good English communication skills

Hypersonix Inc
Posted by Reshika Mendiratta
Remote only
7yrs+
Upto ₹40L / yr (Varies)
SQL
Python
ETL
Data engineering
Big Data
+2 more

About the Company

Hypersonix.ai is disrupting the e-commerce space with AI, ML and advanced decision capabilities to drive real-time business insights. Hypersonix.ai has been built ground up with new age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on product leader to help lead product management of key capabilities and features.


About the Role

We are looking for talented and driven Data Engineers at various levels to work with customers to build the data warehouse, analytical dashboards and ML capabilities as per customer needs.


Roles and Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements; should write complex queries in an optimized way
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Run ad-hoc analysis utilizing the data pipeline to provide actionable insights
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Work with analytics and data scientist team members and assist them in building and optimizing our product into an innovative industry leader


Requirements

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • A successful history of manipulating, processing and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • We are looking for a candidate with 7+ years of experience in a Data Engineer role, who holds a graduate degree in Computer Science or Information Technology, or has completed an MCA.
Zenius IT Services Pvt Ltd
Posted by Sunita Pradhan
Hyderabad
10 - 20 yrs
₹25L - ₹35L / yr
.NET
C#
React.js
TypeScript
HTML/CSS
+18 more

About the Role:

We are seeking a Technical Architect with proven expertise in full-stack web development, cloud infrastructure, and system design. You will lead the design and delivery of scalable enterprise applications, drive technical decision-making, and mentor a cross-functional development team. The ideal candidate has a strong foundation in .NET-based architecture, modern front-end frameworks, and cloud-native technologies.


Key Responsibilities:

  • Lead the technical architecture, system design, and full-stack development of enterprise-grade web applications.
  • Design and develop robust backend systems and APIs using .NET Core / C# / Python, following TDD/BDD principles.
  • Build modern frontends using React.js, TypeScript, and optionally Angular, ensuring responsive and accessible UI.
  • Architect scalable, secure, and highly available solutions using cloud platforms such as Azure, AWS, or GCP.
  • Guide and review CI/CD pipeline creation and DevOps practices, leveraging tools like Azure DevOps, Git, Docker, etc.
  • Oversee database design and optimization for relational and NoSQL systems like MSSQL, PostgreSQL, MongoDB, CosmosDB.
  • Mentor developers and collaborate with cross-functional teams including Product Owners, QA, and DevOps.
  • Ensure best practices in code quality, security, performance, and compliance.
  • Lead application monitoring, error tracking, and infrastructure tuning for production-grade deployments.

Required Skills:
  • 10+ years of experience in software development, with 3+ years in architectural or technical leadership roles.
  • Strong expertise in .NET Core, C#, React.js, TypeScript, HTML5, CSS3, and JavaScript.
  • Good exposure to Python for backend services or data pipelines.
  • Cloud platform experience in at least one or more: Azure, AWS, or Google Cloud Platform (GCP).
  • Proficient in designing and consuming RESTful APIs, and working with metadata-driven and microservices architecture.
  • Strong understanding of DevOps, CI/CD, and deployment strategies using tools like Git, Docker, Azure DevOps.
  • Familiarity with frontend frameworks like Angular or Vue.js is a plus.
  • Proficient with databases: MSSQL, PostgreSQL, MySQL, MongoDB, CosmosDB.
  • Comfortable working on Linux/UNIX and Windows-based servers, along with web servers like Nginx, Apache, IIS.

Good to Have:
  • Experience in CRM, ERP, or E-commerce platforms.
  • Familiarity with AI/ML integration and working with data science teams.
  • Exposure to mobile development using React Native.
  • Experience integrating third-party tools like Slack, Microsoft Teams, etc.

Soft Skills:
  • Strong problem-solving mindset with a proactive and innovative approach.
  • Excellent communication and leadership abilities.
  • Capability to mentor junior engineers and drive a high-performance team culture.
  • Adaptability to work in fast-paced, Agile environments.


Educational Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related technical discipline.
  • Microsoft / Cloud certifications are a plus.
Pace Wisdom Solutions
Bengaluru (Bangalore)
2 - 5 yrs
₹5L - ₹12L / yr
Odoo (OpenERP)
Python
Javascript
HTML/CSS

Location: Bengaluru/Mangaluru 

Experience required: 2-5 years 

Key skills:  Odoo Development, Python, Frontend Technologies 

Designation: SE L1/L2/L3/ ATL 

 

Job Summary:  

We are seeking a skilled and proactive Odoo Developer to join our dynamic team. The ideal candidate will have hands-on experience in customizing, developing, and maintaining Odoo modules, with a deep understanding of Python and business processes. You will play a key role in requirement gathering, technical design, development, testing, and deployment.  


Key Responsibilities:  

  • Develop, customize, and maintain Odoo modules as per business requirements (see the sketch after this list).
  • Analyze, design, and develop new modules and features in Odoo ERP.  
  • Troubleshoot, debug, and upgrade existing Odoo modules.  
  • Integrate Odoo with third-party platforms using APIs/web services.  
  • Provide technical support and training to end-users.  
  • Collaborate with functional consultants and stakeholders to gather requirements and deliver scalable ERP solutions.  
  • Write clean, reusable, and efficient Python code and maintain technical documentation.  
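
For illustration, a minimal sketch of a custom Odoo module's model; the model name and fields are hypothetical, and the usual module scaffolding (__manifest__.py, security rules) is assumed.

```python
# A small custom Odoo model (illustrative sketch).
from odoo import fields, models

class LibraryBook(models.Model):
    _name = "library.book"
    _description = "Library Book"

    name = fields.Char(string="Title", required=True)
    isbn = fields.Char(string="ISBN")
    available = fields.Boolean(default=True)
```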


Required Skills & Qualifications:  

  • 2-5 years of proven experience as an Odoo Developer.  
  • Strong knowledge of Python, PostgreSQL, and Odoo framework (ORM, QWeb, XML).  
  • Experience in Odoo custom module development and Odoo standard modules   
  • Good understanding of Odoo backend and frontend (JavaScript, HTML, CSS).  
  • Experience with Odoo APIs and web services (REST/SOAP).  
  • Familiarity with Linux environments, Git version control.  
  • Ability to work independently and in a team with minimal supervision.  
  • Good analytical and problem-solving skills.  
  • Strong verbal and written communication skills.
  • Knowledge of Odoo deployment (Linux, Docker, Nginx, Odoo.sh) is a plus.

 

About the Company:   


Pace Wisdom Solutions is a deep-tech Product engineering and consulting firm. We have offices in San Francisco, Bengaluru, and Singapore. We specialize in designing and developing bespoke software solutions that cater to solving niche business problems.  


We engage with our clients at various stages:  


  • Right from the idea stage to scope out business requirements.  
  • Design & architect the right solution and define tangible milestones.  
  • Setup dedicated and on-demand tech teams for agile delivery.  
  • Take accountability for successful deployments to ensure efficient go-to-market implementations.
QAgile Services
Posted by Radhika Chotai
Pune, Hyderabad
3 - 7 yrs
₹11L - ₹15L / yr
Machine Learning (ML)
Python
Grafana
AWS CloudFormation
Terraform
+4 more

We are seeking a highly skilled and motivated MLOps Engineer with 3-5 years of experience to join our engineering team. The ideal candidate should possess a strong foundation in DevOps or software engineering principles with practical exposure to machine learning operational workflows. You will be instrumental in operationalizing ML systems, optimizing the deployment lifecycle, and strengthening the integration between data science and engineering teams.

Required Skills:

• Hands-on experience with MLOps platforms such as MLflow and Kubeflow (see the sketch below).

• Proficiency in Infrastructure as Code (IaC) tools like Terraform or Ansible.

• Strong familiarity with monitoring and alerting frameworks (Prometheus, Grafana, Datadog, AWS CloudWatch).

• Solid understanding of microservices architecture, service discovery, and load balancing.

• Excellent programming skills in Python, with experience in writing modular, testable, and maintainable code.

• Proficient in Docker and container-based application deployments.

• Experience with CI/CD tools such as Jenkins or GitLab CI.

• Basic working knowledge of Kubernetes for container orchestration.

• Practical experience with cloud-based ML platforms such as AWS SageMaker, Databricks, or Google Vertex AI.
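
For illustration, a minimal MLflow tracking sketch; the experiment name and values are hypothetical.

```python
# Log parameters and a metric to MLflow (illustrative sketch).
import mlflow

mlflow.set_experiment("churn-baseline")

with mlflow.start_run():
    mlflow.log_param("n_estimators", 200)
    mlflow.log_param("max_depth", 8)
    mlflow.log_metric("auc", 0.91)
    # mlflow.sklearn.log_model(model, "model")  # with a fitted estimator
```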



Good-to-Have Skills:

• Awareness of security practices specific to ML pipelines, including secure model endpoints and data protection.

• Experience with scripting languages like Bash or PowerShell for automation tasks.

• Exposure to database scripting and data integration pipelines.

Experience & Qualifications:

• 3-5+ years of experience in MLOps, Site Reliability Engineering (SRE), or Software Engineering roles.

• At least 2+ years of hands-on experience working on ML/AI systems in production settings.
