
50+ pandas Jobs in India

Apply to 50+ pandas Jobs on CutShort.io. Find your next job, effortlessly. Browse pandas Jobs and apply today!

Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Indore
0 - 2 yrs
₹6L - ₹12L / yr
Python
Machine Learning (ML)
pandas
NumPy
Blockchain
+1 more

About Us

Alfred Capital is a next-generation on-chain proprietary quantitative trading technology provider, pioneering fully autonomous algorithmic systems that reshape trading and capital allocation in decentralized finance.


As a sister company of Deqode — a 400+ person blockchain innovation powerhouse — we operate at the cutting edge of quant research, distributed infrastructure, and high-frequency execution.


What We Build

  • Alpha Discovery via On‑Chain Intelligence — Developing trading signals using blockchain data, CEX/DEX markets, and protocol mechanics.
  • DeFi-Native Execution Agents — Automated systems that execute trades across decentralized platforms.
  • ML-Augmented Infrastructure — Machine learning pipelines for real-time prediction, execution heuristics, and anomaly detection.
  • High-Throughput Systems — Resilient, low-latency engines that operate 24/7 across EVM and non-EVM chains tuned for high-frequency trading (HFT) and real-time response
  • Data-Driven MEV Analysis & Strategy — We analyze mempools, order flow, and validator behaviors to identify and capture MEV opportunities ethically—powering strategies that interact deeply with the mechanics of block production and inclusion.


Evaluation Process

  • HR Discussion – A brief conversation to understand your motivation and alignment with the role.
  • Initial Technical Interview – A quick round focused on fundamentals and problem-solving approach.
  • Take-Home Assignment – Assesses research ability, learning agility, and structured thinking.
  • Assignment Presentation – Deep-dive into your solution, design choices, and technical reasoning.
  • Final Interview – A concluding round to explore your background, interests, and team fit in depth.
  • Optional Interview – In specific cases, an additional round may be scheduled to clarify certain aspects or conduct further assessment before making a final decision.


Job Description: Blockchain Data & ML Engineer


As a Blockchain Data & ML Engineer, you’ll work on ingesting and modelling on-chain behaviour, building scalable data pipelines, and designing systems that support intelligent, autonomous market interaction.


What You’ll Work On

  • Build and maintain ETL pipelines for ingesting and processing blockchain data (see the sketch after this list).
  • Assist in designing, training, and validating machine learning models for prediction and anomaly detection.
  • Evaluate model performance, tune hyperparameters, and document experimental results.
  • Develop monitoring tools to track model accuracy, data drift, and system health.
  • Collaborate with infrastructure and execution teams to integrate ML components into production systems.
  • Design and maintain databases and storage systems to efficiently manage large-scale datasets.
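
For illustration only (not part of the posting): a minimal pandas sketch of the kind of ETL/feature step described in the first bullet above. The schema of decoded token-transfer records, the column names, and the values are all made up for the example.

```python
# Illustrative sketch only: aggregate hypothetical decoded transfer records
# into per-block features with pandas. The schema below is assumed, not real.
import pandas as pd

transfers = pd.DataFrame([
    {"block": 101, "token": "USDC", "amount": 250.0, "sender": "0xabc", "receiver": "0xdef"},
    {"block": 101, "token": "USDC", "amount": 75.5,  "sender": "0xdef", "receiver": "0x123"},
    {"block": 102, "token": "WETH", "amount": 1.2,   "sender": "0x123", "receiver": "0xabc"},
])

# Per-block, per-token features a downstream model might consume.
features = (
    transfers.groupby(["block", "token"])
    .agg(tx_count=("amount", "size"),
         total_volume=("amount", "sum"),
         unique_senders=("sender", "nunique"))
    .reset_index()
)
print(features)
```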


Ideal Traits

  • Strong in data structures, algorithms, and core CS fundamentals.
  • Proficiency in any programming language.
  • Familiarity with backend systems, APIs, and database design, along with a basic understanding of machine learning and blockchain fundamentals.
  • Curiosity about how blockchain systems and crypto markets work under the hood.
  • Self-motivated, eager to experiment and learn in a dynamic environment.


Bonus Points For

  • Hands-on experience with pandas, numpy, scikit-learn, or PyTorch.
  • Side projects involving automated ML workflows, ETL pipelines, or crypto protocols.
  • Participation in hackathons or open-source contributions.


What You’ll Gain

  • Cutting-Edge Tech Stack: You'll work on modern infrastructure and stay up to date with the latest trends in technology.
  • Idea-Driven Culture: We welcome and encourage fresh ideas. Your input is valued, and you're empowered to make an impact from day one.
  • Ownership & Autonomy: You’ll have end-to-end ownership of projects. We trust our team and give them the freedom to make meaningful decisions.
  • Impact-Focused: Your work won’t be buried under bureaucracy. You’ll see it go live and make a difference in days, not quarters


What We Value:

  • Craftsmanship over shortcuts: We appreciate engineers who take the time to understand the problem deeply and build durable solutions—not just quick fixes.
  • Depth over haste: If you're the kind of person who enjoys going one level deeper to really "get" how something works, you'll thrive here.
  • Invested mindset: We're looking for people who don't just punch tickets, but care about the long-term success of the systems they build.
  • Curiosity with follow-through: We admire those who take the time to explore and validate new ideas, not just skim the surface.

Compensation:

  • INR 6 - 12 LPA
  • Performance Bonuses: Linked to contribution, delivery, and impact.



Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Mumbai, Hyderabad, Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹6L - ₹25L / yr
Python
Django
NumPy
Flask
pandas
+1 more

Python Developer Job Description

A Python Developer is responsible for designing, developing, and deploying software applications using the Python programming language. Here's a brief overview:


Key Responsibilities

- Software Development: Develop high-quality software applications using Python.

- Problem-Solving: Solve complex problems using Python programming language.

- Code Maintenance: Maintain and update existing codebases to ensure they remain efficient and scalable.

- Collaboration: Collaborate with cross-functional teams to identify and prioritize project requirements.

- Testing and Debugging: Write unit tests and debug applications to ensure high-quality code.


Technical Skills

- Python: Strong understanding of Python programming language and its ecosystem.

- Programming Fundamentals: Knowledge of programming fundamentals, including data structures, algorithms, and object-oriented programming.

- Frameworks and Libraries: Familiarity with popular Python frameworks and libraries, such as Django, Flask, or Pandas.

- Database Management: Understanding of database management systems, including relational databases and NoSQL databases.

- Version Control: Knowledge of version control systems, including Git.


Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Indore
0 - 2 yrs
₹6L - ₹12L / yr
Blockchain
ETL
Artificial Intelligence (AI)
Generative AI
Python
+3 more

About Us

Alfred Capital is a next-generation on-chain proprietary quantitative trading technology provider, pioneering fully autonomous algorithmic systems that reshape trading and capital allocation in decentralized finance.


As a sister company of Deqode — a 400+ person blockchain innovation powerhouse — we operate at the cutting edge of quant research, distributed infrastructure, and high-frequency execution.


What We Build

  • Alpha Discovery via On‑Chain Intelligence — Developing trading signals using blockchain data, CEX/DEX markets, and protocol mechanics.
  • DeFi-Native Execution Agents — Automated systems that execute trades across decentralized platforms.
  • ML-Augmented Infrastructure — Machine learning pipelines for real-time prediction, execution heuristics, and anomaly detection.
  • High-Throughput Systems — Resilient, low-latency engines that operate 24/7 across EVM and non-EVM chains tuned for high-frequency trading (HFT) and real-time response
  • Data-Driven MEV Analysis & Strategy — We analyze mempools, order flow, and validator behaviors to identify and capture MEV opportunities ethically—powering strategies that interact deeply with the mechanics of block production and inclusion.


Evaluation Process

  • HR Discussion – A brief conversation to understand your motivation and alignment with the role.
  • Initial Technical Interview – A quick round focused on fundamentals and problem-solving approach.
  • Take-Home Assignment – Assesses research ability, learning agility, and structured thinking.
  • Assignment Presentation – Deep-dive into your solution, design choices, and technical reasoning.
  • Final Interview – A concluding round to explore your background, interests, and team fit in depth.
  • Optional Interview – In specific cases, an additional round may be scheduled to clarify certain aspects or conduct further assessment before making a final decision.


Blockchain Data & ML Engineer


As a Blockchain Data & ML Engineer, you’ll work on ingesting and modeling on-chain behavior, building scalable data pipelines, and designing systems that support intelligent, autonomous market interaction.


What You’ll Work On

  • Build and maintain ETL pipelines for ingesting and processing blockchain data.
  • Assist in designing, training, and validating machine learning models for prediction and anomaly detection.
  • Evaluate model performance, tune hyperparameters, and document experimental results.
  • Develop monitoring tools to track model accuracy, data drift, and system health.
  • Collaborate with infrastructure and execution teams to integrate ML components into production systems.
  • Design and maintain databases and storage systems to efficiently manage large-scale datasets.


Ideal Traits

  • Strong in data structures, algorithms, and core CS fundamentals.
  • Proficiency in any programming language
  • Curiosity about how blockchain systems and crypto markets work under the hood.
  • Self-motivated, eager to experiment and learn in a dynamic environment.


Bonus Points For

  • Hands-on experience with pandas, numpy, scikit-learn, or PyTorch.
  • Side projects involving automated ML workflows, ETL pipelines, or crypto protocols.
  • Participation in hackathons or open-source contributions.


What You’ll Gain

  • Cutting-Edge Tech Stack: You'll work on modern infrastructure and stay up to date with the latest trends in technology.
  • Idea-Driven Culture: We welcome and encourage fresh ideas. Your input is valued, and you're empowered to make an impact from day one.
  • Ownership & Autonomy: You’ll have end-to-end ownership of projects. We trust our team and give them the freedom to make meaningful decisions.
  • Impact-Focused: Your work won’t be buried under bureaucracy. You’ll see it go live and make a difference in days, not quarters


What We Value:

  • Craftsmanship over shortcuts: We appreciate engineers who take the time to understand the problem deeply and build durable solutions—not just quick fixes.
  • Depth over haste: If you're the kind of person who enjoys going one level deeper to really "get" how something works, you'll thrive here.
  • Invested mindset: We're looking for people who don't just punch tickets, but care about the long-term success of the systems they build.
  • Curiosity with follow-through: We admire those who take the time to explore and validate new ideas, not just skim the surface.


Compensation:

  • INR 6 - 12 LPA
  • Performance Bonuses: Linked to contribution, delivery, and impact.
Read more
TalentLo

at TalentLo

2 candid answers
Satyansh A
Posted by Satyansh A
Remote only
0 - 2 yrs
₹1L - ₹1L / yr
NumPy
pandas
Python
Scikit-Learn

Required Skills:

• Basic understanding of machine learning concepts and algorithms

• Proficiency in Python and relevant libraries (NumPy, Pandas, scikit-learn)

• Familiarity with data preprocessing techniques

• Knowledge of basic statistical concepts

• Understanding of model evaluation metrics (see the sketch after this list)

• Basic experience with at least one deep learning framework (TensorFlow, PyTorch)

• Strong analytical and problem-solving abilities
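
For illustration only: a minimal sketch of the basics listed above (preprocessing, a simple model, and common evaluation metrics) on a synthetic toy dataset; the feature names and data are invented for the example.

```python
# Illustrative sketch: pandas + scikit-learn preprocessing, training, and evaluation.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = (df["x1"] + 0.5 * df["x2"] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df[["x1", "x2"]], df["y"], test_size=0.25, random_state=0)

# Scale features, then fit a simple classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred), "f1:", f1_score(y_test, pred))
```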

 

 

Application Process: Create your profile on our platform, submit your portfolio, GitHub profile, or sample projects.

https://www.talentlo.com/

Read more
HaystackAnalytics
Careers Hr
Posted by Careers Hr
Navi Mumbai
1 - 4 yrs
₹6L - ₹12L / yr
Rust
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Data Science
+2 more

Position – Python Developer

Location – Navi Mumbai


Who are we

Based out of IIT Bombay, HaystackAnalytics is a HealthTech company creating clinical genomics products, which enable diagnostic labs and hospitals to offer accurate and personalized diagnostics. Supported by India's most respected science agencies (DST, BIRAC, DBT), we created and launched a portfolio of products to offer genomics in infectious diseases. Our genomics-based diagnostic solution for Tuberculosis was recognized as one of the top innovations supported by BIRAC in the past 10 years, and was launched by the Prime Minister of India in the BIRAC Showcase event in Delhi, 2022.


Objectives of this Role:

  • Design and implement efficient, scalable backend services using Python.
  • Work closely with healthcare domain experts to create innovative and accurate diagnostics solutions.
  • Build APIs, services, and scripts to support data processing pipelines and front-end applications.
  • Automate recurring tasks and ensure robust integration with cloud services.
  • Maintain high standards of software quality and performance using clean coding principles and testing practices.
  • Collaborate within the team to upskill and unblock each other for faster and better outcomes.





Primary Skills – Python Development

  • Proficient in Python 3 and its ecosystem
  • Frameworks: Flask / Django / FastAPI
  • RESTful API development
  • Understanding of OOPs and SOLID design principles
  • Asynchronous programming (asyncio, aiohttp); a minimal sketch follows this list
  • Experience with task queues (Celery, RQ)
  • Rust programming experience for systems-level or performance-critical components
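
A minimal sketch of the asynchronous style mentioned above, using only the standard library: several I/O-bound calls run concurrently with asyncio. The `fetch_report` function is a stand-in for a real network or database call (aiohttp, Celery, etc. are omitted).

```python
# Illustrative sketch: run several I/O-bound tasks concurrently with asyncio.
import asyncio

async def fetch_report(sample_id: str) -> str:
    await asyncio.sleep(0.1)          # stand-in for a network or DB call
    return f"report for {sample_id}"

async def main() -> None:
    results = await asyncio.gather(*(fetch_report(s) for s in ["s1", "s2", "s3"]))
    print(results)

asyncio.run(main())
```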

Testing & Automation

  • Unit Testing: PyTest / unittest
  • Automation tools: Ansible / Terraform (good to have)
  • CI/CD pipelines

DevOps & Cloud

  • Docker, Kubernetes (basic knowledge expected)
  • Cloud platforms: AWS / Azure / GCP
  • GIT and GitOps workflows
  • Familiarity with containerized deployment & serverless architecture

Bonus Skills

  • Data handling libraries: Pandas / NumPy
  • Experience with scripting: Bash / PowerShell
  • Functional programming concepts
  • Familiarity with front-end integration (REST API usage, JSON handling)

 Other Skills

  • Innovation and thought leadership
  • Interest in learning new tools, languages, workflows
  • Strong communication and collaboration skills
  • Basic understanding of UI/UX principles


To know more about us: https://haystackanalytics.in




Read more
IT Company

IT Company

Agency job
via Jobdost by Saida Jabbar
Pune
5 - 8 yrs
₹18L - ₹20L / yr
Machine Learning (ML)
Deep Learning
Computer Vision
Artificial Intelligence (AI)
Python
+9 more

Job Overview

  • Min. 5 years of experience with development in Computer Vision, Machine Learning, Deep Learning and the associated implementation of algorithms
  • Knowledge and experience in:
    - Data Science/Data Analysis techniques
    - Hands-on experience of programming in Python, R, and MATLAB or Octave
    - Python frameworks for AI such as TensorFlow, PySpark, Theano, etc., and libraries like PyTorch, Pandas, NumPy, etc.
    - Algorithms such as Regression, SVM, Decision Tree, KNN, and Neural Networks
  • Skills & Attributes:
    - Fast learner and problem solving
    - Innovative thinking
    - Excellent communication skills
    - Integrity, accountability and transparency
    - International working mindset


Read more
Intellikart Ventures LLP
ramandeep intellikart
Posted by ramandeep intellikart
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹25L / yr
LangChain
LangGraph
Linux kernel
LLMs
Prompt engineering
+3 more

Job Summary:

We are hiring a Data Scientist – Gen AI with hands-on experience in developing Agentic AI applications using frameworks like LangChain, LangGraph, Semantic Kernel, or Microsoft Copilot. The ideal candidate will be proficient in Python, LLMs, and prompt engineering techniques such as RAG and Chain-of-Thought prompting.


Key Responsibilities:

  • Build and deploy Agent AI applications using LLM frameworks.
  • Apply advanced prompt engineering (Zero-Shot, Few-Shot, CoT); a minimal sketch follows this list.
  • Integrate Retrieval-Augmented Generation (RAG).
  • Develop scalable solutions in Python using NumPy, Pandas, TensorFlow/PyTorch.
  • Collaborate with teams to deliver business-aligned Gen AI solutions.
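
For illustration only: a framework-agnostic sketch of composing a few-shot, chain-of-thought style prompt that also folds in retrieved context in the RAG manner described above. The example question, reasoning text, and template are invented; the call to an LLM or to LangChain/LangGraph is deliberately omitted.

```python
# Illustrative sketch: build a few-shot CoT prompt with retrieved context (no LLM call).
FEW_SHOT_EXAMPLES = [
    {"question": "A shipment of 12 boxes has 3 damaged. How many are usable?",
     "reasoning": "12 total minus 3 damaged leaves 9.",
     "answer": "9"},
]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Combine retrieved context (RAG) with few-shot chain-of-thought examples."""
    examples = "\n\n".join(
        f"Q: {ex['question']}\nReasoning: {ex['reasoning']}\nA: {ex['answer']}"
        for ex in FEW_SHOT_EXAMPLES
    )
    context = "\n".join(f"- {c}" for c in context_chunks)
    return (f"Use the context to answer. Think step by step.\n"
            f"Context:\n{context}\n\n{examples}\n\nQ: {question}\nReasoning:")

print(build_prompt("How many usable units remain?", ["Retrieved doc snippet 1"]))
```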


Must-Have Skills:

  • Experience with LangChain, LangGraph, or similar (priority given).
  • Strong understanding of LLMs, RAG, and prompt engineering.
  • Proficiency in Python and relevant ML libraries.


Nice-to-Have:

  • Wrapper API development for LLMs.
  • REST API integration within Agentic workflows.


Qualifications:

  • Bachelor’s/Master’s in CS, Data Science, AI, or related.
  • 4–7 years in AI/ML/Data Science, with 1–2 years in Gen AI/LLMs.
Read more
GoQuest Media Ventures Pvt Ltd
Mumbai
1 - 5 yrs
₹8L - ₹10L / yr
MERN Stack
Fullstack Developer
Python
Mobile App Development
Web Development
+7 more

ROLES AND RESPONSIBILITIES


As a Full Stack Developer at GoQuest Media, you will play a key role in building and maintaining web applications that deliver seamless user experiences for our global clients. From brainstorming features with the team to executing back-end logic, you will be involved in every aspect of our application development process.

You will be working with modern technologies like NodeJS, ReactJS, NextJS, and Tailwind CSS to create performant, scalable applications. Your role will span both front-end and back-end development as you build efficient and dynamic solutions to meet the company’s and users’ needs.


What will you be accountable for?

● End-to-End Development:

● Design and develop highly scalable and interactive web applications from scratch.

● Take ownership of both front-end (ReactJS, NextJS, Tailwind CSS) and back-end (NodeJS) development processes.

● Feature Implementation:

● Work closely with designers and product managers to translate ideas into highly interactive and responsive interfaces.

● Maintenance and Debugging:

● Ensure applications are optimized for performance, scalability, and reliability.

● Perform regular maintenance, debugging, and testing of existing apps to ensure they remain in top shape.

● Collaboration:

● Collaborate with cross-functional teams, including designers, product managers, and stakeholders, to deliver seamless and robust applications.

● Innovation:

● Stay updated with the latest trends and technologies to suggest and implement improvements in the development process.


Tech Stack

● Front-end: ReactJS, NextJS, Tailwind CSS

● Back-end: NodeJS, ExpressJS

● Database: MongoDB (preferred), MySQL

● Version Control: Git

● Tools: Webpack, Docker (optional but a plus)


Preferred Location

This role is based out of our Andheri Office, Mumbai.


Growth Opportunities for You

● Lead exciting web application projects end-to-end and own key product initiatives.

● Develop cutting-edge apps used by leading media clients around the globe.

● Gain experience working in a high-growth company in the media and tech industry.

● Potential to grow into a team lead role.


Who Should Apply?

● Individuals with a passion for coding and web technologies.

● Minimum 3-5 years of experience in full-stack development using NodeJS, ReactJS, NextJS, and Tailwind CSS.

● Strong understanding of both front-end and back-end development and the ability to write efficient, reusable, and scalable code.

● Familiarity with databases like MongoDB and MySQL.

● Experience with CI/CD pipelines and cloud infrastructure (AWS, Google Cloud) is a plus.

● Team players with excellent communication skills and the ability to work in a fast-paced environment.


Who Should Not Apply?

● If you're not comfortable with both front-end and back-end development.

● If you don’t enjoy problem-solving or tackling complex development challenges.

● If working in a dynamic, evolving environment doesn’t appeal to you.

Read more
ChicMic Studios
Akanksha Mittal
Posted by Akanksha Mittal
Mohali
2 - 5 yrs
₹7L - ₹17L / yr
Artificial Intelligence (AI)
pandas
Natural Language Processing (NLP)
NumPy
Machine Learning (ML)
+3 more

Job Description: AI/ML Specialist


We are looking for a highly skilled and experienced AI/ML Specialist to join our dynamic team. The ideal candidate will have a robust background in developing web applications using Django and Flask, with expertise in deploying and managing applications on AWS. Proficiency in Django Rest Framework (DRF), a solid understanding of machine learning concepts, and hands-on experience with tools like PyTorch, TensorFlow, and transformer architectures are essential.


Key Responsibilities


● Develop and maintain web applications using Django and Flask frameworks.

● Design and implement RESTful APIs using Django Rest Framework (DRF).

● Deploy, manage, and optimize applications on AWS services, including EC2, S3, RDS, Lambda, and CloudFormation.

● Build and integrate APIs for AI/ML models into existing systems.

● Create scalable machine learning models using frameworks like PyTorch, TensorFlow, and scikit-learn.

● Implement transformer architectures (e.g., BERT, GPT) for NLP and other advanced AI use cases (see the sketch after this list).

● Optimize machine learning models through advanced techniques such as hyperparameter tuning, pruning, and quantization.

● Deploy and manage machine learning models in production environments using tools like TensorFlow Serving, TorchServe, and AWS SageMaker.

● Ensure the scalability, performance, and reliability of applications and deployed models.

● Collaborate with cross-functional teams to analyze requirements and deliver effective technical solutions.

● Write clean, maintainable, and efficient code following best practices.

● Conduct code reviews and provide constructive feedback to peers.

● Stay up-to-date with the latest industry trends and technologies, particularly in AI/ML.
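
For illustration only: a minimal sketch of serving a pre-trained transformer behind a small helper function, in the spirit of the responsibilities above. The use of the Hugging Face `transformers` pipeline API, the default model choice, and the wrapper function are assumptions, not this team's actual stack.

```python
# Illustrative sketch: wrap a pre-trained transformer in a simple scoring helper.
from transformers import pipeline

# Downloads a small default sentiment model on first run.
classifier = pipeline("sentiment-analysis")

def score_review(text: str) -> dict:
    """Return label and confidence for a single piece of text."""
    return classifier(text)[0]

print(score_review("The checkout flow is fast and reliable."))
```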


Required Skills and Qualifications


● Bachelor’s degree in Computer Science, Engineering, or a related field.

● 3+ years of professional experience as an AI/ML Specialist.


● Proficient in Python with a strong understanding of its ecosystem.

● Extensive experience with Django and Flask frameworks.

● Hands-on experience with AWS services for application deployment and management.

● Strong knowledge of Django Rest Framework (DRF) for building APIs.

● Expertise in machine learning frameworks such as PyTorch, TensorFlow, and scikit-learn.

● Experience with transformer architectures for NLP and advanced AI solutions.

● Solid understanding of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).

● Familiarity with MLOps practices for managing the machine learning lifecycle.

● Basic knowledge of front-end technologies (e.g., JavaScript, HTML, CSS) is a plus.

● Excellent problem-solving skills and the ability to work independently and as part of a team.

● Strong communication skills and the ability to articulate complex technical concepts to non-technical stakeholders.

Read more
Techmero

at Techmero

1 recruiter
Shweta Parmar
Posted by Shweta Parmar
Vadodara
1 - 3 yrs
₹2.4L - ₹3L / yr
Python
Odoo (OpenERP)
pandas
NumPy
Linux administration
+7 more

Job Overview:

We are looking for a skilled and motivated Jr. Programmer Analyst with 2 years of hands-on experience in Python development and a strong understanding of software development principles. The ideal candidate should have experience with Odoo ORM, PostgreSQL, and API integration. If you have a passion for writing clean, optimized code and are excited about working in a product-based environment, we would love to meet you.


Key Responsibilities:

Develop, test, and maintain applications using Python (Pandas, NumPy, psycopg2).

Implement multi-threading and multi-processing where required (see the sketch after this list).

Work on Odoo ORM, customizing and optimizing the application architecture.

Integrate third-party APIs and ensure smooth data flow between systems.

Optimize code for performance and scalability.

Collaborate with cross-functional teams using Agile methodologies.

Write efficient SQL queries and manage PostgreSQL databases.

Utilize Git for version control and contribute to CI/CD processes.

Work in a Linux environment for software development and deployment.

Support the team in product development from concept to deployment.
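
For illustration only: a minimal sketch of the multi-threaded API-integration pattern mentioned above, fanning out I/O-bound calls with a thread pool and collecting the results into pandas. The `fetch_order` function and its fields are invented stand-ins for a real third-party API call.

```python
# Illustrative sketch: thread pool for I/O-bound "API" calls, results into a DataFrame.
import time
from concurrent.futures import ThreadPoolExecutor

import pandas as pd

def fetch_order(order_id: int) -> dict:
    time.sleep(0.1)                      # simulate network latency of an API call
    return {"order_id": order_id, "status": "ok"}

with ThreadPoolExecutor(max_workers=8) as pool:
    rows = list(pool.map(fetch_order, range(20)))

df = pd.DataFrame(rows)
print(df.head())
```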


Technical Requirements (Must Have):

Strong proficiency in Python 3, especially:

Pandas, NumPy, multi-threading, multi-processing, psycopg2, API integration

Code optimization techniques.

Experience with Odoo ORM and understanding of its architecture

Experience in FastAPI / Flask.

Proficiency in PostgreSQL and writing complex SQL queries

Familiarity with Git, HTML, CSS, and JavaScript.

Comfortable working on Linux OS.

Experience with Agile software development methodology.

Exposure to product development lifecycle.


Good to Have:

Basic knowledge of Docker.

Advanced proficiency with Linux.

Understanding of stock and crypto markets, especially candlestick patterns.


Perks & Benefits:

Opportunity to work in a fast-growing product environment.

Collaborative and supportive team culture.

Learning and development opportunities.


If you are passionate about technology and want to grow in a dynamic product-based company, we encourage you to apply!

Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Indore
0 - 2 yrs
₹6L - ₹12L / yr
Python
pandas
Blockchain
GenAI
Generative AI


About Us

Alfred Capital is a next-generation on-chain proprietary quantitative trading technology provider, pioneering fully autonomous algorithmic systems that reshape trading and capital allocation in decentralized finance.


As a sister company of Deqode — a 400+ person blockchain innovation powerhouse — we operate at the cutting edge of quant research, distributed infrastructure, and high-frequency execution.


What We Build

  • Alpha Discovery via On‑Chain Intelligence — Developing trading signals using blockchain data, CEX/DEX markets, and protocol mechanics.
  • DeFi-Native Execution Agents — Automated systems that execute trades across decentralized platforms.
  • ML-Augmented Infrastructure — Machine learning pipelines for real-time prediction, execution heuristics, and anomaly detection.
  • High-Throughput Systems — Resilient, low-latency engines that operate 24/7 across EVM and non-EVM chains tuned for high-frequency trading (HFT) and real-time response
  • Data-Driven MEV Analysis & Strategy — We analyze mempools, order flow, and validator behaviors to identify and capture MEV opportunities ethically—powering strategies that interact deeply with the mechanics of block production and inclusion.


Evaluation Process

  • HR Discussion – A brief conversation to understand your motivation and alignment with the role.
  • Initial Technical Interview – A quick round focused on fundamentals and problem-solving approach.
  • Take-Home Assignment – Assesses research ability, learning agility, and structured thinking.
  • Assignment Presentation – Deep-dive into your solution, design choices, and technical reasoning.
  • Final Interview – A concluding round to explore your background, interests, and team fit in depth.
  • Optional Interview – In specific cases, an additional round may be scheduled to clarify certain aspects or conduct further assessment before making a final decision.


Quantitative R&D Engineer


As a Quantitative R&D Engineer, you’ll explore data and design logic that becomes live trading strategies. You’ll bridge the gap between raw research and deployed, autonomous capital systems.


What You’ll Work On

  • Analyze on-chain and market data to identify inefficiencies and behavioral patterns.
  • Develop and prototype systematic trading strategies using statistical and ML-based techniques.
  • Contribute to signal research, backtesting infrastructure, and strategy evaluation frameworks.
  • Monitor and interpret DeFi protocol mechanics (AMMs, perps, lending markets) for alpha generation.
  • Collaborate with engineers to turn research into production-grade, automated trading systems.


Ideal Traits

  • Strong in data structures, algorithms, and core CS fundamentals.
  • Proficiency in any programming language
  • Understanding of probability, statistics, or ML concepts.
  • Self-driven and comfortable with ambiguity, iteration, and fast learning cycles.
  • Strong interest in markets, trading, or algorithmic systems.


Bonus Points For

  • Experience with backtesting or feature engineering.
  • Exposure to crypto primitives (AMMs, perps, mempools, etc.)
  • Projects involving alpha signals, strategy testing, or DeFi bots.
  • Participation in quant contests, hackathons, or open-source work.


What You’ll Gain:

  • Cutting-Edge Tech Stack: You'll work on modern infrastructure and stay up to date with the latest trends in technology.
  • Idea-Driven Culture: We welcome and encourage fresh ideas. Your input is valued, and you're empowered to make an impact from day one.
  • Ownership & Autonomy: You’ll have end-to-end ownership of projects. We trust our team and give them the freedom to make meaningful decisions.
  • Impact-Focused: Your work won’t be buried under bureaucracy. You’ll see it go live and make a difference in days, not quarters


What We Value:

  • Craftsmanship over shortcuts: We appreciate engineers who take the time to understand the problem deeply and build durable solutions—not just quick fixes.
  • Depth over haste: If you're the kind of person who enjoys going one level deeper to really "get" how something works, you'll thrive here.
  • Invested mindset: We're looking for people who don't just punch tickets, but care about the long-term success of the systems they build.
  • Curiosity with follow-through: We admire those who take the time to explore and validate new ideas, not just skim the surface.


Compensation:

  • INR 6 - 12 LPA
  • Performance Bonuses: Linked to contribution, delivery, and impact.
Read more
Hyderabad
1 - 3 yrs
₹4L - ₹6L / yr
Route optimization
Forecasting
LP
MILP
CP
+5 more

Job Title: Optimization Scientist – Route & Inventory Optimization (OR & RL)

Location: Hyderabad (On-site)

Experience Required: 1+ year


Company Overview:

We are a leading AI-driven supply chain solutions company focused on transforming retail, FMCG, and logistics through cutting-edge technologies in machine learning, operations research, and reinforcement learning. Our mission is to build intelligent systems that enhance decision-making and automate processes across forecasting, inventory, and transportation.

Internship Overview:

We are seeking a passionate and motivated AI/ML Intern to support the development of intelligent optimization systems for route planning and inventory allocation. You will work alongside experienced scientists and engineers, gaining hands-on experience in applying machine learning, reinforcement learning, and operations research to real-world logistics challenges.

Key Responsibilities:

🔹 Assist in Route Optimization Projects:

  • Support in modeling and solving simplified versions of Vehicle Routing Problems (VRP) under guidance.
  • Work with Python libraries like Pyomo or OR-Tools to prototype optimization solutions (see the sketch after this list).
  • Explore reinforcement learning methods (e.g., DQN, PPO) for dynamic routing decisions under uncertainty.
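
For illustration only: a toy single-vehicle routing (TSP) instance solved with OR-Tools, as a simplified starting point for the VRP work described above. The 4x4 distance matrix is made up for the example.

```python
# Illustrative sketch: minimal OR-Tools routing model for a 4-node, 1-vehicle problem.
from ortools.constraint_solver import pywrapcp, routing_enums_pb2

distance_matrix = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]
manager = pywrapcp.RoutingIndexManager(len(distance_matrix), 1, 0)  # nodes, vehicles, depot
routing = pywrapcp.RoutingModel(manager)

def distance_cb(from_index, to_index):
    return distance_matrix[manager.IndexToNode(from_index)][manager.IndexToNode(to_index)]

transit = routing.RegisterTransitCallback(distance_cb)
routing.SetArcCostEvaluatorOfAllVehicles(transit)

params = pywrapcp.DefaultRoutingSearchParameters()
params.first_solution_strategy = routing_enums_pb2.FirstSolutionStrategy.PATH_CHEAPEST_ARC

solution = routing.SolveWithParameters(params)
if solution:
    index, route = routing.Start(0), []
    while not routing.IsEnd(index):
        route.append(manager.IndexToNode(index))
        index = solution.Value(routing.NextVar(index))
    print(route + [manager.IndexToNode(index)])   # visiting order, back to depot
```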

🔹 Support Inventory Optimization Efforts:

  • Learn to model multi-echelon inventory systems using basic OR and simulation techniques.
  • Analyze historical data to understand stock levels, service times, and demand variability.
  • Help design experiments to evaluate replenishment strategies and stocking policies.

🔹 Contribute to AI-Driven Decision Systems:

  • Assist in integrating ML forecasting models with optimization pipelines.
  • Participate in the development or testing of simulation environments for training RL agents.
  • Collaborate with the team to evaluate model performance using historical or synthetic datasets.

Required Qualifications:

  • Currently pursuing or recently completed a degree in Computer Science, Data Science, Operations Research, Industrial Engineering, or related field.
  • Good understanding of Python and key libraries (NumPy, Pandas, Matplotlib, Scikit-learn).
  • Familiarity with basic optimization concepts (LP/MILP) and libraries like OR-Tools or Gurobi (student license).
  • Basic knowledge of reinforcement learning frameworks (OpenAI Gym, Stable-Baselines3) is a plus.
  • Strong problem-solving skills and willingness to learn advanced AI/OR techniques.

What You’ll Gain:

  • Hands-on exposure to real-world AI and optimization use cases in logistics and supply chain.
  • Mentorship from experienced scientists in OR, ML, and RL.
  • Experience working in a fast-paced, applied research environment.
  • Opportunity to convert to a full-time role based on performance and business needs.

About WINIT:

WINIT is a pioneer in mobile Sales Force Automation (mSFA) with over 25 years of experience. We serve more than 600 global enterprises, helping them enhance efficiency, streamline logistics, and leverage AI/ML to optimize sales operations. With a commitment to innovation and global support, WINIT continues to lead digital transformation in sales.

Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore)
4 - 8 yrs
₹7L - ₹24L / yr
Python
NumPy
pandas
Machine Learning (ML)

· Develop and maintain scalable back-end applications using Python frameworks such as Flask/Django/FastAPI.

· Design, build, and optimize data pipelines for ETL processes using tools like PySpark, Airflow, and other similar technologies.

· Work with relational and NoSQL databases to manage and process large datasets efficiently.

· Collaborate with data scientists to clean, transform, and prepare data for analytics and machine learning models.

· Work in a dynamic environment at the intersection of software development and data engineering.

Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by susmitha o
Bengaluru (Bangalore), Pune, Kolkata
4 - 6 yrs
₹7L - ₹24L / yr
Python
Amazon Web Services (AWS)
NumPy
pandas

Key Technical Skillsets-

  • Design, develop, and maintain scalable applications using AWS services, Python, and Boto3.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement best practices for cloud architecture and application development.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot and resolve issues in development, test, and production environments.
  • Write clean, maintainable, and efficient code.
  • Participate in code reviews and contribute to team knowledge sharing.


Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
AnyWhere India
5 - 10 yrs
₹8L - ₹30L / yr
Python
Flask
NumPy
pandas
SQL

Requirement Summary

• 4 to 10 years of experience as a Python Developer

• Good communication skills

• High energy and self-motivated professional with the ability to make things happen and adhere to strict deadlines

Mandatory Technical Skills

· Should have strong software development experience, not necessarily entirely in Python. Candidates whose experience is mostly in technologies like C# or Java are acceptable, provided they have a minimum of 3 years of experience as a Python developer.

· Resources should have exposure to core Python, design principles, OOP concepts (classes, methods, decorators), data structure concepts and the use of relevant packages (NumPy, pandas, etc.), testing frameworks such as PyTest (TDD/BDD), cloud concepts (CI/CD), database use (mainly SQL), application security awareness, code quality checks, etc. A brief illustrative sketch follows below.
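
For illustration only: a small sketch tying together three of the fundamentals listed above (a decorator, a pandas helper, and a PyTest-style unit test) in one file. The helper name and data are invented; run with `pytest`.

```python
# Illustrative sketch: decorator + pandas helper + a pytest unit test in one module.
import functools

import pandas as pd

def log_calls(func):
    """Decorator: count how often a helper is called."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@log_calls
def daily_totals(df: pd.DataFrame) -> pd.Series:
    return df.groupby("day")["amount"].sum()

def test_daily_totals():
    df = pd.DataFrame({"day": ["mon", "mon", "tue"], "amount": [1.0, 2.0, 3.0]})
    totals = daily_totals(df)
    assert totals["mon"] == 3.0 and totals["tue"] == 3.0
    assert daily_totals.calls == 1
```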

Responsibility of / Expectations from the Role

·        Develop efficient, reusable code for model implementation

·        Participate in Agile ceremonies

·        Present analysis outcomes and performance to the business and support ad hoc analysis as required

Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
Python
Django
PostgreSQL
Payment gateways
Redis
+16 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon Sector - 48

Working Days : 6 Days WFO (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Main Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, Microservice Architecture, Third-party API integrations (e.g., payment gateways, SMS/email APIs), REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework
  • Develop backend components, server-side logic, and statistical models
  • Design and implement high-availability, low-latency applications with robust data protection and security
  • Contribute to the development of highly responsive web applications
  • Collaborate with cross-functional teams on system design and integration

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka.
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs).
  • Familiarity with MongoDB and other NoSQL databases.
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn.
  • Knowledge in building and integrating statistical learning models.
Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
Python
Django
PostgreSQL
MySQL
SQL
+17 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon

Working Days : 6 Days (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework.
  • Develop backend components, server-side logic, and statistical models.
  • Design and implement high-availability, low-latency applications with robust data protection and security.
  • Contribute to the development of highly responsive web applications.
  • Collaborate with cross-functional teams on system design and integration.

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
  • Familiarity with MongoDB and other NoSQL databases
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
  • Knowledge in building and integrating statistical learning models.
Read more
Poshmark

at Poshmark

3 candid answers
1 recruiter
Eman Khan
Posted by Eman Khan
Chennai
5 - 10 yrs
₹25L - ₹60L / yr
Machine Learning (ML)
Python
Scikit-Learn
NumPy
pandas
+9 more

Are you passionate about the power of data and excited to leverage cutting-edge AI/ML to drive business impact? At Poshmark, we tackle complex challenges in personalization, trust & safety, marketing optimization, product experience, and more.


Why Poshmark?

As a leader in Social Commerce, Poshmark offers an unparalleled opportunity to work with extensive multi-platform social and commerce data. With over 130 million users generating billions of daily events and petabytes of rapidly growing data, you’ll be at the forefront of data science innovation. If building impactful, data-driven AI solutions for millions excites you, this is your place.


What You’ll Do

  • Drive end-to-end data science initiatives, from ideation to deployment, delivering measurable business impact through projects such as feed personalization, product recommendation systems, and attribute extraction using computer vision (a simplified recommendation sketch follows this list).
  • Collaborate with cross-functional teams, including ML engineers, product managers, and business stakeholders, to design and deploy high-impact models.
  • Develop scalable solutions for key areas like product, marketing, operations, and community functions.
  • Own the entire ML Development lifecycle: data exploration, model development, deployment, and performance optimization.
  • Apply best practices for managing and maintaining machine learning models in production environments.
  • Explore and experiment with emerging AI trends, technologies, and methodologies to keep Poshmark at the cutting edge.
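
For illustration only, and not Poshmark's actual system: the simplest form of the recommendation work described above, computing item-item cosine similarity from a tiny user-item interaction matrix with NumPy. The matrix values are made up.

```python
# Illustrative sketch: item-item cosine similarity from a toy interaction matrix.
import numpy as np

# rows = users, columns = items; 1 = interacted. Values are invented.
interactions = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
], dtype=float)

item_vectors = interactions.T                          # one row per item
norms = np.linalg.norm(item_vectors, axis=1, keepdims=True)
similarity = (item_vectors @ item_vectors.T) / (norms @ norms.T + 1e-9)

target_item = 0
ranked = np.argsort(-similarity[target_item])
print("items most similar to item 0:", [int(i) for i in ranked if i != target_item])
```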


Your Experience & Skills

  • Ideal Experience: 6-9 years of building scalable data science solutions in a big data environment. Experience with personalization algorithms, recommendation systems, or user behavior modeling is a big plus.
  • Machine Learning Knowledge: Hands-on experience with key ML algorithms, including CNNs, Transformers, and Vision Transformers. Familiarity with Large Language Models (LLMs) and techniques like RAG or PEFT is a bonus.
  • Technical Expertise: Proficiency in Python, SQL, and Spark (Scala or PySpark), with hands-on experience in deep learning frameworks like PyTorch or TensorFlow. Familiarity with ML engineering tools like Flask, Docker, and MLOps practices.
  • Mathematical Foundations: Solid grasp of linear algebra, statistics, probability, calculus, and A/B testing concepts.
  • Collaboration & Communication: Strong problem-solving skills and ability to communicate complex technical ideas to diverse audiences, including executives and engineers.
Read more
Gaian Solutions India

at Gaian Solutions India

1 video
2 recruiters
Agency job
via AccioJob by AccioJobHiring Board
Hyderabad
0 - 0 yrs
₹4.5L - ₹6L / yr
Python
SQL
Machine Learning (ML)
pandas
TensorFlow
+1 more

AccioJob is conducting an offline hiring drive with Gaian Solutions India for the position of AI/ML Intern.


Required Skills - Python, SQL, ML libraries (scikit-learn, pandas, TensorFlow, etc.)


Apply Here - https://go.acciojob.com/tUxTdV


Eligibility -

  • Degree: B.Tech/BE/BCA/MCA/M.Tech
  • Graduation Year: 2023, 2024, and 2025
  • Branch: All Branches
  • Work Location: Hyderabad


Compensation -

  • Internship stipend: 20-25k
  • Internship duration: 3 months
  • CTC: 4.5-6 LPA


Evaluation Process -


  • Assessment at the AccioJob Skill Centre in Pune
  • 2 Technical Interviews


Apply Here - https://go.acciojob.com/tUxTdV


Important: Please bring your laptop & earphones for the test.

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Microsoft Windows Azure
SQL Azure
+7 more

🚀 Job Title : Python AI/ML Engineer

💼 Experience : 3+ Years

📍 Location : Gurgaon (Work from Office, 5 Days/Week)

📅 Notice Period : Immediate


Summary :

We are looking for a Python AI/ML Engineer with strong experience in developing and deploying machine learning models on Microsoft Azure.


🔧 Responsibilities :

  • Build and deploy ML models using Azure ML.
  • Develop scalable Python applications with cloud-first design.
  • Create data pipelines using Azure Data Factory, Blob Storage & Databricks.
  • Optimize performance, fix bugs, and ensure system reliability.
  • Collaborate with cross-functional teams to deliver intelligent features.

✅ Requirements :

  • 3+ Years of software development experience.
  • Strong Python skills; experience with scikit-learn, pandas, NumPy.
  • Solid knowledge of SQL and relational databases.
  • Hands-on with Azure ML, Data Factory, Blob Storage.
  • Familiarity with Git, REST APIs, Docker.
Read more
VoltusWave Technologies India Private Limited
Hyderabad
1 - 4 yrs
₹2L - ₹5L / yr
Python
Scikit-Learn
TensorFlow
PyTorch
Keras
+7 more

Job Title: AI & ML Developer

Experience: 1+ Years

Location: Hyderabad

Company: VoltusWave Technologies India Private Limited


Job Summary:

We are looking for a passionate and skilled AI & Machine Learning Developer with over 1 year of experience to join our growing team. You will be responsible for developing, implementing, and maintaining ML models and AI-driven applications that solve real-world business problems.


Key Responsibilities:

  • Design, build, and deploy machine learning models and AI solutions.
  • Work with large datasets to extract meaningful insights and develop algorithms.
  • Preprocess, clean, and transform raw data for training and evaluation.
  • Collaborate with data scientists, software developers, and product teams to integrate models into applications.
  • Monitor and maintain the performance of deployed models.
  • Stay updated with the latest developments in AI, ML, and data science.

Required Skills:

  • Strong understanding of machine learning algorithms and principles.
  • Experience with Python and ML libraries such as scikit-learn, TensorFlow, PyTorch, Keras, etc.
  • Familiarity with data processing tools like Pandas, NumPy, etc.
  • Basic knowledge of deep learning and neural networks.
  • Experience with data visualization tools (e.g., Matplotlib, Seaborn, Plotly).
  • Knowledge of model evaluation and optimization techniques.
  • Familiarity with version control (Git), Jupyter Notebooks, and cloud environments (AWS, GCP, or Azure) is a plus.

Educational Qualification:

  • Bachelor's or Master’s degree in Computer Science, Data Science, AI/ML, or a related field.

Nice to Have:

  • Exposure to NLP, Computer Vision, or Time Series Analysis.
  • Experience with ML Ops or deployment pipelines.
  • Understanding of REST APIs and integration of ML models with web apps.

Why Join Us:

  • Work on real-time AI & ML projects.
  • Opportunity to learn and grow in a fast-paced, innovative environment.
  • Friendly and collaborative team culture.
  • Career development support and training.


Read more
ChicMic Studios
Akanksha Mittal
Posted by Akanksha Mittal
Mohali
2 - 6 yrs
₹8L - ₹21L / yr
Natural Language Processing (NLP)
Named-entity recognition
Data Science
Scikit-Learn
pandas
+1 more

We are seeking a Data Scientist with strong expertise in data analysis, machine learning, and visualization. The ideal candidate should be proficient in Python, Pandas, and Matplotlib, with experience in building and optimizing data-driven models. Some experience in Natural Language Processing (NLP) and Named Entity Recognition (NER) models would be a plus.


Analyze and process large datasets using Python and Pandas.

Develop and optimize machine learning models for predictive analytics.

Create data visualizations using Matplotlib and Seaborn to support decision-making (see the sketch below).

Perform data cleaning, feature engineering, and statistical analysis.

Work with structured and unstructured data to extract meaningful insights.

Implement and fine-tune NER models for specific use cases (if required).

Collaborate with cross-functional teams to drive data-driven solutions.
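
For illustration only: a minimal sketch of the analysis-and-visualization loop described above, on a synthetic dataset. The column names, cleaning step, and plot choice are invented for the example.

```python
# Illustrative sketch: pandas cleaning/feature engineering plus a quick seaborn plot.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "segment": rng.choice(["A", "B", "C"], size=300),
    "spend": rng.gamma(2.0, 50.0, size=300),
})

# Basic cleaning / feature engineering, then a visual summary by segment.
df = df.dropna()
df["log_spend"] = np.log1p(df["spend"])

sns.boxplot(data=df, x="segment", y="log_spend")
plt.title("Log spend by segment (synthetic data)")
plt.tight_layout()
plt.show()
```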


Required Skills & Qualifications:

Strong proficiency in Python and data science libraries (Pandas, NumPy, Scikit-learn, etc.).

Experience in data analysis, statistical modeling, and machine learning.

Hands-on expertise in data visualization using Matplotlib and Seaborn.

Understanding of SQL and database querying.

Familiarity with NLP techniques and NER models is a plus.

Strong problem-solving and analytical skills.

Read more
Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Bengaluru (Bangalore), Pune, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Navi Mumbai, Kolkata, Rajasthan
5 - 24 yrs
₹9L - ₹70L / yr
C
C++
Visual C++
Embedded C++
Artificial Intelligence (AI)
+32 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.


Mon-Fri role, in office, with excellent perks and benefits!


Position Overview

We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.


Key Responsibilities:

1. System Architecture & Design

● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.

● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.

● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.


2. Perception & AI Integration

● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.

● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.

● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.


3. Embedded & Real-Time Systems

● Design high-performance embedded software stacks for real-time robotic control and autonomy.

● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.

● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.


4. Robotics Simulation & Digital Twins

● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.

● Leverage synthetic data generation (Omniverse Replicator) for training AI models.

● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.


5. Navigation & Motion Planning

● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.

● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.

● Implement reinforcement learning-based policies using Isaac Gym.


6. Performance Optimization & Scalability

● Ensure low-latency AI inference and real-time execution of robotics applications.

● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.

● Develop benchmarking and profiling tools to measure software performance on edge AI devices.


Required Qualifications:

● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.

● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.

● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.

● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.

● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.

● Strong background in robotic perception, planning, and real-time control.

● Experience with cloud-edge AI deployment and scalable architectures.


Preferred Qualifications

● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym

● Knowledge of robot kinematics, control systems, and reinforcement learning

● Expertise in distributed computing, containerization (Docker), and cloud robotics

● Familiarity with automotive, industrial automation, or warehouse robotics

● Experience designing architectures for autonomous systems or multi-robot systems.

● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics

● Experience with microservices or service-oriented architecture (SOA)

● Knowledge of machine learning and AI integration within robotic systems

● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)

Read more
Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai, Navi Mumbai
5 - 40 yrs
₹8.5L - ₹75L / yr
Microservices
Architecture
API
NoSQL Databases
MongoDB
+33 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.


Mon-Fri, in-office role with excellent perks and benefits!


Key Responsibilities:

1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.

2. Build and implement scalable and robust microservices and integrate API gateways.

3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).

4. Implement real-time data pipelines using Kafka.

5. Collaborate with front-end developers to ensure seamless integration of backend services.

6. Write clean, reusable, and efficient code following best practices, including design patterns.

7. Troubleshoot, debug, and enhance existing systems for improved performance.


Mandatory Skills:

1. Proficiency in at least one backend technology: Node.js, Python, or Java.


2. Strong experience in:

i. Microservices architecture,

ii. API gateways,

iii. NoSQL databases (e.g., MongoDB, DynamoDB),

iv. Kafka

v. Data structures (e.g., arrays, linked lists, trees).


3. Frameworks:

i. If Java : Spring framework for backend development.

ii. If Python: FastAPI/Django frameworks for AI applications (see the sketch after this list).

iii. If Node: Express.js for Node.js development.
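
For illustration only: a minimal Python microservice endpoint in the FastAPI style named above. The resource name, fields, and in-memory store are invented for the example; a real service would add persistence, validation of missing IDs, and an API gateway in front.

```python
# Illustrative sketch: a tiny FastAPI service with one POST and one GET endpoint.
# Run with: uvicorn service:app --reload   (assuming this file is saved as service.py)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="device-service")

class Device(BaseModel):
    device_id: str
    model: str

_DEVICES: dict[str, Device] = {}   # in-memory store, for the sketch only

@app.post("/devices")
def register_device(device: Device) -> Device:
    _DEVICES[device.device_id] = device
    return device

@app.get("/devices/{device_id}")
def get_device(device_id: str) -> Device:
    return _DEVICES[device_id]
```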


Good to Have Skills:

1. Experience with Kubernetes for container orchestration.

2. Familiarity with in-memory databases like Redis or Memcached.

3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.

Read more
Tecblic Private LImited
Priya Khatri
Posted by Priya Khatri
Ahmedabad
4 - 5 yrs
₹11L - ₹15L / yr
Large Language Models (LLM)
Natural Language Processing (NLP)
Artificial Intelligence (AI)
Deep Learning
Machine Learning (ML)
+8 more

Job Description: Machine Learning Engineer – LLM and Agentic AI

Location: Ahmedabad

Experience: 4+ years

Employment Type: Full-Time

________________________________________

About Us

Join a forward-thinking team at Tecblic, where innovation meets cutting-edge technology. We specialize in delivering AI-driven solutions that empower businesses to thrive in the digital age. If you're passionate about LLMs, machine learning, and pushing the boundaries of Agentic AI, we’d love to have you on board.

________________________________________

Key Responsibilities

• Research and Development: Research, design, and fine-tune machine learning models, with a focus on Large Language Models (LLMs) and Agentic AI systems.

• Model Optimization: Fine-tune and optimize pre-trained LLMs for domain-specific use cases, ensuring scalability and performance.

• Integration: Collaborate with software engineers and product teams to integrate AI models into customer-facing applications and platforms.

• Data Engineering: Perform data preprocessing, pipeline creation, feature engineering, and exploratory data analysis (EDA) to prepare datasets for training and evaluation.

• Production Deployment: Design and implement robust model deployment pipelines, including monitoring and managing model performance in production.

• Experimentation: Prototype innovative solutions leveraging cutting-edge techniques like reinforcement learning, few-shot learning, and generative AI.

• Technical Mentorship: Mentor junior team members on best practices in machine learning and software engineering.
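
As a rough illustration of the model-optimization and integration work described above, the sketch below loads a small pre-trained Hugging Face model and runs it over a couple of texts; the checkpoint name and the example inputs are placeholders, and a real engagement would swap in a domain-fine-tuned LLM.

```python
# Minimal sketch, assuming the Hugging Face transformers stack; the model name
# and sample texts are illustrative, not a prescribed setup.
from transformers import pipeline

# Load a small pre-trained model; in practice this would be replaced by a
# domain-fine-tuned checkpoint served behind the production API.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

tickets = [
    "The invoice total does not match the purchase order.",
    "Great onboarding experience, everything worked first try.",
]
for text, pred in zip(tickets, classifier(tickets)):
    print(f"{pred['label']:>8}  {pred['score']:.3f}  {text}")
```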

________________________________________

Requirements

Core Technical Skills:

• Proficiency in Python for machine learning and data science tasks.

• Expertise in ML frameworks and libraries like PyTorch, TensorFlow, Hugging Face, Scikit-learn, or similar.

• Solid understanding of Large Language Models (LLMs) such as GPT, T5, BERT, or Bloom, including fine-tuning techniques.

• Experience working on NLP tasks such as text classification, entity recognition, summarization, or question answering.

• Knowledge of deep learning architectures, such as transformers, RNNs, and CNNs.

• Strong skills in data manipulation using tools like Pandas, NumPy, and SQL.

• Familiarity with cloud services like AWS, GCP, or Azure, and experience deploying ML models using tools like Docker, Kubernetes, or serverless functions.

Additional Skills (Good to Have):

• Exposure to Agentic AI (e.g., autonomous agents, decision-making systems) and practical implementation.

• Understanding of MLOps tools (e.g., MLflow, Kubeflow) to streamline workflows and ensure production reliability.

• Experience with generative AI models (GANs, VAEs) and reinforcement learning techniques.

• Hands-on experience in prompt engineering and few-shot/fine-tuned approaches for LLMs.

• Familiarity with vector databases like Pinecone, Weaviate, or FAISS for efficient model retrieval.

• Version control (Git) and familiarity with collaborative development practices.

General Skills:

• Strong analytical and mathematical background, including proficiency in linear algebra, statistics, and probability.

• Solid understanding of algorithms and data structures to solve complex ML problems.

• Ability to handle and process large datasets using distributed frameworks like Apache Spark or Dask (optional but useful).

________________________________________

Soft Skills:

• Excellent problem-solving and critical-thinking abilities.

• Strong communication and collaboration skills to work with cross-functional teams.

• Self-motivated, with a continuous learning mindset to keep up with emerging technologies.

NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Noida
4 - 8 yrs
₹2L - ₹10L / yr
skill iconMachine Learning (ML)
skill iconData Science
Azure OpenAI
skill iconPython
pandas
+11 more

Job Title : Sr. Data Scientist

Experience : 5+ Years

Location : Noida (Hybrid – 3 Days in Office)

Shift Timing : 2 PM to 11 PM

Availability : Immediate


Job Description :

We are seeking a Senior Data Scientist to develop and implement machine learning models, predictive analytics, and data-driven solutions.

The role involves data analysis, dashboard development (Looker Studio), NLP, Generative AI (LLMs, Prompt Engineering), and statistical modeling.

Strong expertise in Python (Pandas, NumPy), Cloud Data Science (AWS SageMaker, Azure OpenAI), Agile (Jira, Confluence), and stakeholder collaboration is essential.


Mandatory skills : Machine Learning, Cloud Data Science (AWS SageMaker, Azure OpenAI), Python (Pandas, NumPy), Data Visualization (Looker Studio), NLP & Generative AI (LLMs, Prompt Engineering), Statistical Modeling, Agile (Jira, Confluence), and strong stakeholder communication.

Wissen Technology

at Wissen Technology

4 recruiters
Seema Srivastava
Posted by Seema Srivastava
Mumbai, Bengaluru (Bangalore)
5 - 14 yrs
Best in industry
skill iconPython
skill iconAmazon Web Services (AWS)
SQL
pandas
Amazon Redshift

Job Description: 

Please find below details:


Experience - 5+ Years

Location - Bangalore/Mumbai


Role Overview

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

  • Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
  • Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
  • Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
  • Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
  • Ensure data quality and consistency by implementing validation and governance practices.
  • Work on data security best practices in compliance with organizational policies and regulations.
  • Automate repetitive data engineering tasks using Python scripts and frameworks.
  • Leverage CI/CD pipelines for deployment of data workflows on AWS.
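
For flavour, here is a minimal sketch of one such ETL step using boto3 and pandas; the bucket names, keys and columns are illustrative placeholders rather than a prescribed design.

```python
# Minimal sketch of one pipeline step, assuming boto3 + pandas (and pyarrow for
# Parquet). Buckets, keys and the "region"/"amount" columns are placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: read a raw CSV landed in S3.
obj = s3.get_object(Bucket="raw-zone-example", Key="sales/2024/05/sales.csv")
df = pd.read_csv(obj["Body"])

# Transform: basic cleaning and a daily aggregate.
df["order_date"] = pd.to_datetime(df["order_date"]).dt.date
daily = (df.dropna(subset=["amount"])
           .groupby(["order_date", "region"], as_index=False)["amount"].sum())

# Load: write a compact Parquet file back to the curated zone.
buf = io.BytesIO()
daily.to_parquet(buf, index=False)
s3.put_object(Bucket="curated-zone-example",
              Key="sales/daily_sales.parquet",
              Body=buf.getvalue())
```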

 

Required Skills and Qualifications

  • Professional Experience: 5+ years of experience in data engineering or a related field.
  • Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
  • AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
  • AWS Glue for ETL/ELT.
  • S3 for storage.
  • Redshift or Athena for data warehousing and querying.
  • Lambda for serverless compute.
  • Kinesis or SNS/SQS for data streaming.
  • IAM Roles for security.
  • Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
  • Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
  • DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
  • Version Control: Proficient with Git-based workflows.
  • Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

  • Knowledge of data modeling and data warehouse design principles.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
  • Exposure to other programming languages like Scala or Java.






Codezen Tech Solutions

at Codezen Tech Solutions

1 recruiter
Noorun Rehmani
Posted by Noorun Rehmani
Mumbai, Navi Mumbai, Raipur
3 - 5 yrs
₹7L - ₹15L / yr
skill iconPython
skill iconDjango
skill iconFlask
skill iconGit
Celery
+6 more

Responsibilities and Duties:

  • Expert in Python, with knowledge of Python web framework - Django
  • Familiarity with backend automated testing tools and frameworks
  • Experience with backend/API development
  • Architect, develop and maintain backend libraries/codebase, database & server.
  • Develop object-oriented software, with mastery of one or more relevant languages (Django).
  • Evaluate competitive and innovative products and design approaches to identify best practices and encourage innovation.
  • Understanding the requirement of a client, document the scope and chart out a plan of implementing the scope
  • Work with design team to give inputs related to the wire-frames and then the design along with incorporating Client Feedback
  • Explore the difference between B2B and B2C projects before implementing the code
  • Work in teams of 2-3 on various projects as per the requirement using git as version control
  • Having good knowledge of APIs creation and database architecture
  • Good Grasp in respective technology (Django)
  • Documenting the process and main functions along the developing process
  • Design and develop highly scalable, highly available, reliable, secure, and fault-tolerant systems with minimal guidance for one of the fastest-growing companies in India.

Required Experience, Skills and Qualifications:

  • 3-5 years of experience required
  • Strong hand on Django-Python
  • Excellent knowledge of using the Git version control system and deployment via Git.
  • You have creative visualization, critical thinking, deductive & pragmatic reasoning and can think out-of-the-box
  • Ability to quickly adapt & independently work in a fast-paced Agile environment with minimum supervision.
  • A self-starter with demonstrated ability to take initiative, who can proactively identify issues/opportunities and recommend action.
Experiencecom
Remote only
7 - 12 yrs
₹20L - ₹35L / yr
Google Cloud Platform (GCP)
Big Data
skill iconPython
SQL
pandas
+3 more

Description


Come Join Us


Experience.com - We make every experience matter more

Position: Senior GCP Data Engineer

Job Location: Chennai (Base Location) / Remote

Employment Type: Full Time


Summary of Position

A Senior Data Engineer is a professional who specializes in preparing big data infrastructure for analytical or operational uses. He/she is responsible for developing and maintaining scalable data pipelines and building out new API integrations to support continuing increases in data volume and complexity. They collaborate with data scientists and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organisation.


Responsibilities:

  • Collaborate with cross-functional teams to define, prioritize, and execute data engineering initiatives aligned with business objectives.
  • Design and implement scalable, reliable, and secure data solutions by industry best practices and compliance requirements.
  • Drive the adoption of cloud-native technologies and architectural patterns to optimize the performance, cost, and reliability of data pipelines and analytics solutions.
  • Mentor and lead a team of Data Engineers.
  • Demonstrate a drive to learn and master new technologies and techniques.
  • Apply strong problem-solving skills with an emphasis on building data-driven or AI-enhanced products.
  • Coordinate with ML/AI and engineering teams to understand data requirements.
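
As a small illustration of the kind of GCP pipeline work listed above, the sketch below runs a BigQuery query with the google-cloud-bigquery client and pulls the result into pandas; the project, dataset and table names are placeholders.

```python
# Minimal sketch, assuming the google-cloud-bigquery client; project, dataset
# and table names are hypothetical, not Experience.com resources.
from google.cloud import bigquery

client = bigquery.Client()  # credentials picked up from the environment

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example_project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Run the query and pull the result into a pandas DataFrame for downstream use.
df = client.query(query).to_dataframe()
print(df.head())
```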


Experience & Skills:

  • 8+ years of strong experience in ETL and ELT of data from various sources into data warehouses
  • 8+ years of experience in Python, Pandas, Numpy, and SciPy.
  • 5+ years of Experience in GCP 
  • 5+ years of Experience in BigQuery, PySpark, and Pub/Sub
  • 5+ years of Experience working with and creating data architectures.
  • Certified in Google Cloud Professional Data Engineer.
  • Advanced proficiency in Google Cloud services such as Dataflow, Dataproc, Dataprep, Data Studio, and Cloud Composer.
  • Proficient in writing complex Spark (PySpark) User Defined Functions (UDFs), Spark SQL, and HiveQL.
  • Good understanding of Elasticsearch.
  • Experience in assessing and ensuring data quality, data testing, and addressing data quality issues.
  • Excellent understanding of Spark architecture and underlying frameworks including storage management.
  • Solid background in database design and development, database administration, and software engineering across full life cycles.
  • Experience with NoSQL data stores like MongoDB, DocumentDB, and DynamoDB.
  • Knowledge of data governance principles and practices, including data lineage, metadata management, and access control mechanisms.
  • Experience in implementing and optimizing data security controls, encryption, and compliance measures in GCP environments.
  • Ability to troubleshoot complex issues, perform root cause analysis, and implement effective solutions in a timely manner.
  • Proficiency in data visualization tools such as Tableau, Looker, or Data Studio to create insightful dashboards and reports for business users.
  • Strong communication and interpersonal skills to effectively collaborate with technical and non-technical stakeholders, articulate complex concepts, and drive consensus.
  • Experience with agile methodologies and project management tools like Jira or Asana for sprint planning, backlog grooming, and task tracking.


Wissen Technology

at Wissen Technology

4 recruiters
Tony Tom
Posted by Tony Tom
Bengaluru (Bangalore)
2 - 5 yrs
Best in industry
UiPath
skill iconPython
NumPy
pandas

We are seeking a talented UiPath Developer with experience in Python, SQL, Pandas, and NumPy to join our dynamic team. The ideal candidate will have hands-on experience developing RPA workflows using UiPath, along with the ability to automate processes through scripting, data manipulation, and database queries.

This role offers the opportunity to collaborate with cross-functional teams to streamline operations and build innovative automation solutions.

Key Responsibilities:

  • Design, develop, and implement RPA workflows using UiPath.
  • Build and maintain Python scripts to enhance automation capabilities.
  • Utilize Pandas and NumPy for data extraction, manipulation, and transformation within automation processes.
  • Write optimized SQL queries to interact with databases and support automation workflows.
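
A minimal sketch of the Python side of such an automation is shown below, assuming a local SQLite database; the table, columns and priority rule are illustrative only.

```python
# Minimal sketch: pull rows with SQL, clean and enrich them with pandas/NumPy,
# and hand the result back to an RPA workflow as a CSV. Schema is assumed.
import sqlite3

import numpy as np
import pandas as pd

con = sqlite3.connect("invoices.db")

# Extract the rows the UiPath workflow needs to act on.
df = pd.read_sql_query(
    "SELECT invoice_id, vendor, amount, due_date FROM invoices WHERE status = 'open'",
    con,
)

# Clean and enrich before handing back to the automation.
df["due_date"] = pd.to_datetime(df["due_date"])
df["days_overdue"] = (pd.Timestamp.today().normalize() - df["due_date"]).dt.days
df["priority"] = np.where(df["days_overdue"] > 30, "high", "normal")

df.to_csv("open_invoices_prioritised.csv", index=False)
```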

Skills and Qualifications:

  • 2 to 5 years of experience in UiPath development.
  • Strong proficiency in Python and working knowledge of Pandas and NumPy.
  • Good experience with SQL for database interactions.
  • Ability to design scalable and maintainable RPA solutions using UiPath.
Gandhinagar
0 - 2 yrs
₹1L - ₹2.5L / yr
NumPy
pandas
matplotlib
Audio

Job Title: Generative AI Engineer (Specialist in Deep Learning)


Location: Gandhinagar, Ahmedabad, Gujarat

Company: Rayvat Outsourcing

Salary: Up to ₹2,50,000 per annum


Job Type: Full-Time


Experience: 0 to 1 Year


Job Overview:


We are seeking a talented and enthusiastic Generative AI Engineer to join our team. As an Intermediate-level engineer, you will be responsible for developing and deploying state-of-the-art generative AI models to solve complex problems and create innovative solutions. You will collaborate with cross-functional teams, working on a variety of projects that range from natural language processing (NLP) to image generation and multimodal AI systems. The ideal candidate has hands-on experience with machine learning models, deep learning techniques, and a passion for artificial intelligence.


Key Responsibilities:


·        Develop, fine-tune, and deploy generative AI models using frameworks such as GPT, BERT, DALL·E, Stable Diffusion, etc.

·        Research and implement cutting-edge machine learning algorithms in NLP, computer vision, and multimodal systems.

·        Collaborate with data scientists, ML engineers, and product teams to integrate AI solutions into products and platforms.

·        Create APIs and pipelines to deploy models in production environments, ensuring scalability and performance.

·        Analyze large datasets to identify key features, patterns, and use cases for model training.

·        Debug and improve existing models by evaluating performance metrics and applying optimization techniques.

·        Stay up-to-date with the latest advancements in AI, deep learning, and generative models to continually enhance the solutions.

·        Document technical workflows, including model architecture, training processes, and performance reports.

·        Ensure ethical use of AI, adhering to guidelines around AI fairness, transparency, and privacy.
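
As a hedged illustration of the model-deployment work described above, the sketch below runs a small pre-trained generative model through the Hugging Face pipeline API; GPT-2 and the prompt are stand-ins, not a prescribed production model.

```python
# Minimal sketch, assuming the Hugging Face transformers library; GPT-2 is a
# stand-in for whichever generative model a real project would fine-tune/deploy.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # reproducible sampling for experiments

prompt = "Write a one-line product description for a solar-powered lantern:"
outputs = generator(
    prompt,
    max_new_tokens=40,
    num_return_sequences=2,
    do_sample=True,  # sampling is required for multiple return sequences
)

for i, out in enumerate(outputs, start=1):
    print(f"Candidate {i}: {out['generated_text']}")
```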


Qualifications:


·        Bachelor’s/Master’s degree in Computer Science, Machine Learning, Data Science, or a related field.

·        2-4 years of hands-on experience in machine learning and AI development, particularly in generative AI.

·        Proficiency with deep learning frameworks such as TensorFlow, PyTorch, or similar.

·        Experience with NLP models (e.g., GPT, BERT) or image-generation models (e.g., GANs, diffusion models).

·        Strong knowledge of Python and libraries like NumPy, Pandas, scikit-learn, etc.

·        Experience with cloud platforms (e.g., AWS, GCP, Azure) for AI model deployment and scaling.

·        Familiarity with APIs, RESTful services, and microservice architectures.

·        Strong problem-solving skills and the ability to troubleshoot and optimize AI models.

·        Good understanding of data preprocessing, feature engineering, and handling large datasets.

·        Excellent written and verbal communication skills, with the ability to explain complex concepts clearly.


Preferred Skills:


·        Experience with multimodal AI systems (combining text, image, and/or audio data).

·        Familiarity with ML Ops and CI/CD pipelines for deploying machine learning models.

·        Experience in A/B testing and performance monitoring of AI models in production.

·        Knowledge of ethical AI principles and AI governance.


What We Offer:


·        Competitive salary and benefits package.

·        Opportunities for professional development and growth in the rapidly evolving AI field.

·        Collaborative and dynamic work environment, with access to cutting-edge AI technologies.

·        Work on impactful projects with real-world applications.

Remote only
4 - 5 yrs
₹9.6L - ₹12L / yr
SQL
RESTful APIs
skill iconPython
pandas
ETL

We are seeking a Data Engineer (Snowflake, BigQuery, Redshift) to join our team. In this role, you will be responsible for the development and maintenance of fault-tolerant pipelines spanning multiple database systems.


Responsibilities:

  • Collaborate with engineering teams to create REST API-based pipelines for large-scale MarTech systems, optimizing for performance and reliability.
  • Develop comprehensive data quality testing procedures to ensure the integrity and accuracy of data across all pipelines.
  • Build scalable dbt models and configuration files, leveraging best practices for efficient data transformation and analysis.
  • Partner with lead data engineers in designing scalable data models.
  • Conduct thorough debugging and root cause analysis for complex data pipeline issues, implementing effective solutions and optimizations.
  • Follow and adhere to group's standards such as SLAs, code styles, and deployment processes.
  • Anticipate breaking changes and implement backwards-compatibility strategies for API schema changes.
  • Assist the team in monitoring pipeline health via observability tools and metrics.
  • Participate in refactoring efforts as platform application needs evolve over time.
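
To make the REST-to-warehouse flow above concrete, here is a minimal sketch using requests, pandas and SQLAlchemy, with two simple data-quality checks before the load; the API URL, columns, table and connection string are hypothetical.

```python
# Minimal sketch of a REST-to-warehouse step. The MarTech API, the contact_id/
# email columns and the Postgres connection string are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Extract: call a (hypothetical) MarTech REST API.
resp = requests.get(
    "https://api.example-martech.com/v1/contacts",
    params={"page_size": 500},
    timeout=30,
)
resp.raise_for_status()
records = resp.json()["results"]

df = pd.json_normalize(records)

# Basic data-quality checks before loading.
assert df["contact_id"].is_unique, "duplicate contact_id values in API payload"
assert df["email"].notna().all(), "null emails in API payload"

# Load into a warehouse staging table (Postgres shown; Snowflake/BigQuery/
# Redshift would mainly change the connection string and dialect).
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/warehouse")
df.to_sql("stg_contacts", engine, schema="staging", if_exists="replace", index=False)
```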


Requirements:

  • Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related field.
  • 3+ years of professional experience with a cloud database such as Snowflake, BigQuery, or Redshift.
  • 1+ years of professional experience with dbt (cloud or core).
  • Exposure to various data processing technologies such as OLAP and OLTP and their applications in real-world scenarios.
  • Exposure to work cross-functionally with other teams such as Product, Customer Success, Platform Engineering.
  • Familiarity with orchestration tools such as Dagster/Airflow.
  • Familiarity with ETL/ELT tools such as dltHub/Meltano/Airbyte/Fivetran and DBT.
  • High intermediate to advanced SQL skills (comfort with CTEs, window functions).
  • Proficiency with Python and related libraries (e.g., pandas, sqlalchemy, psycopg2) for data manipulation, analysis, and automation.


Benefits:

  • Work Location: Remote
  • 5 days working​


You can apply directly through the link: https://zrec.in/e9578?source=CareerSite


Explore our Career Page for more such jobs : careers.infraveo.com



Someshwara Software
Chandana Kandukur
Posted by Chandana Kandukur
Bengaluru (Bangalore)
2 - 4 yrs
₹4L - ₹12L / yr
skill iconPython
TensorFlow
pandas
skill iconGit
skill iconFlask
+6 more

 Job Description: AI/ML Engineer

 

Location: Bangalore (On-site)  

Experience: 2+ years of relevant experience

 

About the Role:

We are seeking a skilled and passionate AI/ML Engineer to join our team in Bangalore. The ideal candidate will have over two years of experience in developing, deploying, and maintaining AI and machine learning models. As an AI/ML Engineer, you will work closely with our data science team to build innovative solutions and deploy them in a production environment.

 

 Key Responsibilities:

- Develop, implement, and optimize machine learning models.

- Perform data manipulation, exploration, and analysis to derive actionable insights.

- Use advanced computer vision techniques, including YOLO and other state-of-the-art methods, for image processing and analysis.

- Collaborate with software developers and data scientists to integrate AI/ML solutions into the company's applications and products.

- Design, test, and deploy scalable machine learning solutions using TensorFlow, OpenCV, and other related technologies.

- Ensure the efficient storage and retrieval of data using SQL and data manipulation libraries such as pandas and NumPy.

- Contribute to the development of backend services using Flask or Django for deploying AI models.

- Manage code using Git and containerize applications using Docker when necessary.

- Stay updated with the latest advancements in AI/ML and integrate them into existing projects.
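
A minimal, illustrative sketch of the preprocessing-plus-model flow referred to above is given below, using OpenCV and TensorFlow/Keras; the image path and the two-class head are assumptions, not the company's actual pipeline.

```python
# Minimal sketch, assuming OpenCV + TensorFlow/Keras; the image path and the
# two-class setup are illustrative only.
import cv2
import numpy as np
import tensorflow as tf

def preprocess(path: str) -> np.ndarray:
    """Read an image, convert BGR -> RGB, resize and scale to [0, 1]."""
    img = cv2.imread(path)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    img = cv2.resize(img, (224, 224))
    return img.astype("float32") / 255.0

# A small CNN classifier as a stand-in for a production model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

batch = np.expand_dims(preprocess("sample.jpg"), axis=0)
print(model.predict(batch))  # untrained weights, so probabilities are arbitrary
```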

 

Required Skills:

- Proficiency in Python and its associated libraries (NumPy, pandas).

- Hands-on experience with TensorFlow for building and training machine learning models.

- Strong knowledge of linear algebra and data augmentation techniques.

- Experience with computer vision libraries like OpenCV and frameworks like YOLO.

- Proficiency in SQL for database management and data extraction.

- Experience with Flask for backend development.

- Familiarity with version control using Git.

 

 Optional Skills:

- Experience with PyTorch, Scikit-learn, and Docker.

- Familiarity with Django for web development.

- Knowledge of GPU programming using CuPy and CUDA.

- Understanding of parallel processing techniques.

 

Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Demonstrated experience in AI/ML, with a portfolio of past projects.

- Strong analytical and problem-solving skills.

- Excellent communication and teamwork skills.

 

 Why Join Us?

- Opportunity to work on cutting-edge AI/ML projects.

- Collaborative and dynamic work environment.

- Competitive salary and benefits.

- Professional growth and development opportunities.

 

If you're excited about using AI/ML to solve real-world problems and have a strong technical background, we'd love to hear from you!

 

Apply now to join our growing team and make a significant impact!

TVARIT GmbH

at TVARIT GmbH

2 candid answers
Shivani Kawade
Posted by Shivani Kawade
Pune
4 - 6 yrs
₹15L - ₹25L / yr
PyTorch
skill iconPython
Scikit-Learn
NumPy
pandas
+2 more

Who are we looking for?  


We are looking for a Senior Data Scientist who will design and develop data-driven solutions using state-of-the-art methods. You should be someone with strong and proven experience in working on data-driven solutions. If you're enthusiastic about transforming business requirements into insightful data-driven solutions, you are welcome to join our fast-growing team to unlock your best potential.

 

Job Summary 

  • Supporting company mission by understanding complex business problems through data-driven solutions. 
  • Designing and developing machine learning pipelines in Python and deploying them in AWS/GCP, ... 
  • Developing end-to-end ML production-ready solutions and visualizations. 
  • Analyse large sets of time-series industrial data from various sources, such as production systems, sensors, and databases to draw actionable insights and present them via custom dashboards. 
  • Communicating complex technical concepts and findings to non-technical stakeholders of the projects 
  • Implementing the prototypes using suitable statistical tools and artificial intelligence algorithms. 
  • Preparing high-quality research papers and participating in conferences to present and report experimental results and research findings. 
  • Carrying out research collaborating with internal and external teams and facilitating review of ML systems for innovative ideas to prototype new models. 
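
As a small example of the time-series analysis described above, the sketch below resamples sensor readings with pandas and flags simple rolling z-score anomalies; the CSV name, columns and thresholds are assumptions.

```python
# Minimal sketch of time-series preparation with pandas only; the sensor CSV,
# column names and the 3-sigma threshold are illustrative assumptions.
import pandas as pd

df = pd.read_csv("furnace_sensor.csv", parse_dates=["timestamp"]).set_index("timestamp")

# Resample raw readings to 1-minute means and smooth with a rolling window.
temp = df["temperature"].resample("1min").mean()
rolling_mean = temp.rolling("30min").mean()
rolling_std = temp.rolling("30min").std()

# Flag simple anomalies: points more than 3 standard deviations from the local mean.
z_score = (temp - rolling_mean) / rolling_std
anomalies = temp[z_score.abs() > 3]

print(f"{len(anomalies)} anomalous readings out of {len(temp)}")
```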

 

Qualification and experience 

  • B.Tech/Masters/Ph.D. in computer science, electrical engineering, mathematics, data science, and related fields. 
  • 5+ years of professional experience in the field of machine learning, and data science. 
  • Experience with large-scale Time-series data-based production code development is a plus. 

 

Skills and competencies 

  • Familiarity with Docker and ML libraries like PyTorch, scikit-learn, pandas, SQL, and Git is a must. 
  • Ability to work on multiple projects. Must have strong design and implementation skills. 
  • Ability to conduct research based on complex business problems. 
  • Strong presentation skills and the ability to collaborate in a multi-disciplinary team. 
  • Must have programming experience in Python. 
  • Excellent English communication skills, both written and verbal. 


Benefits and Perks

  • Culture of innovation, creativity, learning, and even failure, we believe in bringing out the best in you. 
  • Progressive leave policy for effective work-life balance. 
  • Get mentored by highly qualified internal resource groups and opportunity to avail industry-driven mentorship program, as we believe in empowering people.  
  • Multicultural peer groups and supportive workplace policies.  
  • Work from beaches, hills, mountains, and many more with the yearly workcation program; we believe in mixing elements of vacation and work. 


 Hiring Process 

  • Call with Talent Acquisition Team: After application screening, a first-level screening with the talent acquisition team to understand the candidate's goals and alignment with the job requirements. 
  • First Round: Technical round 1 to gauge your domain knowledge and functional expertise. 
  • Second Round: In-depth technical round and discussion about the departmental goals, your role, and expectations.
  • Final HR Round: Culture fit round and compensation discussions.
  • Offer: Congratulations you made it!  


If this position sparked your interest, apply now to initiate the screening process.

golden eagle it technologies pvt ltd
Akansha Kanojia
Posted by Akansha Kanojia
Indore
4 - 5 yrs
₹3L - ₹15L / yr
skill iconPython
skill iconDjango
skill iconFlask
AWS Lambda
NumPy
+2 more

Job Description:-

Designation : Python Developer

Location : Indore | WFO

Skills: Python, Django, Flask, NumPy, pandas, RESTful APIs, AWS.

Python Developer Responsibilities:- 

1. Coordinating with development teams to determine application requirements.

2. Writing scalable code using Python programming language.

3. Testing and debugging applications.

4. Developing back-end components.

5. Integrating user-facing elements using server-side logic.

6. Assessing and prioritizing client feature requests.

7. Integrating data storage solutions.

8. Coordinating with front-end developers.

9. Reprogramming existing databases to improve functionality.

10. Developing digital tools to monitor online traffic.

Python Developer Requirements:-

1. Bachelor's degree in computer science, computer engineering, or related field.

2. At Least 3+ years of experience as a Python developer.

3. Expert knowledge of Python and related frameworks including Django and Flask.

4. A deep understanding of multi-process architecture and the threading limitations of Python.

5. Familiarity with server-side templating languages including Jinja 2 and Mako.

6. Ability to integrate multiple data sources into a single system.

7. Familiarity with testing tools.

8. Ability to collaborate on projects and work independently when required.

Skills - Python, Django, Flask, NumPy, pandas, RESTful APIs, AWS.

P99soft
anu sha
Posted by anu sha
Bengaluru (Bangalore)
6 - 12 yrs
₹10L - ₹15L / yr
skill iconPython
skill iconFlask
pandas
Web Development
azure

Role / Designation : Python Developer

 

Location: Bangalore, India

Skills: Certifications: AI-900, AZ-900. Primary skills: Python, Flask, web development. Knowledge of Azure Cloud, application development, API development.


Profile: IT professional with 6+ years of experience in:

• Hands-on experience with Python libraries such as pandas, NumPy, and OpenPyXL

• Hands-on experience using Python libraries with multiple document types (Excel, CSV, PDF, and images)

• Working with huge data sets, performing data analysis, and providing ETL and EDA reports

• 5+ years' experience in programming languages like Python (mandatory), Java, and C/C++

• Must have experience with Azure PaaS and IaaS services like Azure Function App, Kubernetes Service, Storage Account, Key Vault, etc.

• Experience with databases such as SQL and NoSQL

• Develop methodologies for data analysis, data extraction, data transformation, and preprocessing of data.

• Experience in deploying applications, packages in Azure environment. 

• Writing scalable code using Python programming language.

 • Testing and debugging applications.

 • Developing back-end components.

 • Integrating user-facing elements using server-side logic. 

• Excellent problem solving/analytical skills and complex troubleshooting methods.

 • Ability to work through ambiguous situations.

• Excellent presentation, verbal, and written communication skills.

Education: BE/BTech/BSc





Wheelseye Technology

at Wheelseye Technology

5 recruiters
Mohit Sharma
Posted by Mohit Sharma
Gurugram
2 - 5 yrs
₹5L - ₹15L / yr
SQL
pandas
skill iconPython
A/B Testing

REQUIREMENTS

Core skills:

Technical Experience (Must have): working knowledge of any visualization tool (Metabase, Tableau, QlikSense, Looker, Superset, Power BI, etc.), strong SQL & Python, Excel/Google Sheets.

Product Knowledge (Must have): knowledge of Google Analytics/BigQuery or Mixpanel; must have worked on A/B testing & events writing. Must be familiar with product (app, website) data and have good product sense.

Analytical Thinking: outstanding analytical and problem-solving skills; ability to break down the problem statement during execution.


Core Experience:

● Overall experience of 2-5 years in the analytics domain

● He/she should have hands-on experience in the analytics domain around making data story dashboards, doing RCA & analyzing data.

● Understanding of and hands-on experience with the product, i.e. funnels, A/B experiments, etc.

● Ability to define the right metric for a specific product feature or experiment & do the impact analysis.

● Ability to explain complex data insights to a wider audience & communicate the next steps & recommendations.

● Experience in analyzing, exploring, and mining large data sets to support reporting and ad-hoc analysis.

● Strong attention to detail and accuracy of output.
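
For illustration, a minimal A/B conversion test of the kind referenced above could look like the sketch below, using statsmodels; the counts are made-up numbers, not Wheelseye data.

```python
# Minimal sketch of an A/B conversion test with statsmodels; all numbers are
# illustrative placeholders.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]    # converted users in control (A) and variant (B)
exposures = [10000, 10000]  # users exposed to each variant

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]

print(f"absolute lift = {lift:.2%}, z = {stat:.2f}, p-value = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would suggest the variant's conversion rate
# differs from control; choosing the right metric and guardrails still depends
# on the experiment design.
```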

Mitibase
Vaidehi Ghangurde
Posted by Vaidehi Ghangurde
Pune
2 - 4 yrs
₹6L - ₹8L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconReact.js
skill iconAngular (2+)
skill iconJavascript
+6 more

·      The Objective:

You will play a crucial role in designing, implementing, and maintaining our data infrastructure, running tests, and updating the systems.


·      Job function and requirements

 

o  Expert in Python, pandas and NumPy, with knowledge of Python web frameworks such as Django and Flask.

o  Able to integrate multiple data sources and databases into one system.

o  Basic understanding of frontend technologies like HTML, CSS, JavaScript.

o  Able to build data pipelines.

o  Strong unit test and debugging skills.

o  Understanding of fundamental design principles behind a scalable application

o  Good understanding of RDBMS databases such as MySQL or PostgreSQL.

o  Able to analyze and transform raw data.

 

·      About us

Mitibase helps companies find warm prospects every month that are most relevant, and then helps their team to act on those with automation. We do so by automatically tracking key accounts and contacts for job changes and relationships triggers and surfaces them as warm leads in your sales pipeline.

ZeMoSo Technologies

at ZeMoSo Technologies

11 recruiters
HR Team
Posted by HR Team
Remote only
3 - 6 yrs
Best in industry
skill iconMachine Learning (ML)
skill iconData Science
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+5 more

Job Description: 

Machine Learning / AI Engineer (with 3+ years of experience)


We are seeking a highly skilled and passionate Machine Learning / AI Engineer to join our newly established data science practice area. In this role, you will primarily focus on working with Large Language Models (LLMs) and contribute to building generative AI applications. This position offers an exciting opportunity to shape the future of AI technology while charting an interesting career path within our organization.


Responsibilities:


1. Develop and implement machine learning models: Utilize your expertise in machine learning and artificial intelligence to design, develop, and deploy cutting-edge models, with a particular emphasis on Large Language Models (LLMs). Apply your knowledge to solve complex problems and optimize performance.


2. Building generative AI applications: Collaborate with cross-functional teams to conceptualize, design, and build innovative generative AI applications. Work on projects that push the boundaries of AI technology and deliver impactful solutions to real-world problems.


3. Data preprocessing and analysis: Collect, clean, and preprocess large volumes of data for training and evaluation purposes. Conduct exploratory data analysis to gain insights and identify patterns that can enhance the performance of AI models.


4. Model training and evaluation: Develop robust training pipelines for machine learning models, incorporating best practices in model selection, feature engineering, and hyperparameter tuning. Evaluate model performance using appropriate metrics and iterate on the models to improve accuracy and efficiency.


5. Research and stay up to date: Keep abreast of the latest advancements in machine learning, natural language processing, and generative AI. Stay informed about industry trends, emerging techniques, and open-source libraries, and apply relevant findings to enhance the team's capabilities.


6. Collaborate and communicate effectively: Work closely with a multidisciplinary team of data scientists, software engineers, and domain experts to drive AI initiatives. Clearly communicate complex technical concepts and findings to both technical and non-technical stakeholders.


7. Experimentation and prototyping: Explore novel ideas, experiment with new algorithms, and prototype innovative solutions. Foster a culture of innovation and contribute to the continuous improvement of AI methodologies and practices within the organization.


Requirements:


1. Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Relevant certifications in machine learning, deep learning, or AI are a plus.


2. Experience: A minimum of 3+ years of professional experience as a Machine Learning / AI Engineer, with a proven track record of developing and deploying machine learning models in real-world applications.


3. Strong programming skills: Proficiency in Python and experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and libraries (e.g., scikit-learn, pandas). Experience with cloud platforms (e.g., AWS, Azure, GCP) for model deployment is preferred.


4. Deep-learning expertise: Strong understanding of deep learning architectures (e.g., convolutional neural networks, recurrent neural networks, transformers) and familiarity with Large Language Models (LLMs) such as GPT-3, GPT-4, or equivalent.


5. Natural Language Processing (NLP) knowledge: Familiarity with NLP techniques, including tokenization, word embeddings, named entity recognition, sentiment analysis, text classification, and language generation.


6. Data manipulation and preprocessing skills: Proficiency in data manipulation using SQL and experience with data preprocessing techniques (e.g., cleaning, normalization, feature engineering). Familiarity with big data tools (e.g., Spark) is a plus.


7. Problem-solving and analytical thinking: Strong analytical and problem-solving abilities, with a keen eye for detail. Demonstrated experience in translating complex business requirements into practical machine learning solutions.


8. Communication and collaboration: Excellent verbal and written communication skills, with the ability to explain complex technical concepts to diverse stakeholders


[x]cube LABS

at [x]cube LABS

2 candid answers
1 video
Krishna kandregula
Posted by Krishna kandregula
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
PowerBI
DAX
+12 more
  • Creating and managing ETL/ELT pipelines based on requirements
  • Build PowerBI dashboards and manage datasets needed.
  • Work with stakeholders to identify data structures needed for the future and perform any transformations, including aggregations.
  • Build data cubes for real-time visualisation needs and CXO dashboards.
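
A minimal sketch of preparing such an aggregated dataset with pandas (pyarrow-backed Parquet I/O) is shown below; the file paths and columns are placeholders.

```python
# Minimal sketch of preparing an aggregated dataset for a BI dashboard, assuming
# pandas with the pyarrow engine; paths and column names are illustrative.
import pandas as pd

orders = pd.read_parquet("raw/orders.parquet")  # pyarrow-backed read

# Aggregate to the grain the dashboard needs (monthly revenue per category).
orders["order_month"] = (
    pd.to_datetime(orders["order_date"]).dt.to_period("M").dt.to_timestamp()
)
cube = (orders.groupby(["order_month", "category"], as_index=False)
              .agg(revenue=("amount", "sum"), orders=("order_id", "nunique")))

# Persist for the visualisation layer (e.g. Power BI / Synapse) to pick up.
cube.to_parquet("curated/monthly_revenue_cube.parquet", index=False)
```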


Required Tech Skills


  • Microsoft PowerBI & DAX
  • Python, pandas, PyArrow, Jupyter Notebooks, Apache Spark
  • Azure Synapse, Azure DataBricks, Azure HDInsight, Azure Data Factory



An 8 year old IT Services and consulting company.

An 8 year old IT Services and consulting company.

Agency job
via Startup Login by Shreya Sanchita
Remote, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹20L - ₹55L / yr
skill iconPython
skill iconDjango
skill iconFlask
skill iconData Analytics
skill iconData Science
+11 more

CTC Budget: 35-55LPA

Location: Hyderabad (Remote after 3 months WFO)


Company Overview:


An 8-year-old IT services and consulting company based in Hyderabad, providing services in maximizing product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


  • 6+ years of experience as a Python developer.
  • Experience in web development using Python and the Django framework.
  • Experience in data analysis and data science using pandas, NumPy and scikit-learn (GTH).
  • Experience in developing user interfaces using HTML, JavaScript, CSS.
  • Experience in server-side templating languages including Jinja2 and Mako.
  • Knowledge of Kafka and RabbitMQ (GTH).
  • Experience with Docker, Git and AWS.
  • Ability to integrate multiple data sources into a single system.
  • Ability to collaborate on projects and work independently when required.
  • DB (MySQL, PostgreSQL, SQL)


Selection Process: 2-3 Interview rounds (Tech, VP, Client)

Cambridge Technology

at Cambridge Technology

2 recruiters
Muthyala Shirish Kumar
Posted by Muthyala Shirish Kumar
Hyderabad
2 - 15 yrs
₹10L - ₹40L / yr
skill iconData Science
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
recommendation algorithm
+7 more

From building entire infrastructures or platforms to solving complex IT challenges, Cambridge Technology helps businesses accelerate their digital transformation and become AI-first businesses. With over 20 years of expertise as a technology services company, we enable our customers to stay ahead of the curve by helping them figure out the perfect approach, solutions, and ecosystem for their business. Our experts help customers leverage the right AI, big data, cloud solutions, and intelligent platforms that will help them become and stay relevant in a rapidly changing world.


No Of Positions: 1


Skills required: 

  • The ideal candidate will have a bachelor’s degree in data science, statistics, or a related discipline with 4-6 years of experience, or a master’s degree with 4-6 years of experience. A strong candidate will also possess many of the following characteristics:
  • Strong problem-solving skills with an emphasis on achieving proof-of-concept
  • Knowledge of statistical techniques and concepts (regression, statistical tests, etc.)
  • Knowledge of machine learning and deep learning fundamentals
  • Experience with Python implementations to build ML and deep learning algorithms (e.g., pandas, NumPy, scikit-learn, statsmodels, Keras, PyTorch, etc.)
  • Experience writing and debugging code in an IDE
  • Experience using managed web services (e.g., AWS, GCP, etc.)
  • Strong analytical and communication skills
  • Curiosity, flexibility, creativity, and a strong tolerance for ambiguity
  • Ability to learn new tools from documentation and internet resources.

Roles and responsibilities :

  • You will work on a small, core team alongside other engineers and business leaders throughout Cambridge with the following responsibilities:
  • Collaborate with client-facing teams to design and build operational AI solutions for client engagements.
  • Identify relevant data sources for data wrangling and EDA
  • Identify model architectures to use for client business needs.
  • Build full-stack data science solutions up to MVP that can be deployed into existing client business processes or scaled up based on clear documentation.
  • Present findings to teammates and key stakeholders in a clear and repeatable manner.

Experience :

2 - 14 Yrs

JRD Systems

at JRD Systems

1 recruiter
Lavanya B
Posted by Lavanya B
Bengaluru (Bangalore)
4 - 8 yrs
₹4L - ₹8L / yr
skill iconPython
skill iconDjango
skill iconFlask
Object Oriented Programming (OOPs)
skill iconAmazon Web Services (AWS)
+9 more

·      4+ years of experience as a Python Developer.

·      Good understanding of object-oriented concepts and SOLID principles.

·      Good programming and analytical skills.

·      Should have hands-on experience with AWS cloud services like S3 and Lambda functions. (Must Have)

·      Should have experience working with large datasets. (Must Have)

·      Proficient in using NumPy and pandas. (Must Have)

·      Should have hands-on experience with MySQL. (Must Have)

·      Should have experience in debugging Python applications (Must have)

·      Knowledge of working on Flask.

·      Knowledge of object-relational mapping (ORM).

·      Able to integrate multiple data sources and databases into one system

·      Proficient understanding of code versioning tools such as Git, SVN

·      Strong at problem-solving and logical abilities

·      Sound knowledge of Front-end technologies like HTML5, CSS3, and JavaScript 

·      Strong commitment and desire to learn and grow.

Polybyte Technologies
Ahmedabad
1 - 3 yrs
₹4L - ₹10L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+6 more

We are seeking a skilled and motivated Python Full Stack Developer to join us. The ideal candidate will have experience with Python, JavaScript and its related technologies, as well as a passion for developing efficient and scalable software solutions.


Responsibilities:


  • Design and develop high-quality, scalable applications using Python, Django, DRF, FastAPI and JavaScript frameworks such as React or Vue.js
  • Analyze business requirements and develop software solutions to meet those needs
  • Write clean, maintainable, and efficient code
  • Test software solutions to ensure they meet performance, scalability, and reliability requirements
  • Debug and troubleshoot issues in the software
  • Stay up-to-date with emerging trends and technologies in Python development


Qualifications:


  • Bachelor's or Master's degree in Computer Science or related field
  • At least 2 years of experience in developing applications using Python, Django, DRF or FastAPI.
  • At least 2 years of experience in using front-end JavaScript frameworks such as jQuery, React or Vue.js
  • Experience with database technologies such as PostgreSQL and MongoDB
  • Experience with AWS or other cloud platforms
  • Ability to write clean and maintainable code
  • Strong analytical and problem-solving skills
  • Excellent written and verbal communication skills


Nice to have:


  • Knowledge of Trading in stocks, forex, futures etc.
  • Knowledge of automated trading
  • Experience with Different Trading Platforms


We offer:


  • Competitive salary
  • Flexible working hours


Job Types: Full-time, Regular / Permanent


Salary: ₹400,000.00 - ₹1,000,000.00 per year


Benefits:

  • Flexible schedule

Schedule:

  • Day shift
  • Monday to Friday

Supplemental pay types:

  • Overtime pay
  • Yearly bonus
  • Performance-based bonus

Ability to commute/relocate:

  • Mondeal heights, SG Highway, Ahmedabad - 380015, Gujarat: Reliably commute or planning to relocate before starting work (Required)

Education:

  • Bachelor's (Preferred)

Experience:

  • Python: 1-3 years (Required)
  • JavaScript: 1-3 years (Required)
[x]cube LABS

at [x]cube LABS

2 candid answers
1 video
Samudrala SaiAnvesh
Posted by Samudrala SaiAnvesh
Hyderabad
3 - 5 yrs
skill iconPython
skill iconDjango
skill iconFlask
PyData
pandas
+3 more

Job Description:


  • 3 - 4 years of hands-on Python programming & libraries like PyData, Pandas
  • Exposure to Mongo DB
  • Experience in writing Unit Test cases
  • Expertise in writing medium/advanced SQL Database queries
  • Strong Verbal/Written communication skills
  • Ability to work with onsite counterpart teams


Cubera Tech India Pvt Ltd
Surabhi Koushik
Posted by Surabhi Koushik
Bengaluru (Bangalore)
2 - 3 yrs
₹24L - ₹35L / yr
skill iconData Science
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
SQL
+6 more

Data Scientist

 

Cubera is a data company revolutionizing big data analytics and Adtech through data share value principles wherein the users entrust their data to us. We refine the art of understanding, processing, extracting, and evaluating the data that is entrusted to us. We are a gateway for brands to increase their lead efficiency as the world moves towards web3.

 

What you’ll do?

 

  • Build machine learning models, perform proof-of-concept, experiment, optimize, and deploy your models into production; work closely with software engineers to assist in productionizing your ML models.
  • Establish scalable, efficient, automated processes for large-scale data analysis, machine-learning model development, model validation, and serving.
  • Research new and innovative machine learning approaches.
  • Perform hands-on analysis and modeling of enormous data sets to develop insights that increase Ad Traffic and Campaign Efficacy.
  • Collaborate with other data scientists, data engineers, product managers, and business stakeholders to build well-crafted, pragmatic data products.  
  • Actively take on new projects and constantly try to improve the existing models and infrastructure necessary for offline and online experimentation and iteration.
  • Work with your team on ambiguous problem areas in existing or new ML initiatives
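
As a toy illustration of the model-building loop described above, the sketch below trains and evaluates a simple scikit-learn classifier on synthetic click data; real work would of course use production features and proper validation.

```python
# Minimal sketch with scikit-learn; the synthetic click data stands in for real
# ad-traffic features and is purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))  # user/ad features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=5000) > 1).astype(int)  # clicked?

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC = {auc:.3f}")
```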

What are we looking for?

  • Ability to write a SQL query to pull the data you need.
  • Fluency in Python and familiarity with its scientific stack, such as NumPy, pandas, scikit-learn and matplotlib.
  • Experience in TensorFlow, R modelling and/or PyTorch.
  • Ability to understand a business problem and translate and structure it into a data science problem.

 

Job Category: Data Science

Job Type: Full Time

Job Location: Bangalore

 

Chennai, Bengaluru (Bangalore), Hyderabad
5 - 8 yrs
₹10L - ₹25L / yr
skill iconPython
Robot Framework
Selenium
pandas

Experience : 5-8 years

 

Location : Bangalore, Chennai and Hyderabad

 

 

 

Python Developer (1 Position)

Must have skills: 

·        Experience in advanced Python

·        Experience in GUI/Test Automation tools/libraries (Robot Framework, Selenium, & Sikuli etc.)

·        Ability to create UI Automation scripts to execute in Remote/Citrix servers

·        Knowledge in analytical libraries like Pandas, Numpy, Scipy, PyTorch etc.

·        AWS skillset

 

Nice to have skills:

·        Experience in SQL and Big Data analytic tools like Hive and Hue

·        Experience in Machine learning

·        Experience in Linux administration

Indium Software

at Indium Software

16 recruiters
Swaathipriya P
Posted by Swaathipriya P
Bengaluru (Bangalore), Hyderabad
2 - 5 yrs
₹1L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more
2+ years of analytics experience, predominantly in SQL, SAS, statistics, R, Python and visualization
Experienced in writing complex SQL select queries (window functions & CTEs) with advanced SQL experience
Should be an individual contributor for the initial few months; based on project movement, a team will be aligned
Strong in querying logic and data interpretation
Solid communication and articulation skills
Able to handle stakeholders independently with less interventions of reporting manager
Develop strategies to solve problems in logical yet creative ways
Create custom reports and presentations accompanied by strong data visualization and storytelling
One of our Premium Client

One of our Premium Client

Agency job
Chennai
3 - 8 yrs
₹3L - ₹17L / yr
skill iconData Science
skill iconMachine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
skill iconDeep Learning
+7 more

Job Description – Data Science  

 

Basic Qualification:

  • ME/MS from premier institute with a background in Mechanical/Industrial/Chemical/Materials engineering.
  • Strong Analytical skills and application of Statistical techniques to problem solving
  • Expertise in algorithms, data structures and performance optimization techniques
  • Proven track record of demonstrating end to end ownership involving taking an idea from incubator to market
  • Minimum 2+ years of experience in data analysis, statistical analysis, data mining, and algorithms for optimization.

Responsibilities

The Data Engineer/Analyst will

  • Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
  • Clear interaction with Business teams including product planning, sales, marketing, finance for defining the projects, objectives.
  • Mine and analyze data from company databases to drive optimization and improvement of product and process development, marketing techniques and business strategies
  • Coordinate with different R&D and Business teams to implement models and monitor outcomes.
  • Mentor team members towards developing quick solutions for business impact.
  • Skilled at all stages of the analysis process including defining key business questions, recommending measures, data sources, methodology and study design, dataset creation, analysis execution, interpretation and presentation and publication of results.

 

Preferred Qualifications
  • 4+ years’ experience in MNC environment with projects involving ML, DL and/or DS
  • Experience in Machine Learning, Data Mining or Machine Intelligence (Artificial Intelligence)
  • Knowledge on Microsoft Azure will be desired.
  • Expertise in machine learning such as Classification, Data/Text Mining, NLP, Image Processing, Decision Trees, Random Forest, Neural Networks, Deep Learning Algorithms
  • Proficient in Python and its various libraries such as NumPy, Matplotlib, and pandas
  • Superior verbal and written communication skills, ability to convey rigorous mathematical concepts and considerations to Business Teams.
  • Experience in infra development / building platforms is highly desired.
  • A drive to learn and master new technologies and techniques.
Novo

at Novo

2 recruiters
Dishaa Ranjan
Posted by Dishaa Ranjan
Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹25L - ₹35L / yr
SQL
skill iconPython
pandas
Scikit-Learn
TensorFlow
+1 more

About Us: 

Small businesses are the backbone of the US economy, comprising almost half of the GDP and the private workforce. Yet, big banks don’t provide the access, assistance and modern tools that owners need to successfully grow their business. 


We started Novo to challenge the status quo—we’re on a mission to increase the GDP of the modern entrepreneur by creating the go-to banking platform for small businesses (SMBs). Novo is flipping the script of the banking world, and we’re excited to lead the small business banking revolution.


At Novo, we’re here to help entrepreneurs, freelancers, startups and SMBs achieve their financial goals by empowering them with an operating system that makes business banking as easy as iOS. We developed modern bank accounts and tools to help save time and increase cash flow. Our unique product integrations enable easy access to tracking payments, transferring money internationally, managing business transactions and more. We’ve made a big impact in a short amount of time, helping thousands of organizations access powerfully simple business banking.



We are looking for a Senior Data Scientist who is enthusiastic about using data and technology to solve complex business problems. If you're passionate about leading and helping to architect and develop thoughtful data solutions, then we want to chat. Are you ready to revolutionize the small business banking industry with us?


About the Role:


  • Build and manage predictive models focussed on credit risk, fraud, conversions, churn, consumer behaviour etc
  • Provides best practices, direction for data analytics and business decision making across multiple projects and functional areas
  • Implements performance optimizations and best practices for scalable data models, pipelines and modelling
  • Resolve blockers and help the team stay productive
  • Take part in building the team and iterating on hiring processes
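
For flavour, a minimal sketch of turning model scores into risk bands with scikit-learn and pandas is shown below; the synthetic features and band cut-offs are illustrative, not Novo policy.

```python
# Minimal sketch: train a classifier on synthetic account features and bucket
# its scores into risk bands. Features, labels and cut-offs are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = pd.DataFrame({
    "monthly_inflow": rng.gamma(2.0, 5000, 4000),
    "overdraft_days": rng.poisson(2, 4000),
    "account_age_months": rng.integers(1, 60, 4000),
})
y = (X["overdraft_days"] > 4).astype(int)  # stand-in default flag

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1
)
model = GradientBoostingClassifier().fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]
bands = pd.cut(scores, bins=[0, 0.2, 0.5, 1.0],
               labels=["low", "medium", "high"], include_lowest=True)
print(pd.Series(bands).value_counts())
```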

Requirements for the Role:


  • 4+ years of experience in data science roles focussed on managing data processes, modelling and dashboarding
  • Strong experience in python, SQL and in-depth understanding of modelling techniques
  • Experience working with pandas, scikit-learn, and visualization libraries like Plotly, Bokeh, etc.
  • Prior experience with credit risk modelling will be preferred
  • Deep Knowledge of Python to write scripts to manipulate data and generate automated  reports

How We Define Success:


  • Expand access to data driven decision making across the organization
  • Solve problems in risk, marketing, growth, customer behaviour through analytics models that increase efficacy

Nice To Have, but Not Required:

  • Experience in dashboarding libraries like Python Dash and exposure to CI/CD 
  • Exposure to big data tools like Spark, and some core tech knowledge around API’s, data streaming etc.


Novo values diversity as a core tenant of the work we do and the businesses we serve. We are an equal opportunity employer, indiscriminate of race, religion, ethnicity, national origin, citizenship, gender, gender identity, sexual orientation, age, veteran status, disability, genetic information or any other protected characteristic. 
