Python Jobs in Delhi, NCR and Gurgaon

Apply to 50+ Python Jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest Python Job opportunities across top companies like Google, Amazon & Adobe.

Deqode

at Deqode

1 recruiter
Sneha Jain
Posted by Sneha Jain
Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Indore, Jaipur, Kolkata, Chennai, Bengaluru (Bangalore)
3.5 - 7 yrs
₹8L - ₹13L / yr
AWS Lambda
Python
Microservices
Amazon EC2

We are seeking a highly skilled and motivated Python Developer with hands-on experience in AWS cloud services (Lambda, API Gateway, EC2), microservices architecture, PostgreSQL, and Docker. The ideal candidate will be responsible for designing, developing, deploying, and maintaining scalable backend services and APIs, with a strong emphasis on cloud-native solutions and containerized environments.
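For orientation, here is a minimal sketch of the serverless pattern this role centres on: a Python handler running on AWS Lambda behind an API Gateway proxy integration. The route, request field, and response shape are illustrative assumptions, not details taken from the posting.

```python
# Minimal AWS Lambda handler for an API Gateway proxy integration (illustrative sketch).
import json


def lambda_handler(event, context):
    # The proxy integration delivers the HTTP body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")  # hypothetical request field

    # Return the structure API Gateway expects: statusCode, headers, and a string body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```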


Key Responsibilities:

  • Develop and maintain scalable backend services using Python (Flask, FastAPI, or Django).
  • Design and deploy serverless applications using AWS Lambda and API Gateway.
  • Build and manage RESTful APIs and microservices.
  • Implement CI/CD pipelines for efficient and secure deployments.
  • Work with Docker to containerize applications and manage container lifecycles.
  • Develop and manage infrastructure on AWS (including EC2, IAM, S3, and other related services).
  • Design efficient database schemas and write optimized SQL queries for PostgreSQL.
  • Collaborate with DevOps, front-end developers, and product managers for end-to-end delivery.
  • Write unit, integration, and performance tests to ensure code reliability and robustness.
  • Monitor, troubleshoot, and optimize application performance in production environments.


Required Skills:

  • Strong proficiency in Python and Python-based web frameworks.
  • Experience with AWS services: Lambda, API Gateway, EC2, S3, CloudWatch.
  • Sound knowledge of microservices architecture and asynchronous programming.
  • Proficiency with PostgreSQL, including schema design and query optimization.
  • Hands-on experience with Docker and containerized deployments.
  • Understanding of CI/CD practices and tools like GitHub Actions, Jenkins, or CodePipeline.
  • Familiarity with API documentation tools (Swagger/OpenAPI).
  • Version control with Git.


VirtuBox Infotech Pvt Ltd
VirtuBox Infotech
Posted by VirtuBox Infotech
Noida
1 - 3 yrs
₹3L - ₹4L / yr
Systems Development Life Cycle (SDLC)
Selenium
JMeter
Appium
STLC
+5 more

Job Description: Software Testing



VirtuBox, the world's premier B2B Cloud-based SaaS solution, empowers businesses to forge unforgettable customer experiences that transcend screens and ignite brand loyalty. In short, VirtuBox is transforming customer journeys, one pixel at a time.


Job Summary :

We are seeking a proactive and detail-oriented Software Tester with 1–2 years of experience in manual and/or automation testing. The ideal candidate will work closely with developers and product teams to ensure high-quality software delivery by identifying bugs, writing test cases, and executing comprehensive test cycles.


Key Responsibilities :

  • Analyze software requirements and design test cases to ensure functionality and performance.
  • Identify, document, and track defects using bug-tracking tools.
  • Collaborate with developers and stakeholders to resolve issues and improve software quality.
  • Perform functional, regression, system, and performance testing.
  • Execute automated testing using tools like Selenium, JMeter, or Appium (a minimal example follows this list).
  • Participate in agile development processes, including stand-up meetings and sprint planning.
  • Prepare detailed test reports and documentation for stakeholders.
  • Conduct security and usability testing to ensure compliance with industry standards.
  • Manage test data to create realistic testing scenarios.
  • Validate bug fixes and ensure all functionalities work correctly before release.
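As referenced in the responsibilities above, here is a minimal pytest-driven Selenium test as a sketch; the URL and expected title are placeholders for whatever application is actually under test.

```python
# A minimal pytest + Selenium smoke test (placeholder URL and title).
import pytest
from selenium import webdriver


@pytest.fixture
def driver():
    # Recent Selenium releases resolve the browser driver automatically (Selenium Manager).
    drv = webdriver.Chrome()
    yield drv
    drv.quit()


def test_homepage_title(driver):
    driver.get("https://example.com")   # placeholder URL of the application under test
    assert "Example" in driver.title    # basic smoke assertion on the page title
```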



Skills Required:


  1. Soft Skills 
  • Strong analytical and problem-solving skills.
  • Good communication and teamwork abilities.
  • Attention to detail and ability to work under deadlines.



  2. Technical Skills
  • Knowledge of manual testing and automated testing tools (Selenium, JMeter, Appium, etc.).
  • Understanding of SDLC (Software Development Life Cycle) and STLC (Software Testing Life Cycle).
  • Familiarity with defect tracking tools (JIRA, Bugzilla, etc.).
  • Basic programming knowledge (Python, Java, SQL) is a plus.


Eligibility Criteria :

  • Bachelor’s degree in Computer Science, IT, or related field.
  • 1–2 years of hands-on experience in software testing.
  • Excellent analytical and communication skills.
  • ISTQB certification is desirable but not mandatory.
  • Basic knowledge of any scripting or programming language is a plus.
  • Strong problem-solving and analytical skills.


Masters India Private Limited

at Masters India Private Limited

3 candid answers
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Noida
7yrs+
Upto ₹45L / yr (varies)
Python
Django
FastAPI
PostgreSQL
MongoDB
+9 more

We are looking for a customer-obsessed, analytical Sr. Staff Engineer to lead the development and growth of our Tax Compliance product suite. In this role, you’ll shape innovative digital solutions that simplify and automate tax filing, reconciliation, and compliance workflows for businesses of all sizes. You will join a fast-growing company where you’ll work in a dynamic and competitive market, impacting how businesses meet their statutory obligations with speed, accuracy, and confidence.


As the Sr. Staff Engineer, you’ll work closely with product, DevOps, and data teams to architect reliable systems, drive engineering excellence, and ensure high availability across our platform. We’re looking for a technical leader who’s not just an expert in building scalable systems, but also passionate about mentoring engineers and shaping the future of fintech.


Responsibilities

  • Lead, mentor, and inspire a high-performing engineering team (or operate as a hands-on technical lead).
  • Drive the design and development of scalable backend services using Python.
  • Experience in Django, FastAPI, Task Orchestration Systems.
  • Own and evolve our CI/CD pipelines with Jenkins, ensuring fast, safe, and reliable deployments.
  • Architect and manage infrastructure using AWS and Terraform with a DevOps-first mindset.
  • Collaborate cross-functionally with product managers, designers, and compliance experts to deliver features that make tax compliance seamless for our users.
  • Set and enforce engineering best practices, code quality standards, and operational excellence.
  • Stay up-to-date with industry trends and advocate for continuous improvement in engineering processes.
  • Experience in fintech, tax, or compliance industries.
  • Familiarity with containerization tools like Docker and orchestration with Kubernetes.
  • Background in security, observability, or compliance automation.

Requirements

  • 7+ years of software engineering experience, with at least 2 years in a leadership or principal-level role.
  • Deep expertise in Python, including API development, performance optimization, and testing.
  • Experience in Event-driven architecture, Kafka/RabbitMQ-like systems.
  • Strong experience with AWS services (e.g., ECS, Lambda, S3, RDS, CloudWatch).
  • Solid understanding of Terraform for infrastructure as code.
  • Proficiency with Jenkins or similar CI/CD tooling.
  • Comfortable balancing technical leadership with hands-on coding and problem-solving.
  • Strong communication skills and a collaborative mindset.
Lalitech

at Lalitech

1 recruiter
Govind Varshney
Posted by Govind Varshney
Remote, Bengaluru (Bangalore), Noida
5 - 10 yrs
₹7L - ₹20L / yr
Artificial Intelligence (AI)
Generative AI
Python
NodeJS (Node.js)
Vector database
+7 more

Location: Hybrid/ Remote

Type: Contract / Full‑Time

Experience: 5+ Years

Qualification: Bachelor’s or Master’s in Computer Science or a related technical field


Responsibilities:

  • Architect & implement the RAG pipeline: embeddings ingestion, vector search (MongoDB Atlas or similar), and context-aware chat generation (a simplified retrieval sketch follows this list).
  • Design and build Python‑based services (FastAPI) for generating and updating embeddings.
  • Host and apply LoRA/QLoRA adapters for per‑user fine‑tuning.
  • Automate data pipelines to ingest daily user logs, chunk text, and upsert embeddings into the vector store.
  • Develop Node.js/Express APIs that orchestrate embedding, retrieval, and LLM inference for real‑time chat.
  • Manage vector index lifecycle and similarity metrics (cosine/dot‑product).
  • Deploy and optimize on AWS (Lambda, EC2, SageMaker), containerization (Docker), and monitoring for latency, costs, and error rates.
  • Collaborate with frontend engineers to define API contracts and demo endpoints.
  • Document architecture diagrams, API specifications, and runbooks for future team onboarding.
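As noted in the first responsibility above, the retrieval step of a RAG pipeline amounts to ranking stored chunks by similarity to a query embedding. The sketch below shows that step in plain NumPy; the embed function is a stand-in for a real embedding model, and the in-memory list stands in for a vector store such as MongoDB Atlas Vector Search.

```python
# Simplified RAG retrieval: rank stored chunks by cosine similarity to a query embedding.
import numpy as np


def embed(text: str) -> np.ndarray:
    # Placeholder embedding: a deterministic pseudo-random vector per text.
    # A real pipeline would call an embedding model and persist vectors in a vector store.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(64)


def top_k(query: str, chunks: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    scored = []
    for chunk in chunks:
        v = embed(chunk)
        score = float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v)))  # cosine similarity
        scored.append((score, chunk))
    return [c for _, c in sorted(scored, reverse=True)[:k]]


docs = ["user liked hiking last week", "billing issue resolved", "asked about API limits"]
context = top_k("what outdoor activities does the user enjoy?", docs)
# `context` would then be stitched into the prompt for context-aware chat generation.
print(context)
```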


Required Skills

  • Strong Python expertise (FastAPI, async programming).
  • Proficiency with Node.js and Express for API development.
  • Experience with vector databases (MongoDB Atlas Vector Search, Pinecone, Weaviate) and similarity search.
  • Familiarity with OpenAI’s APIs (embeddings, chat completions).
  • Hands‑on with parameter‑efficient fine‑tuning (LoRA, QLoRA, PEFT/Hugging Face).
  • Knowledge of LLM hosting best practices on AWS (EC2, Lambda, SageMaker).

  • Containerization skills (Docker).
  • Good understanding of RAG architectures, prompt design, and memory management.
  • Strong Git workflow and collaborative development practices (GitHub, CI/CD).


Nice‑to‑Have:

  • Experience with Llama family models or other open‑source LLMs.
  • Familiarity with MongoDB Atlas free tier and cluster management.
  • Background in data engineering for streaming or batch processing.
  • Knowledge of monitoring & observability tools (Prometheus, Grafana, CloudWatch).
  • Frontend skills in React to prototype demo UIs.
VDart
Don Blessing
Posted by Don Blessing
Hyderabad, Bengaluru (Bangalore), Noida, Gurugram
5 - 15 yrs
₹10L - ₹15L / yr
Python
Amazon Web Services (AWS)
API

Job Description:


Title: Python AWS Developer with API


Tech Stack: AWS API Gateway, Lambda functionality, Oracle RDS, SQL & database management, OOP principles, JavaScript, Object-Relational Mapper (ORM), Git, Docker, Java dependency management, CI/CD, AWS Cloud & S3, Secrets Manager, Python, API frameworks; well-versed in front-end and back-end programming (Python).

 

Responsibilities:

  • Build high-performance APIs using AWS services and Python; write and debug Python code and integrate the application with third-party web services.
  • Troubleshoot and debug non-prod defects; handle back-end development and API work, with a main focus on coding and monitoring applications.
  • Design core application logic.
  • Support dependency teams in UAT testing and perform functional application testing, including Postman testing.

IT services firm

Agency job
via AccioJob by AccioJobHiring Board
Noida
0 - 1 yrs
₹3L - ₹3.5L / yr
Python
Deep Learning
Prompt engineering

AccioJob is conducting a Walk-In Hiring Drive with IT services firm for the position of AI Engineer.


To apply, register and select your slot here: https://go.acciojob.com/283eXn


Required Skills: Python, Machine Learning, Deep Learning, Prompt Engineering


Eligibility:

Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc

Branch: Electrical/Other electrical related branches, Computer Science/CSE/Other CS related branch, IT

Graduation Year: 2023, 2024, 2025


Work Details:

Work Location: Noida (Onsite)

CTC: 3 LPA to 3.5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Noida, Delhi & Greater Noida Centres

Further Rounds (for shortlisted candidates only):

  • Profile & Background Screening Round
  • Technical Interview Round 1
  • Technical Interview Round 2
  • HR Interview Round


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/283eXn

IT services firm

Agency job
via AccioJob by AccioJobHiring Board
Noida
0 - 1 yrs
₹3L - ₹3.5L / yr
Python
JavaScript
React.js

AccioJob is conducting a Walk-In Hiring Drive with IT services firm for the position of Full Stack Developer.


To apply, register and select your slot here: https://go.acciojob.com/qhtfYQ


Required Skills: Python, JavaScript, React.js


Eligibility:

  • Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
  • Branch: Electrical/Other electrical related branches, Computer Science/CSE/Other CS related branch, IT
  • Graduation Year: 2023, 2024, 2025


Work Details:

  • Work Location: Noida (Onsite)
  • CTC: 3 LPA to 3.5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Noida, Delhi & Greater Noida Centres

Further Rounds (for shortlisted candidates only):

  • Profile & Background Screening Round
  • Technical Interview Round 1
  • Technical Interview Round 2


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/qhtfYQ

Webkul Software Pvt Ltd
Noida
0 - 2.5 yrs
₹3L - ₹10L / yr
Python
Selenium
pytest
Docker
GitLab
+5 more

Key Responsibilities

  • Design, develop, and maintain automated test scripts using Python, pytest, and Selenium for Salesforce and web applications.
  • Create and manage test environments using Docker to ensure consistent testing conditions.
  • Collaborate with developers, business analysts, and stakeholders to understand requirements and define test scenarios.
  • Execute automated and manual tests, analyze results, and report defects using GitLab or other tracking tools.
  • Perform regression, functional, and integration testing for Salesforce applications and customizations.
  • Ensure test coverage for Salesforce features, including custom objects, workflows, and Apex code.
  • Contribute to continuous integration/continuous deployment (CI/CD) pipelines in GitLab for automated testing.
  • Document test cases, processes, and results to maintain a comprehensive testing repository.
  • Stay updated on Salesforce updates, testing tools, and industry best practices.

Required Qualifications

  • 1-3 years of experience in automation testing, preferably with exposure to Salesforce applications.
  • Proficiency in Python, pytest, Selenium, Docker, and GitLab for test automation and version control.
  • Understanding of software testing methodologies, including functional, regression, and integration testing.
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Strong problem-solving skills and attention to detail.
  • Excellent verbal and written communication skills.
  • Ability to work in a collaborative, fast-paced team environment.

Preferred Qualifications

  • Experience with Salesforce platform testing, including Sales Cloud, Service Cloud, or Marketing Cloud.
  • Active Salesforce Trailhead profile with demonstrated learning progress (please include Trailhead profile link in application).
  • Salesforce certifications (e.g., Salesforce Administrator or Platform Developer) are a plus.
  • Familiarity with testing Apex code, Lightning components, or Salesforce integrations.
  • Experience with Agile/Scrum methodologies.
  • Knowledge of Webkul’s product ecosystem or e-commerce platforms is an advantage.


Masters India Private Limited

at Masters India Private Limited

3 candid answers
2 recruiters
Neha Vidhyarthi
Posted by Neha Vidhyarthi
Noida
7 - 15 yrs
₹20L - ₹45L / yr
Python
Django
Celery
FastAPI
Amazon Web Services (AWS)
+1 more

Sr. Staff Engineer Role


We are looking for a customer-obsessed, analytical Sr. Staff Engineer to lead the development and growth of our Tax Compliance product suite. In this role, you’ll shape innovative digital solutions that simplify and automate tax filing, reconciliation, and compliance workflows for businesses of all sizes. You will join a fast-growing company where you’ll work in a dynamic and competitive market, impacting how businesses meet their statutory obligations with speed, accuracy, and confidence.

As the Sr. Staff Engineer, you’ll work closely with product, DevOps, and data teams to architect reliable systems, drive engineering excellence, and ensure high availability across our platform. We’re looking for a technical leader who’s not just an expert in building scalable systems, but also passionate about mentoring engineers and shaping the future of fintech.

Responsibilities

● Lead, mentor, and inspire a high-performing engineering team (or operate as a hands-on technical lead).
● Drive the design and development of scalable backend services using Python/Node.js.
● Experience in Django, FastAPI, Task Orchestration Systems.
● Own and evolve our CI/CD pipelines with Jenkins, ensuring fast, safe, and reliable deployments.
● Architect and manage infrastructure using AWS and Terraform with a DevOps-first mindset.
● Collaborate cross-functionally with product managers, designers, and compliance experts to deliver features that make tax compliance seamless for our users.
● Set and enforce engineering best practices, code quality standards, and operational excellence.
● Stay up-to-date with industry trends and advocate for continuous improvement in engineering processes.
● Experience in fintech, tax, or compliance industries.
● Familiarity with containerization tools like Docker and orchestration with Kubernetes.
● Background in security, observability, or compliance automation.

Requirements

● 8+ years of software engineering experience, with at least 2 years in a leadership or principal-level role.
● Deep expertise in Python/Node.js, including API development, performance optimization, and testing.
● Experience in event-driven architecture and Kafka/RabbitMQ-like systems.
● Strong experience with AWS services (e.g., ECS, Lambda, S3, RDS, CloudWatch).
● Solid understanding of Terraform for infrastructure as code.
● Proficiency with Jenkins or similar CI/CD tooling.
● Comfortable balancing technical leadership with hands-on coding and problem-solving.
● Strong communication skills and a collaborative mindset.

Noida
5 - 10 yrs
₹30L - ₹40L / yr
Graph Databases
Neo4J
Python

About Us


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and enable better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.


Position Overview:


Seeking an experienced Data Engineer to design, develop, and productionize graph database solutions using Neo4j for economic data analysis and modeling. This role requires expertise in graph database architecture, data pipeline development, and production system deployment.
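For context, the sketch below shows the flavour of this work using the official Neo4j Python driver: merging a small supplier relationship into the graph and reading it back with Cypher. The connection details and the Company/SUPPLIES data model are assumptions for illustration only.

```python
# Minimal Neo4j example: write a tiny economic relationship and query it with Cypher.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"        # placeholder connection details
AUTH = ("neo4j", "password")


def load_and_query() -> None:
    driver = GraphDatabase.driver(URI, auth=AUTH)
    with driver.session() as session:
        # Model a simple supplier relationship between two companies.
        session.run(
            "MERGE (a:Company {name: $a}) "
            "MERGE (b:Company {name: $b}) "
            "MERGE (a)-[:SUPPLIES {volume: $volume}]->(b)",
            a="Acme Metals", b="Widget Corp", volume=1200,
        )
        # Who supplies Widget Corp, and at what volume?
        result = session.run(
            "MATCH (s:Company)-[r:SUPPLIES]->(c:Company {name: $name}) "
            "RETURN s.name AS supplier, r.volume AS volume",
            name="Widget Corp",
        )
        for record in result:
            print(record["supplier"], record["volume"])
    driver.close()


if __name__ == "__main__":
    load_and_query()
```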


Key Responsibilities


Graph Database Development

- Design and implement Neo4j graph database schemas for complex economic datasets

- Develop efficient graph data models representing economic relationships, transactions, and market dynamics

- Create and optimize Cypher queries for complex analytical workloads

- Build graph-based data pipelines for real-time and batch processing


Data Engineering & Pipeline Development

- Architect scalable data ingestion frameworks for structured and unstructured economic data

- Develop ETL/ELT processes to transform relational and time-series data into graph formats

- Implement data validation, quality checks, and monitoring systems

- Build APIs and services for graph data access and manipulation


Production Systems & Operations

- Deploy and maintain Neo4j clusters in production environments

- Implement backup, disaster recovery, and high availability solutions

- Monitor database performance, optimize queries, and manage capacity planning

- Establish CI/CD pipelines for graph database deployments


Economic Data Specialization

- Model financial market relationships, economic indicators, and trading networks

- Create graph representations of supply chains, market structures, and economic flows

- Develop graph analytics for fraud detection, risk assessment, and market analysis

- Collaborate with economists and analysts to translate business requirements into graph solutions


Required Qualifications


Technical Skills:

- **Neo4j Expertise**: 3+ years hands-on experience with Neo4j database development

- **Graph Modeling**: Strong understanding of graph theory and data modeling principles

- **Cypher Query Language**: Advanced proficiency in writing complex Cypher queries

- **Programming**: Python, Java, or Scala for data processing and application development

- **Data Pipeline Tools**: Experience with Apache Kafka, Apache Spark, or similar frameworks

- **Cloud Platforms**: AWS, GCP, or Azure with containerization (Docker, Kubernetes)


Database & Infrastructure

- Experience with graph database administration and performance tuning

- Knowledge of distributed systems and database clustering

- Understanding of data warehousing concepts and dimensional modeling

- Familiarity with other databases (PostgreSQL, MongoDB, Elasticsearch)


Economic Data Experience

- Experience working with financial datasets, market data, or economic indicators

- Understanding of financial data structures and regulatory requirements

- Knowledge of data governance and compliance in financial services


Preferred Qualifications

- **Neo4j Certification**: Neo4j Certified Professional or Graph Data Science certification

- **Advanced Degree**: Master's in Computer Science, Economics, or related field

- **Industry Experience**: 5+ years in financial services, fintech, or economic research

- **Additional Skills**: Machine learning on graphs, network analysis, time-series analysis


Technical Environment

- Neo4j Enterprise Edition with APOC procedures

- Apache Kafka for streaming data ingestion

- Apache Spark for large-scale data processing

- Docker and Kubernetes for containerized deployments

- Git, Jenkins/GitLab CI for version control and deployment

- Monitoring tools: Prometheus, Grafana, ELK stack



Application Requirements

- Portfolio demonstrating Neo4j graph database projects

- Examples of production graph systems you've built

- Experience with economic or financial data modeling preferred


Qurilo Solutions Pvt Ltd
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 2 yrs
₹3L - ₹3.6L / yr
NodeJS (Node.js)
Express
MongoDB
SQL
PostgreSQL
+4 more


We are hiring a skilled Backend Developer to design and manage server-side applications, APIs, and database systems.

Key Responsibilities:

  • Develop and manage APIs with Node.js and Express.js.
  • Work with MongoDB and Mongoose for database management.
  • Implement secure authentication using JWT.
  • Optimize backend systems for performance and scalability.
  • Deploy backend services on VPS and manage servers.
  • Collaborate with frontend teams and use Git/GitHub for version control.

Required Skills:

  • Node.js, Express.js
  • MongoDB, Mongoose
  • REST API, JWT
  • Git, GitHub, VPS hosting

Qualifications:

  • Bachelor’s degree in Computer Science or related field.
  • Strong portfolio or GitHub profile preferred.


Deqode

at Deqode

1 recruiter
Sneha Jain
Posted by Sneha Jain
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
3.5 - 9 yrs
₹3L - ₹13L / yr
Python
Amazon Web Services (AWS)
AWS Lambda
Django
Amazon S3

Job Summary:

We are looking for a skilled and motivated Python AWS Engineer to join our team. The ideal candidate will have strong experience in backend development using Python, cloud infrastructure on AWS, and building serverless or microservices-based architectures. You will work closely with cross-functional teams to design, develop, deploy, and maintain scalable and secure applications in the cloud.

Key Responsibilities:

  • Develop and maintain backend applications using Python and frameworks like Django or Flask
  • Design and implement serverless solutions using AWS Lambda, API Gateway, and other AWS services
  • Develop data processing pipelines using services such as AWS Glue, Step Functions, S3, DynamoDB, and RDS
  • Write clean, efficient, and testable code following best practices
  • Implement CI/CD pipelines using tools like CodePipeline, GitHub Actions, or Jenkins
  • Monitor and optimize system performance and troubleshoot production issues
  • Collaborate with DevOps and front-end teams to integrate APIs and cloud-native services
  • Maintain and improve application security and compliance with industry standards

Required Skills:

  • Strong programming skills in Python
  • Solid understanding of AWS cloud services (Lambda, S3, EC2, DynamoDB, RDS, IAM, API Gateway, CloudWatch, etc.)
  • Experience with infrastructure as code (e.g., CloudFormation, Terraform, or AWS CDK)
  • Good understanding of RESTful API design and microservices architecture
  • Hands-on experience with CI/CD, Git, and version control systems
  • Familiarity with containerization (Docker, ECS, or EKS) is a plus
  • Strong problem-solving and communication skills

Preferred Qualifications:

  • Experience with PySpark, Pandas, or data engineering tools
  • Working knowledge of Django, Flask, or other Python frameworks
  • AWS Certification (e.g., AWS Certified Developer – Associate) is a plus

Educational Qualification:

  • Bachelor's or Master’s degree in Computer Science, Engineering, or related field


Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹10L - ₹25L / yr
Microsoft Windows Azure
Data engineering
Python
Apache Kafka

Role Overview:

We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.

The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.
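To make the streaming side concrete, here is a minimal consumer loop using the kafka-python client; the topic name, broker address, and event fields are illustrative assumptions rather than details from the role.

```python
# Minimal Kafka consumer sketch (kafka-python client); names and schema are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "patient-events",                                   # hypothetical topic
    bootstrap_servers="localhost:9092",                 # hypothetical broker
    group_id="healthcare-pipeline",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream, records would be validated, de-identified where required,
    # and written on to Azure Databricks / lakehouse tables.
    print(event.get("event_type"), event.get("timestamp"))
```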

Key Responsibilities:

  • Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
  • Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
  • Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
  • Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
  • Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
  • Mentor junior engineers, perform code reviews, and promote engineering best practices.
  • Stay current with evolving technologies in cloud, big data, and healthcare data standards.
  • Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).

Required Skills & Qualifications:

  • 4+ years of hands-on experience in data engineering roles.
  • Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
  • Proficient in Python for data processing and automation.
  • Experience with Azure Databricks (or readiness to ramp up quickly).
  • Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
  • Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
  • Familiarity with containerization tools like Docker and orchestration using Kubernetes.
  • Exposure to CI/CD pipelines for data applications.
  • Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
  • Excellent problem-solving abilities and a proactive mindset.
  • Strong communication and interpersonal skills to work in cross-functional teams.


Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Noida, Bengaluru (Bangalore), Pune
6 - 9 yrs
₹10L - ₹18L / yr
Windows Azure
SQL Azure
SQL
Data Warehouse (DWH)
Data Analytics
+3 more

Hybrid work mode


(Azure) EDW experience working in loading star-schema data warehouses using framework architectures, including experience loading Type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.

Delhi, Noida
3 - 5 yrs
₹8L - ₹20L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Multi-Agent System
Full Stack Development
+17 more

We are building an advanced, AI-driven multi-agent software system designed to revolutionize task automation and code generation. This is a futuristic AI platform capable of:


✅ Real-time self-coding based on tasks  

✅ Autonomous multi-agent collaboration  

✅ AI-powered decision-making  

✅ Cross-platform compatibility (Desktop, Web, Mobile)  


We are hiring a highly skilled **AI Engineer & Full-Stack Developer** based in India, with a strong background in AI/ML, multi-agent architecture, and scalable, production-grade software development.


### Responsibilities:


- Build and maintain a multi-agent AI system (AutoGPT, BabyAGI, MetaGPT concepts)  

- Integrate large language models (GPT-4o, Claude, open-source LLMs)  

- Develop full-stack components (Backend: Python, FastAPI/Flask, Frontend: React/Next.js)  

- Work on real-time task execution pipelines  

- Build cross-platform apps using Electron or Flutter  

- Implement Redis, Vector databases, scalable APIs  

- Guide the architecture of autonomous, self-coding AI systems  


### Must-Have Skills:


- Python (advanced, AI applications)  

- AI/ML experience, including multi-agent orchestration  

- LLM integration knowledge  

- Full-stack development: React or Next.js  

- Redis, Vector Databases (e.g., Pinecone, FAISS)  

- Real-time applications (websockets, event-driven)  

- Cloud deployment (AWS, GCP)  


### Good to Have:


- Experience with code-generation AI models (Codex, GPT-4o coding abilities)  

- Microservices and secure system design  

- Knowledge of AI for workflow automation and productivity tools  


Join us to work on cutting-edge AI technology that builds the future of autonomous software.



Ceryneian Partners LLC
Mridu Srivastava
Posted by Mridu Srivastava
Remote, Noida
0 - 4 yrs
₹12L - ₹28L / yr
Svelte
C++
Erlang
Rust
Python
+2 more

About the Role

At Ceryneian, we’re building a next-generation, research-driven algorithmic trading platform aimed at democratizing access to hedge fund-grade financial analytics. Headquartered in California, Ceryneian is a fintech innovation company dedicated to empowering traders with sophisticated yet accessible tools for quantitative research, strategy development, and execution.

Our flagship platform is currently under development. As a Backend Engineer, you will play a foundational role in designing and building the core trading engine and research infrastructure from the ground up. Your work will focus on developing performance-critical components that power backtesting, real-time strategy execution, and seamless integration with brokers and data providers. You’ll be responsible for bridging core engine logic with Python-based strategy interfaces, supporting a modular system architecture for isolated and scalable strategy execution, and building robust abstractions for data handling and API interactions. This role is central to delivering the reliability, flexibility, and performance that our users will rely on in fast-moving financial markets.
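As a toy illustration of the engine-to-strategy bridge described above, the sketch below runs a Python strategy process that answers engine requests over ZeroMQ (REQ/REP). The port, message fields, and the buy-below-100 rule are purely illustrative assumptions.

```python
# Toy strategy server: the core engine sends a market tick, the strategy replies with an action.
import zmq


def run_strategy_server(endpoint: str = "tcp://*:5555") -> None:
    ctx = zmq.Context()
    sock = ctx.socket(zmq.REP)
    sock.bind(endpoint)
    while True:
        tick = sock.recv_json()                  # e.g. {"symbol": "ABC", "price": 101.3}
        # Trivial placeholder logic: buy below 100, otherwise hold.
        action = "BUY" if tick.get("price", 0) < 100 else "HOLD"
        sock.send_json({"symbol": tick.get("symbol"), "action": action})


if __name__ == "__main__":
    run_strategy_server()
```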

We are a remote-first team and are open to hiring exceptional candidates globally.

Core Tasks

·      Build and maintain the trading engine core for execution, backtesting, and event logging.

·      Develop isolated strategy execution runners to support multi-user, multi-strategy environments.

·      Implement abstraction layers for brokers and market data feeds to offer a unified API experience.

·      Bridge the core engine language with Python strategies using gRPC, ZeroMQ, or similar interop technologies.

·      Implement logic to parse and execute JSON-based strategy DSL from the strategy builder.

·      Design compute-optimized components for multi-asset workflows and scalable backtesting.

·      Capture real-time state, performance metrics, and slippage for both live and simulated runs.

·      Collaborate with infrastructure engineers to support high-availability deployments.

Top Technical Competencies

·      Proficiency in distributed systems, concurrency, and system design.

·      Strong backend/server-side development skills using C++, Rust, C#, Erlang, or Python.

·      Deep understanding of data structures and algorithms with a focus on low-latency performance.

·      Experience with event-driven and messaging-based architectures (e.g., ZeroMQ, Redis Streams).

·      Familiarity with Linux-based environments and system-level performance tuning.

 

Bonus Competencies

·      Understanding of financial markets, asset classes, and algorithmic trading strategies.

·      3–5 years of prior Backend experience.

·      Hands-on experience with backtesting frameworks or financial market simulators.

·      Experience with sandboxed execution environments or paper trading platforms.

·      Advanced knowledge of multithreading, memory optimization, or compiler construction.

·      Educational background from Tier-I or Tier-II institutions with strong computer science fundamentals, a passion for scalable system design, and a drive to build cutting-edge fintech infrastructure.

What We Offer

·      Opportunity to shape the backend architecture of a next-gen fintech startup.

·      A collaborative, technically driven culture.

·      Competitive compensation with performance-based bonuses.

·      Flexible working hours and a remote-friendly environment for candidates across the globe.

·      Exposure to financial modeling, trading infrastructure, and real-time applications.

·      Collaboration with a world-class team from Pomona, UCLA, Harvey Mudd, and Claremont McKenna.

Ideal Candidate

You’re a backend-first thinker who’s obsessed with reliability, latency, and architectural flexibility. You enjoy building scalable systems that transform complex strategy logic into high-performance, real-time trading actions. You think in microseconds, architect for fault tolerance, and build APIs designed for developer extensibility.

 


Hyderabad, Bengaluru (Bangalore), Mumbai, Delhi, Pune, Chennai
0 - 1 yrs
₹10L - ₹20L / yr
Python
Object Oriented Programming (OOPs)
JavaScript
Java
Data Structures
+1 more


About NxtWave


NxtWave is one of India’s fastest-growing ed-tech startups, reshaping the tech education landscape by bridging the gap between industry needs and student readiness. With prestigious recognitions such as Technology Pioneer 2024 by the World Economic Forum and Forbes India 30 Under 30, NxtWave’s impact continues to grow rapidly across India.

Our flagship on-campus initiative, NxtWave Institute of Advanced Technologies (NIAT), offers a cutting-edge 4-year Computer Science program designed to groom the next generation of tech leaders, located in Hyderabad’s global tech corridor.

Know more:

🌐 NxtWave | NIAT

About the Role

As a PhD-level Software Development Instructor, you will play a critical role in building India’s most advanced undergraduate tech education ecosystem. You’ll be mentoring bright young minds through a curriculum that fuses rigorous academic principles with real-world software engineering practices. This is a high-impact leadership role that combines teaching, mentorship, research alignment, and curriculum innovation.


Key Responsibilities

  • Deliver high-quality classroom instruction in programming, software engineering, and emerging technologies.
  • Integrate research-backed pedagogy and industry-relevant practices into classroom delivery.
  • Mentor students in academic, career, and project development goals.
  • Take ownership of curriculum planning, enhancement, and delivery aligned with academic and industry excellence.
  • Drive research-led content development, and contribute to innovation in teaching methodologies.
  • Support capstone projects, hackathons, and collaborative research opportunities with industry.
  • Foster a high-performance learning environment in classes of 70–100 students.
  • Collaborate with cross-functional teams for continuous student development and program quality.
  • Actively participate in faculty training, peer reviews, and academic audits.


Eligibility & Requirements

  • Ph.D. in Computer Science, IT, or a closely related field from a recognized university.
  • Strong academic and research orientation, preferably with publications or project contributions.
  • Prior experience in teaching/training/mentoring at the undergraduate/postgraduate level is preferred.
  • A deep commitment to education, student success, and continuous improvement.

Must-Have Skills

  • Expertise in Python, Java, JavaScript, and advanced programming paradigms.
  • Strong foundation in Data Structures, Algorithms, OOP, and Software Engineering principles.
  • Excellent communication, classroom delivery, and presentation skills.
  • Familiarity with academic content tools like Google Slides, Sheets, Docs.
  • Passion for educating, mentoring, and shaping future developers.

Good to Have

  • Industry experience or consulting background in software development or research-based roles.
  • Proficiency in version control systems (e.g., Git) and agile methodologies.
  • Understanding of AI/ML, Cloud Computing, DevOps, Web or Mobile Development.
  • A drive to innovate in teaching, curriculum design, and student engagement.

Why Join Us?

  • Be at the forefront of shaping India’s tech education revolution.
  • Work alongside IIT/IISc alumni, ex-Amazon engineers, and passionate educators.
  • Competitive compensation with strong growth potential.
  • Create impact at scale by mentoring hundreds of future-ready tech leaders.


B2B Automation Platform

Agency job
via AccioJob by AccioJobHiring Board
Noida
0 - 1 yrs
₹4L - ₹5L / yr
DSA
Python
Django
Flask

AccioJob is conducting an offline hiring drive with B2B Automation Platform for the position of SDE Trainee Python.


Link for registration: https://go.acciojob.com/6kT7Ea


Position: SDE Trainee Python – DSA, Python, Django/Flask


Eligibility Criteria:

  • Degree: B.Tech / BE / MCA
  • Branch: CS / IT
  • Work Location: Noida

Compensation:

  • CTC: ₹4 - ₹5 LPA
  • Service Agreement: 2-year commitment

Note:

Candidates must be available for face-to-face interviews in Noida and should be ready to join immediately.


Evaluation Process:

Round 1: Assessment at AccioJob Noida Skill Centre

Further Rounds (for shortlisted candidates):

  • Technical Interview 1
  • Technical Interview 2
  • Tech + Managerial Round (Face-to-Face)

Important:

Please bring your laptop for the assessment.


Link for registration: https://go.acciojob.com/6kT7Ea

TalentRep

Agency job
via TalentRep by Vrinda Makhija
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
7 - 12 yrs
₹30L - ₹50L / yr
Microsoft Windows Azure
Amazon Web Services (AWS)
Golang
Python
JavaScript
+8 more

A fast-growing, tech-driven loyalty programs and benefits business is looking to hire a Technical Architect with expertise in the areas below.


Key Responsibilities:


1. Architectural Design & Governance

• Define, document, and maintain the technical architecture for projects and product modules.

• Ensure architectural decisions meet scalability, performance, and security requirements.


2. Solution Development & Technical Leadership

• Translate product and client requirements into robust technical solutions, balancing short-term deliverables with long-term product viability.

• Oversee system integrations, ensuring best practices in coding standards, security, and performance optimization.


3. Collaboration & Alignment

• Work closely with Product Managers and Project Managers to prioritize and plan feature development.

• Facilitate cross-team communication to ensure technical feasibility and timely execution of features or client deliverables.


4. Mentorship & Code Quality

• Provide guidance to senior developers and junior engineers through code reviews, design reviews, and technical coaching.

• Advocate for best-in-class engineering practices, encouraging the use of CI/CD, automated testing, and modern development tooling.


5. Risk Management & Innovation

• Proactively identify technical risks or bottlenecks, proposing mitigation strategies.

• Investigate and recommend new technologies, frameworks, or tools that enhance product capabilities and developer productivity.


6. Documentation & Standards

• Maintain architecture blueprints, design patterns, and relevant documentation to align the team on shared standards.

• Contribute to the continuous improvement of internal processes, ensuring streamlined development and deployment workflows.


Skills:


1. Technical Expertise

• 7–10 years of overall experience in software development with at least a couple of years in senior or lead roles.

• Strong proficiency in at least one mainstream programming language (e.g., Golang, Python, JavaScript).

• Hands-on experience with architectural patterns (microservices, monolithic systems, event-driven architectures).

• Good understanding of Cloud Platforms (AWS, Azure, or GCP) and DevOps practices (CI/CD pipelines, containerization with Docker/Kubernetes).

• Familiarity with relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB).


Location: Saket, Delhi (Work from Office)

Schedule: Monday – Friday

Experience : 7-10 Years

Compensation: As per industry standards

Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Anywhere in India, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

·        6-7 years of IT development experience with min 3+ years hands-on experience in Snowflake

·        Strong experience in building/designing the data warehouse or data lake, and data mart end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyper scalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake (a minimal connector sketch follows this list)

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience in Snowflake features, including new Snowflake features.

·        Should have good experience in Python/PySpark.

·        Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools like Airflow

·        Should have a good understanding of ETL or ELT processes and ETL tools.
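As referenced in the list above, a minimal sketch of working with Snowflake from Python via the official connector might look like the following; the account, credentials, and table are placeholders, not details from the role.

```python
# Minimal Snowflake connector sketch: create a table, load a row, and query it back.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",          # placeholder account identifier
    user="your_user",
    password="your_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INT, amount NUMBER(10,2), created_at TIMESTAMP)"
    )
    cur.execute("INSERT INTO orders VALUES (1, 99.50, CURRENT_TIMESTAMP)")
    cur.execute("SELECT COUNT(*), SUM(amount) FROM orders")
    print(cur.fetchone())
finally:
    conn.close()
```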

Tata Consultancy Services
Hyderabad, Bengaluru (Bangalore), Chennai, Pune, Noida, Gurugram, Mumbai, Kolkata
5 - 8 yrs
₹7L - ₹20L / yr
Snowflake
Python
SQL Azure
Data Warehouse (DWH)
skill iconAmazon Web Services (AWS)

·        5+ years of IT development experience with min 3+ years hands-on experience in Snowflake

·        Strong experience in building/designing the data warehouse or data lake, and data mart end-to-end implementation experience focusing on large enterprise scale and Snowflake implementations on any of the hyper scalers.

·        Strong experience with building productionized data ingestion and data pipelines in Snowflake

·        Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities

·        Should have good experience with Snowflake RBAC and data security.

·        Strong experience in Snowflake features, including new Snowflake features.

·        Should have good experience in Python/PySpark.

·        Should have experience in AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)

·        Should have experience/knowledge of orchestration and scheduling tools like Airflow

·        Should have a good understanding of ETL or ELT processes and ETL tools.

Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Bengaluru (Bangalore), Mumbai, Gurugram, Noida, Pune, Chennai, Nagpur, Indore, Ahmedabad, Kochi (Cochin), Delhi
3.5 - 8 yrs
₹4L - ₹15L / yr
Go Programming (Golang)
Amazon Web Services (AWS)
Python

Role Overview:


We are looking for a skilled Golang Developer with 3.5+ years of experience in building scalable backend services and deploying cloud-native applications using AWS. This is a key position that requires a deep understanding of Golang and cloud infrastructure to help us build robust solutions for global clients.


Key Responsibilities:

  • Design and develop backend services, APIs, and microservices using Golang.
  • Build and deploy cloud-native applications on AWS using services like Lambda, EC2, S3, RDS, and more.
  • Optimize application performance, scalability, and reliability.
  • Collaborate closely with frontend, DevOps, and product teams.
  • Write clean, maintainable code and participate in code reviews.
  • Implement best practices in security, performance, and cloud architecture.
  • Contribute to CI/CD pipelines and automated deployment processes.
  • Debug and resolve technical issues across the stack.


Required Skills & Qualifications:

  • 3.5+ years of hands-on experience with Golang development.
  • Strong experience with AWS services such as EC2, Lambda, S3, RDS, DynamoDB, CloudWatch, etc.
  • Proficient in developing and consuming RESTful APIs.
  • Familiar with Docker, Kubernetes or AWS ECS for container orchestration.
  • Experience with Infrastructure as Code (Terraform, CloudFormation) is a plus.
  • Good understanding of microservices architecture and distributed systems.
  • Experience with monitoring tools like Prometheus, Grafana, or ELK Stack.
  • Familiarity with Git, CI/CD pipelines, and agile workflows.
  • Strong problem-solving, debugging, and communication skills.


Nice to Have:

  • Experience with serverless applications and architecture (AWS Lambda, API Gateway, etc.)
  • Exposure to NoSQL databases like DynamoDB or MongoDB.
  • Contributions to open-source Golang projects or an active GitHub portfolio.


hirezyai
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹12L - ₹25L / yr
ArgoCD
Kubernetes
Docker
Helm
Terraform
+9 more

Job Summary:

We are seeking a skilled DevOps Engineer to design, implement, and manage CI/CD pipelines, containerized environments, and infrastructure automation. The ideal candidate should have hands-on experience with ArgoCD, Kubernetes, and Docker, along with a deep understanding of cloud platforms and deployment strategies.

Key Responsibilities:

  • CI/CD Implementation: Develop, maintain, and optimize CI/CD pipelines using ArgoCD, GitOps, and other automation tools.
  • Container Orchestration: Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
  • Infrastructure as Code (IaC): Automate infrastructure provisioning with Terraform, Helm, or Ansible.
  • Monitoring & Logging: Implement and maintain observability tools like Prometheus, Grafana, ELK, or Loki.
  • Security & Compliance: Ensure best security practices in containerized and cloud-native environments.
  • Cloud & Automation: Manage cloud infrastructure on AWS, Azure, or GCP with automated deployments.
  • Collaboration: Work closely with development teams to optimize deployments and performance.

Required Skills & Qualifications:

  • Experience: 5+ years in DevOps, Site Reliability Engineering (SRE), or Infrastructure Engineering.
  • Tools & Tech: Strong knowledge of ArgoCD, Kubernetes, Docker, Helm, Terraform, and CI/CD pipelines.
  • Cloud Platforms: Experience with AWS, GCP, or Azure.
  • Programming & Scripting: Proficiency in Python, Bash, or Go.
  • Version Control: Hands-on with Git and GitOps workflows.
  • Networking & Security: Knowledge of ingress controllers, service mesh (Istio/Linkerd), and container security best practices.

Nice to Have:

  • Experience with Kubernetes Operators, Kustomize, or FluxCD.
  • Exposure to serverless architectures and multi-cloud deployments.
  • Certifications in CKA, AWS DevOps, or similar.


Partner Company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹5L - ₹6L / yr
SQL
MS-Excel
PowerBI
Python

AccioJob is conducting an offline hiring drive in partnership with Our Partner Company to hire Junior Business/Data Analysts for an internship with a Pre-Placement Offer (PPO) opportunity.


Apply, Register and select your Slot here: https://go.acciojob.com/69d3Wd


Job Description:

  • Role: Junior Business/Data Analyst (Internship + PPO)
  • Work Location: Hyderabad
  • Internship Stipend: 15,000 - 25,000/month
  • Internship Duration: 3 months
  • CTC on PPO: 5 LPA - 6 LPA

Eligibility Criteria:

  • Degree: Open to all academic backgrounds
  • Graduation Year: 2023, 2024, 2025

Required Skills:

  • Proficiency in SQL, Excel, Power BI, and basic Python
  • Strong analytical mindset and interest in solving business problems with data

Hiring Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Pune, Noida)
  2. 1 Assignment + 2 Technical Interviews (Virtual; In-person for Hyderabad candidates)

Note: Please bring your laptop and earphones for the test.


Register Here: https://go.acciojob.com/69d3Wd

Gameberry

at Gameberry

5 recruiters
Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 1 yrs
₹10L - ₹15L / yr
DSA
Object Oriented Programming (OOPs)
Java
Python
Go Programming (Golang)

AccioJob is organizing an exclusive offline hiring drive in collaboration with GameBerry Labs for the role of Software Development Engineer 1 (SDE 1).


To Apply, Register and select your Slot here: https://go.acciojob.com/Zq2UnA


Job Description:

  • Role: SDE 1
  • Work Location: Bangalore
  • CTC: 10 LPA - 15 LPA

Eligibility Criteria:

  • Education: B.Tech, BE, BCA, MCA, M.Tech
  • Branches: Circuit Branches (CSE, ECE, IT, etc.)
  • Graduation Year:
  • 2024 (Minimum 9 months of experience)
  • 2025 (Minimum 3-6 months of experience)

Evaluation Process:

  1. Offline Assessment at AccioJob Skill Centres (Hyderabad, Bangalore, Pune, Noida)
  2. Technical Interviews (2 Rounds - Virtual for most; In-person for Bangalore candidates)

Note: Carry your laptop and earphones for the assessment.


Register Here: https://go.acciojob.com/Zq2UnA

Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Gurugram, Delhi, Noida, Ghaziabad, Faridabad
6 - 10 yrs
₹5L - ₹15L / yr
Google Cloud Platform (GCP)
Python
PySpark
.NET
Scala

🚀 Hiring: Data Engineer | GCP + Spark + Python + .NET | 6–10 Yrs | Gurugram (Hybrid)


We’re looking for a skilled Data Engineer with strong hands-on experience in GCP, Spark-Scala, Python, and .NET.
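For a flavour of the Spark work involved, here is a small PySpark aggregation sketch; the dataset and column names are invented for illustration, and a real pipeline would read from GCS/BigQuery tables rather than an in-memory list.

```python
# Small PySpark sketch: aggregate revenue per country from an in-memory DataFrame.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

orders = spark.createDataFrame(
    [("IN", 120.0), ("IN", 80.0), ("US", 200.0)],
    ["country", "amount"],
)

revenue = orders.groupBy("country").agg(F.sum("amount").alias("total_amount"))
revenue.show()

spark.stop()
```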


📍 Location: Suncity, Sector 54, Gurugram (Hybrid – 3 days onsite)

💼 Experience: 6–10 Years

⏱️ Notice Period: Immediate Joiner


Required Skills:

  • 5+ years of experience in distributed computing (Spark) and software development.
  • 3+ years of experience in Spark-Scala
  • 5+ years of experience in Data Engineering.
  • 5+ years of experience in Python.
  • Fluency in working with databases (preferably Postgres).
  • Have a sound understanding of object-oriented programming and development principles.
  • Experience working in an Agile Scrum or Kanban development environment.
  • Experience working with version control software (preferably Git).
  • Experience with CI/CD pipelines.
  • Experience with automated testing, including integration/delta, load, and performance testing.
Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 9 yrs
₹40L - ₹60L / yr
Python
SQL
Data engineering
Snowflake
ETL
+5 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
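As a simple illustration of the pipeline work described here, the sketch below extracts rows from a CSV, applies basic cleansing, and loads the result into a SQL table (SQLite standing in for a warehouse). The file, column, and table names are hypothetical.

```python
# Deliberately small ETL sketch: extract from CSV, transform, load to a SQL table.
import pandas as pd
from sqlalchemy import create_engine


def run_pipeline(source_csv: str = "raw_orders.csv") -> None:
    # Extract
    df = pd.read_csv(source_csv)

    # Transform: basic cleansing/standardization (hypothetical columns).
    df = df.dropna(subset=["order_id"])
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["amount"] = df["amount"].round(2)

    # Load: write to a target table; SQLite stands in for the warehouse here.
    engine = create_engine("sqlite:///warehouse.db")
    df.to_sql("orders_clean", engine, if_exists="replace", index=False)


if __name__ == "__main__":
    run_pipeline()
```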

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Able to handle team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.

Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


Top tier global IT consulting company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Computer Networking
Linux administration
Python
Bash
Object Oriented Programming (OOPs)
+2 more

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Infrastructure Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/kcYTAp


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Linux, Networking, One scripting language among Python, Bash, and PowerShell, OOPs, Cloud Platforms (AWS, Azure)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE Core With Cloud Certification
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/kcYTAp

Top tier global IT consulting company

Agency job
via AccioJob by AccioJobHiring Board
Hyderabad, Pune, Noida
0 - 0 yrs
₹11L - ₹11L / yr
Python
MySQL
Big Data

AccioJob is conducting a Walk-In Hiring Drive with a reputed global IT consulting company at AccioJob Skill Centres for the position of Data Engineer, specifically for female candidates.


To Apply, Register and select your Slot here: https://go.acciojob.com/8p9ZXN


We will not consider your application if you do not register and select a slot via the above link.


Required Skills: Python, Database (MySQL), Big Data (Spark, Kafka)


Eligibility:


  • Degree: B.Tech/BE
  • Branch: CSE – AI & DS / AI & ML
  • Graduation Year: 2024 & 2025


Note: Only Female Candidates can apply for this job opportunity


Work Details:


  • Work Mode: Work From Office
  • Work Location: Bangalore & Coimbatore
  • CTC: 11.1 LPA


Evaluation Process:


  • Round 1: Offline Assessment at AccioJob Skill Centre in Noida, Pune, Hyderabad.


  • Further Rounds (for Shortlisted Candidates only)

 

  1. HackerRank Online Assessment
  2. Coding Pairing Interview
  3. Technical Interview
  4. Cultural Alignment Interview


Important Note: Please bring your laptop and earphones for the test.


Register here: https://go.acciojob.com/8p9ZXN

Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 8 yrs
₹25L - ₹40L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
ETL
+6 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Ability to lead and manage the team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.

 

 Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, PAN India
5 - 10 yrs
₹10L - ₹25L / yr
Test Automation
Selenium
Java
Python
JavaScript

Test Automation Engineer Job Description

A Test Automation Engineer is responsible for designing, developing, and implementing automated testing solutions to ensure the quality and reliability of software applications. Here's a breakdown of the job:


Key Responsibilities

- Test Automation Framework: Design and develop test automation frameworks using tools like Selenium, Appium, or Cucumber.

- Automated Test Scripts: Create and maintain automated test scripts to validate software functionality, performance, and security.

- Test Data Management: Develop and manage test data, including data generation, masking, and provisioning.

- Test Environment: Set up and maintain test environments, including configuration and troubleshooting.

- Collaboration: Work with cross-functional teams, including development, QA, and DevOps to ensure seamless integration of automated testing.


Essential Skills

- Programming Languages: Proficiency in programming languages like Java, Python, or C#.

- Test Automation Tools: Experience with test automation tools like Selenium, Appium, or Cucumber (see the Selenium sketch below).

- Testing Frameworks: Knowledge of testing frameworks like TestNG, JUnit, or PyUnit.

- Agile Methodologies: Familiarity with Agile development methodologies and CI/CD pipelines.
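
As a small, hedged illustration of the kind of automation described above, here is a minimal Selenium-with-Python sketch; the URL, locators, and assertion are hypothetical examples, not taken from any specific project.

# Minimal Selenium (Python) UI check; URL and locators are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()  # assumes a local chromedriver is available
try:
    driver.get("https://example.com/login")

    # Fill in the login form.
    driver.find_element(By.ID, "username").send_keys("demo_user")
    driver.find_element(By.ID, "password").send_keys("demo_pass")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Wait for the dashboard header and assert on its text.
    header = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert "Dashboard" in header.text, "Login did not land on the dashboard"
finally:
    driver.quit()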

MNC in B2B Insurance Domain

Agency job
via Bean HR Consulting by Sachin Bhandari
Noida, Gurugram
13 - 20 yrs
₹30L - ₹40L / yr
Python
Architecture
Django
Flask

Job Description:

Position: Python Technical Architect

 

Major Responsibilities:

 

●           Develop and customize solutions, including workflows, Workviews, and application integrations.

●           Integrate with other enterprise applications and systems.

●           Perform system upgrades and migrations to ensure optimal performance.

●           Troubleshoot and resolve issues related to applications and workflows using Diagnostic console.

●           Ensure data integrity and security within the system.

●           Maintain documentation for system configurations, workflows, and processes.

●           Stay updated on best practices, new features and industry trends.

●           Hands-on in Waterfall & Agile Scrum methodology.

●           Working on software issues and specifications and performing Design/Code Review(s).

●           Engaging in the assignment of work to the development team resources, ensuring effective transition of knowledge, design assumptions and development expectations.

●           Ability to mentor developers and lead cross-functional technical teams.

●           Collaborate with stakeholders to gather requirements and translate them into technical specifications for effective workflow/Workview design.

●           Assist in the training of end-users and provide support as needed

●           Contributing to the organizational values by actively working with agile development teams, methodologies, and toolsets.

●           Driving concise, structured, and effective communication with peers and clients.

 

Key Capabilities and Competencies Knowledge

 

●           Proven experience as a Software Architect or Technical Project Manager with architectural responsibilities.

●           Strong proficiency in Python and relevant frameworks (Django, Flask, FastAPI).

●           Strong understanding of software development lifecycle (SDLC), agile methodologies (Scrum, Kanban) and DevOps practices.

●           Expertise in Azure cloud ecosystem and architecture design patterns.

●           Familiarity with Azure DevOps, CI/CD pipelines, monitoring and logging.

●           Experience with RESTful APIs, microservices architecture, and asynchronous processing (see the asyncio sketch below).

●           Deep understanding of insurance domain processes such as claims management, policy administration etc.

●           Experience in database design and data modelling with SQL(MySQL) and NoSQL(Azure Cosmos DB).

●           Knowledge of security best practices including data encryption, API security and compliance standards.

●           Knowledge of SAST and DAST security tools is a plus.

●           Strong documentation skill for articulating architecture decisions and technical concepts to stakeholders.

●           Experience with system integration using middleware or web services.

●           Server load balancing, plus planning, configuration, maintenance, and administration of server systems.

●           Experience with developing reusable assets such as prototypes, solution designs, documentation and other materials that contribute to department efficiency.

●           Highly cognizant of the DevOps approach like ensuring basic security measures.

●           Technical writing skills, strong networking, and communication style with the ability to formulate professional emails, presentations, and documents.

●           Passion for technology trends in the insurance industry and emerging technology space.
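
To ground the asynchronous-processing point above, here is a minimal asyncio sketch using httpx; the endpoints are placeholders and this is only an illustrative pattern, not part of the role definition.

# Minimal async fan-out sketch with asyncio + httpx; URLs are placeholders.
import asyncio
import httpx

async def fetch_json(client: httpx.AsyncClient, url: str) -> dict:
    # Each call awaits the network without blocking the event loop.
    response = await client.get(url, timeout=10.0)
    response.raise_for_status()
    return response.json()

async def main() -> None:
    urls = [
        "https://api.example.com/policies/123",
        "https://api.example.com/claims/123",
    ]
    async with httpx.AsyncClient() as client:
        # Issue both requests concurrently and gather the results.
        policy, claims = await asyncio.gather(*(fetch_json(client, u) for u in urls))
    print(policy, claims)

if __name__ == "__main__":
    asyncio.run(main())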

 

 

Qualification and Experience

 

●           Recognized with a Bachelor’s degree in Computer Science, Information Technology, or equivalent.

●           Work experience: 10-12 years overall.

●           Recognizable domain knowledge and awareness of basic insurance and regulatory frameworks.

●           Previous experience working in the insurance industry (AINS Certification is a plus).

NeoGenCode Technologies Pvt Ltd
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 6 yrs
₹2L - ₹6L / yr
Node.js
Python
Django
OAuth
RESTful APIs
+2 more

Job Title : Backend Developer (Node.js or Python/Django)

Experience : 2 to 5 Years

Location : Connaught Place, Delhi (Work From Office)


Job Summary :

We are looking for a skilled and motivated Backend Developer (Node.js or Python/Django) to join our in-house engineering team.


Key Responsibilities :

  • Design, develop, test, and maintain robust backend systems using Node.js or Python/Django.
  • Build and integrate RESTful APIs including third-party Authentication APIs such as OAuth and JWT (see the JWT sketch after this list).
  • Work with data stores like Redis and Elasticsearch to support caching and search features.
  • Collaborate with frontend developers, product managers, and QA teams to deliver complete solutions.
  • Ensure code quality, maintainability, and performance optimization.
  • Write clean, scalable, and well-documented code.
  • Participate in code reviews and contribute to team best practices.
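
As a hedged illustration of the JWT flow mentioned above, here is a minimal sketch using the PyJWT library; the secret, claims, and expiry window are placeholder assumptions.

# Minimal JWT issue/verify sketch with PyJWT; secret and claims are placeholders.
import datetime
import jwt

SECRET_KEY = "replace-with-a-real-secret"

def issue_token(user_id: str) -> str:
    # Encode a short-lived access token for the given user.
    payload = {
        "sub": user_id,
        "iat": datetime.datetime.now(datetime.timezone.utc),
        "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=30),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens.
    return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])

if __name__ == "__main__":
    token = issue_token("user-42")
    print(verify_token(token)["sub"])  # -> user-42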

Required Skills :

  • 2 to 5 Years of hands-on experience in backend development.
  • Proficiency in Node.js and/or Python (Django framework).
  • Solid understanding and experience with Authentication APIs.
  • Experience with Redis and Elasticsearch for caching and full-text search.
  • Strong knowledge of REST API design and best practices.
  • Experience working with relational and/or NoSQL databases.
  • Must have completed at least 2 end-to-end backend projects.

Nice to Have :

  • Experience with Docker or containerized environments.
  • Familiarity with CI/CD pipelines and DevOps workflows.
  • Exposure to cloud platforms like AWS, GCP, or Azure.

Enalytix

at Enalytix

1 video
1 recruiter
Renu Pandey
Posted by Renu Pandey
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
2 - 6 yrs
₹5L - ₹8L / yr
Machine Learning (ML)
Computer Vision
Artificial Intelligence (AI)
OpenCV
pill
+1 more

🚀 We’re Hiring! | AI/ML Engineer – Computer Vision

📍 Location: Noida | 🕘 Full-Time


🔍 What We’re Looking For:

• 4+ years in AI/ML (Computer Vision)

• Python, OpenCV, TensorFlow, PyTorch, etc.

• Hands-on with object detection, face recognition, classification (see the OpenCV sketch below)

• Git, Docker, Linux experience

• Curious, driven, and ready to build impactful products
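
To make the computer-vision requirement above concrete, here is a minimal, hedged OpenCV sketch using the bundled Haar cascade; the image path is a placeholder, and real products would typically use stronger detectors.

# Minimal face-detection sketch with OpenCV's bundled Haar cascade.
# The input path is a placeholder; production systems would use stronger models.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("sample.jpg")          # placeholder image path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect faces; tune scaleFactor/minNeighbors for your data.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("sample_annotated.jpg", image)
print(f"Detected {len(faces)} face(s)")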

💡 Be part of a fast-growing team, build products used by brands like Biba, Zivame, Costa Coffee & more!

Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Pune, Gurugram, Noida, Bhopal, Bengaluru (Bangalore)
4 - 8 yrs
₹8L - ₹22L / yr
MLOps
Amazon Web Services (AWS)
AWS Sagemaker
Python

Role - MLops Engineer

Location - Pune, Gurgaon, Noida, Bhopal, Bangalore 

Mode - Hybrid


Role Overview

We are looking for an experienced MLOps Engineer to join our growing AI/ML team. You will be responsible for automating, monitoring, and managing machine learning workflows and infrastructure in production environments. This role is key to ensuring our AI solutions are scalable, reliable, and continuously improving.


Key Responsibilities

  • Design, build, and manage end-to-end ML pipelines, including model training, validation, deployment, and monitoring.
  • Collaborate with data scientists, software engineers, and DevOps teams to integrate ML models into production systems.
  • Develop and manage scalable infrastructure using AWS, particularly AWS Sagemaker (see the deployment sketch after this list).
  • Automate ML workflows using CI/CD best practices and tools.
  • Ensure model reproducibility, governance, and performance tracking.
  • Monitor deployed models for data drift, model decay, and performance metrics.
  • Implement robust versioning and model registry systems.
  • Apply security, performance, and compliance best practices across ML systems.
  • Contribute to documentation, knowledge sharing, and continuous improvement of our MLOps capabilities.
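
As a hedged sketch of the SageMaker deployment step above, here is a minimal example using the sagemaker Python SDK; the model artifact path, IAM role, entry-point script, and instance type are placeholder assumptions.

# Minimal SageMaker deployment sketch (sagemaker Python SDK).
# model_data, role, and instance type are placeholders, not real resources.
import sagemaker
from sagemaker.sklearn.model import SKLearnModel

session = sagemaker.Session()

model = SKLearnModel(
    model_data="s3://example-bucket/models/churn/model.tar.gz",  # packaged artifact
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",
    entry_point="inference.py",          # script defining model_fn / predict_fn
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Deploy a real-time endpoint and send a test payload.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
print(predictor.predict([[42.0, 1.0, 0.3]]))

# Tear down the endpoint when finished to avoid charges.
predictor.delete_endpoint()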


Required Skills & Qualifications

  • 4+ years of experience in Software Engineering or MLOps, preferably in a production environment.
  • Proven experience with AWS services, especially AWS Sagemaker for model development and deployment.
  • Working knowledge of AWS DataZone (preferred).
  • Strong programming skills in Python, with exposure to R, Scala, or Apache Spark.
  • Experience with ML model lifecycle management, version control, containerization (Docker), and orchestration tools (e.g., Kubernetes).
  • Familiarity with MLflow, Airflow, or similar pipeline/orchestration tools.
  • Experience integrating ML systems into CI/CD workflows using tools like Jenkins, GitHub Actions, or AWS CodePipeline.
  • Solid understanding of DevOps and cloud-native infrastructure practices.
  • Excellent problem-solving skills and the ability to work collaboratively across teams.


Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Hyderabad, Pune
4 - 10 yrs
₹10L - ₹24L / yr
Java
Artificial Intelligence (AI)
Automation
IDX
Spring Boot
+4 more

Job Title : Senior Backend Engineer – Java, AI & Automation

Experience : 4+ Years

Location : Any Cognizant location (India)

Work Mode : Hybrid

Interview Rounds :

  1. Virtual
  2. Face-to-Face (In-person)

Job Description :

Join our Backend Engineering team to design and maintain services on the Intuit Data Exchange (IDX) platform.

You'll work on scalable backend systems powering millions of daily transactions across Intuit products.


Key Qualifications :

  • 4+ years of backend development experience.
  • Strong in Java, Spring framework.
  • Experience with microservices, databases, and web applications.
  • Proficient in AWS and cloud-based systems.
  • Exposure to AI and automation tools (Workato preferred).
  • Python development experience.
  • Strong communication skills.
  • Comfortable with occasional US shift overlap.

QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹8L - ₹18L / yr
Python
React.js
HTML/CSS

Role: Python Full Stack Developer with React

Hybrid: 2 days in a week (Noida, Bangalore, Chennai, Hyderabad)

Experience: 5+ Years

Contract Duration: 6 Months

Notice Period: less than 15 days

Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Bhopal, Jaipur
4 - 6 yrs
₹4L - ₹20L / yr
Amazon Web Services (AWS)
Python
SageMaker
MLOps

Role - MLops Engineer

Required Experience - 4 Years

Location - Pune, Gurgaon, Noida, Bhopal, Bangalore 

Mode - Hybrid


Key Requirements:

  • 4+ years of experience in Software Engineering with MLOps focus
  • Strong expertise in AWS, particularly AWS SageMaker (required)
  • AWS Data Zone experience (preferred)
  • Proficiency in Python, R, Scala, or Spark
  • Experience developing scalable, reliable, and secure applications
  • Track record of production-grade development, integration and support


 

Qurilo Solutions Pvt Ltd
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 5 yrs
₹3L - ₹6L / yr
JavaScript
React.js
AngularJS (1.x)
Node.js
MongoDB
+3 more

Job description

Required Skills & Qualifications: (Minimum Experience: 3 Years)

  • Proven experience as a Full Stack Developer or similar role.
  • Proficiency in front-end technologies such as HTML, CSS, JavaScript, and frameworks like React, Angular, or Vue.js.
  • Strong backend development experience with Node.js, Python, Ruby, Java, or .NET.
  • Familiarity with database technologies such as SQL, MongoDB, or PostgreSQL.
  • Experience with RESTful APIs and/or GraphQL.
  • Knowledge of cloud platforms (AWS, Azure, Google Cloud) is a plus.
  • Familiarity with version control tools, such as Git.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently as well as collaboratively in a team environment.
  • Strong communication and interpersonal skills for effective client interaction.
  • Proven ability to manage projects, prioritize tasks, and meet deadlines.

Job Type: Full-time

Pay: ₹40,000.00 - ₹60,000.00 per month

Location Type:

  • In-person

Schedule:

  • Fixed shift

Experience:

  • Full-stack development: 3 years (Required)

Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Pune, Hyderabad, Indore, Jaipur, Kolkata
4 - 5 yrs
₹2L - ₹18L / yr
Python
PySpark

We are looking for skilled and passionate Data Engineers with a strong foundation in Python programming and hands-on experience working with APIs, AWS cloud, and modern development practices. The ideal candidate will have a keen interest in building scalable backend systems and working with big data tools like PySpark.

Key Responsibilities:

  • Write clean, scalable, and efficient Python code.
  • Work with Python frameworks such as PySpark for data processing.
  • Design, develop, update, and maintain APIs (RESTful).
  • Deploy and manage code using GitHub CI/CD pipelines.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Work on AWS cloud services for application deployment and infrastructure.
  • Basic database design and interaction with MySQL or DynamoDB (see the DynamoDB sketch after this list).
  • Debugging and troubleshooting application issues and performance bottlenecks.
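
As a hedged illustration of the DynamoDB interaction mentioned above, here is a minimal boto3 sketch; the table name, key schema, and region are placeholder assumptions.

# Minimal DynamoDB read/write sketch with boto3; table and keys are placeholders.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")
table = dynamodb.Table("orders")  # assumes a table with partition key "order_id"

def put_order(order_id: str, customer_id: str, amount: int) -> None:
    # Write (or overwrite) a single item.
    table.put_item(Item={
        "order_id": order_id,
        "customer_id": customer_id,
        "amount": amount,
    })

def get_order(order_id: str):
    # Key lookup; returns None if the item does not exist.
    response = table.get_item(Key={"order_id": order_id})
    return response.get("Item")

if __name__ == "__main__":
    put_order("ord-1001", "cust-7", 2499)
    print(get_order("ord-1001"))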

Required Skills & Qualifications:

  • 4+ years of hands-on experience with Python development.
  • Proficient in Python basics with a strong problem-solving approach.
  • Experience with AWS Cloud services (EC2, Lambda, S3, etc.).
  • Good understanding of API development and integration.
  • Knowledge of GitHub and CI/CD workflows.
  • Experience in working with PySpark or similar big data frameworks.
  • Basic knowledge of MySQL or DynamoDB.
  • Excellent communication skills and a team-oriented mindset.

Nice to Have:

  • Experience in containerization (Docker/Kubernetes).
  • Familiarity with Agile/Scrum methodologies.


SaaS Spend Management Platform

Agency job
via Recruiting Bond by Pavan Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 3 yrs
₹4L - ₹7L / yr
Python
React.js
SQL
Fullstack Developer
Large Language Models (LLM)
+14 more

Requirement:

● Role: Fullstack Developer

● Location: Noida (Hybrid)

● Experience: 1-3 years

● Type: Full-Time


Role Description : We’re seeking a Fullstack Developer to join our fast-moving team at Velto. You’ll be responsible for building robust backend services and user-facing features using a modern tech stack. In this role, you’ll also get hands-on exposure to applied AI, contributing to the development of LLM-powered workflows, agentic systems, and custom fine-tuning pipelines.


Responsibilities:

● Develop and maintain backend services using Python and FastAPI (a FastAPI sketch follows these responsibilities)

● Build interactive frontend components using React

● Work with SQL databases, design schema, and integrate data models with Python

● Integrate and build features on top of LLMs and agent frameworks (e.g., LangChain, OpenAI, HuggingFace)

● Contribute to AI fine-tuning pipelines, retrieval-augmented generation (RAG) setups, and contract intelligence workflows

● Proficiency with unit testing libraries like Jest, React Testing Library, and pytest.

● Collaborate in agile sprints to deliver high-quality, testable, and scalable code

● Ensure end-to-end performance, security, and reliability of the stack
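
As a hedged sketch of the FastAPI responsibility above, here is a minimal service outline; the endpoint, request model, and the summarize_contract helper are hypothetical illustrations, not part of Velto's actual codebase.

# Minimal FastAPI sketch; the route, model, and LLM helper are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="contract-intelligence-demo")

class ContractRequest(BaseModel):
    contract_id: str
    text: str

def summarize_contract(text: str) -> str:
    # Placeholder for an LLM call (e.g., via an agent framework or RAG pipeline).
    return text[:200] + "..."

@app.post("/contracts/summarize")
def summarize(req: ContractRequest) -> dict:
    if not req.text.strip():
        raise HTTPException(status_code=400, detail="Contract text is empty")
    return {"contract_id": req.contract_id, "summary": summarize_contract(req.text)}

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)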


Required Skills:

● Proficient in Python and experienced with web frameworks like FastAPI

● Strong grasp of JavaScript and React for frontend development

● Solid understanding of SQL and relational database integration with Python

● Exposure to LLMs, vector databases, and AI-based applications (projects, internships, or coursework count)

● Familiar with Git, REST APIs, and modern software development practices

● Bachelor’s degree in Computer Science or equivalent field


Nice to Have:

● Experience working with LangChain, RAG pipelines, or building agentic workflows

● Familiarity with containerization (Docker), basic DevOps, or cloud deployment

● Prior project or internship involving AI/ML, NLP, or SaaS products

Why Join Us?

● Work on real-world applications of AI in enterprise SaaS

● Fast-paced, early-stage startup culture with direct ownership

● Learn by doing—no layers, no red tape

● Hybrid work setup and merit-based growth



QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
1 - 7 yrs
₹4L - ₹12L / yr
Python
MongoDB
AWS RDS
React.js
HTML/CSS
+2 more

Job Title: Full Stack Engineer

Location: Delhi-NCR

Type: Full-Time

Responsibilities

Frontend:

  • Develop responsive, intuitive interfaces using HTML, CSS (SASS), React, and Vanilla JS.
  • Implement real-time features using sockets for dynamic, interactive user experiences.
  • Collaborate with designers to ensure consistent UI/UX patterns and deliver visually compelling products.

Backend:

  • Design, implement, and maintain APIs using Python (FastAPI).
  • Integrate AI-driven features to enhance user experience and streamline processes.
  • Ensure the code adheres to best practices in performance, scalability, and security.
  • Troubleshoot and resolve production issues, minimizing downtime and improving reliability.

Database & Data Management:

  • Work with PostgreSQL for relational data, ensuring optimal queries and indexing.
  • Utilize ClickHouse or MongoDB where appropriate to handle specific data workloads and analytics needs.
  • Contribute to building dashboards and tools for analytics and reporting.
  • Leverage AI/ML concepts to derive insights from data and improve system performance.

General:

  • Use Git for version control; conduct code reviews, ensure clean commit history, and maintain robust documentation.
  • Collaborate with cross-functional teams to deliver features that align with business goals.
  • Stay updated with industry trends, particularly in AI and emerging frameworks, and apply them to enhance our platform.
  • Mentor junior engineers and contribute to continuous improvement in team processes and code quality.

OIP Insurtech

at OIP Insurtech

2 candid answers
Katarina Vasic
Posted by Katarina Vasic
Remote only
4 - 12 yrs
₹30L - ₹50L / yr
Python
Natural Language Processing (NLP)
Data extraction
OCR
Computer Vision
+3 more

We’re looking for a skilled Senior Machine Learning Engineer to help us transform the Insurtech space. You’ll build intelligent agents and models that read, reason, and act.


Insurance ops are broken. Underwriters drown in PDFs. Risk clearance is chaos. Emails go in circles. We’ve lived it – and we’re fixing it. Bound AI is building agentic AI workflows that go beyond chat. We orchestrate intelligent agents to handle policy operations end-to-end:


• Risk clearance.

• SOV ingestion.

• Loss run summarization.

• Policy issuance.

• Risk triage.


No hallucinations. No handwaving. Just real-world AI that executes – in production, at scale.


Join us to help shape the future of insurance through advanced technology!


We’re Looking For:


  • Deep experience in GenAI, LLM fine-tuning, and multi-agent orchestration (LangChain, DSPy, or similar).
  • 5+ years of proven experience in the field
  • Strong ML/AI engineering background in both foundational modeling (NLP, transformers, RAG) and traditional ML.
  • Solid Python engineering chops – you write production-ready code, not just notebooks.
  • A startup mindset – curiosity, speed, and obsession with shipping things that matter.
  • Bonus – Experience with insurance or document intelligence (SOVs, Loss Runs, ACORDs).


What You’ll Be Doing:


  • Develop foundation-model-based pipelines to read and understand insurance documents.
  • Develop GenAI agents that handle real-time decision-making and workflow orchestration, and modular, composable agent architectures that interact with humans, APIs, and other agents.
  • Work on auto-adaptive workflows that optimize around data quality, context, and risk signals.



HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
6 - 9 yrs
Best in industry
Python
Java
JavaScript
Locust
Gatling
+14 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for a Lead SDET with 8-10 years of experience to play a pivotal role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest models and solutions. This is an exciting opportunity to work on cutting-edge performance testing strategies and drive impactful initiatives across the organisation.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Review application architecture and suggest improvements to enhance scalability
  • Leverage AI at appropriate layers to improve efficiency and drive positive business outcomes
  • Drive performance testing initiatives across the organization and ensure seamless execution
  • Automate the capturing of performance metrics and generate performance trend reports
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Collaborate with developers and architects to enhance frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Ensure high availability and reliability of applications and services


Requirements:

  • 6-9 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc. (see the Locust sketch after this list)
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimising frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively
  • Experience in increasing application/service availability from 99.9% (three 9s) to 99.99% or higher (four/five 9s)
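
As a hedged example of the load-testing tools named above, here is a minimal Locust user class; the host, endpoints, and pacing are placeholder assumptions, not HighLevel's actual test suite.

# Minimal Locust load-test sketch; endpoints and pacing are placeholders.
# Run with: locust -f locustfile.py --host https://staging.example.com
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 3)  # think time between tasks, in seconds

    @task(3)
    def list_contacts(self):
        # Weighted 3x: the most common read path.
        self.client.get("/api/v1/contacts")

    @task(1)
    def create_contact(self):
        self.client.post(
            "/api/v1/contacts",
            json={"name": "Load Test", "email": "load@example.com"},
        )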


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
4 - 7 yrs
Best in industry
Python
Java
Locust
Gatling
K6
+10 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website: https://www.gohighlevel.com/

YouTube Channel: https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post: https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for a SDET III with 5-6 years of experience to play a crucial role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest models and solutions. This is a great opportunity to work on cutting-edge performance testing strategies and contribute to the success of our products.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Develop test strategies based on customer behavior to ensure high-performing applications
  • Automate the capturing of performance metrics and generate performance trend reports
  • Collaborate with developers and architects to optimize frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Ensure high availability and reliability of applications and services


Requirements:

  • 4-7 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc.
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimizing frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Timble Technologies

at Timble Technologies

1 recruiter
Preeti Bisht
Posted by Preeti Bisht
Arjan Garh, Gurgaon
2 - 5 yrs
₹3L - ₹9L / yr
Python
Django
ORM
RESTful APIs
FastAPI
+1 more


Job Title: L3 SDE (Python/Django)

Location: Arjan Garh, MG Road, Gurgaon

Job Type: Full-time, On site

Company: Timble Technologies Pvt Ltd. (www.timbleglance.com)

Pay Range: 30K- 70K

**IMMEDIATE JOINERS REQUIRED**


About Us:

Our Aim is to develop ‘More Data, More Opportunities’. We take pride in building cutting-edge AI solutions to help financial institutions mitigate risk and generate comprehensive data. Elevate Your Business's Credibility with Timble Glance's Verification and Authentication Solutions.


Responsibilities

• Writing and testing code, debugging programs, and integrating applications with third-party web services. To be successful in this role, you should have experience using server-side logic and work well in a team. Ultimately, you’ll build highly responsive web applications that align with our client’s business needs

• Write effective, scalable code

• Develop back-end components to improve responsiveness and overall performance

• Integrate user-facing elements into applications

• Improve functionality of existing systems

• Implement security and data protection solutions

• Assess and prioritize feature requests

• Coordinate with internal teams to understand user requirements and provide technical solutions

• Creates customized applications for smaller tasks to enhance website capability based on business needs

• Builds table frames and forms and writes script within the browser to enhance site functionality

• Ensures web pages are functional across different browser types; conducts tests to verify user functionality

• Verifies compliance with accessibility standards

• Assists in resolving moderately complex production support problems


Profile Requirements

* 2 years or more experience as a Python Developer

* Expertise in at least one popular Python framework; Django is required

* Knowledge of object-relational mapping (ORM); a Django ORM sketch follows this list

* Familiarity with front-end technologies like JavaScript, HTML5, and CSS3

* Familiarity with event-driven programming in Python

* Good understanding of the operating system and networking concepts.

* Good analytical and troubleshooting skills

* Graduation/Post Graduation in Computer Science / IT / Software Engineering

* Decent verbal and written communication skills to communicate with customers, support personnel, and management
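
As a hedged illustration of the Django and ORM requirements above, here is a minimal model-and-query sketch; the app, model, and fields are hypothetical examples, not part of the Timble Glance codebase.

# Minimal Django ORM sketch; app name, model, and fields are hypothetical.
# verification/models.py
from django.db import models

class VerificationRequest(models.Model):
    STATUS_CHOICES = [("pending", "Pending"), ("verified", "Verified"), ("failed", "Failed")]

    applicant_name = models.CharField(max_length=120)
    document_id = models.CharField(max_length=64, unique=True)
    status = models.CharField(max_length=16, choices=STATUS_CHOICES, default="pending")
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self) -> str:
        return f"{self.document_id} ({self.status})"

# Example ORM usage (e.g., inside a view or service function):
# pending = VerificationRequest.objects.filter(status="pending").order_by("-created_at")
# VerificationRequest.objects.create(applicant_name="Asha", document_id="DOC-123")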


**IMMEDIATE JOINERS REQUIRED**





Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Bengaluru (Bangalore), Pune, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Navi Mumbai, Kolkata, Rajasthan
5 - 24 yrs
₹9L - ₹70L / yr
C
C++
Visual C++
Embedded C++
Artificial Intelligence (AI)
+32 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the crossroads of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for consumers and the enterprise space.


Mon-fri role, In office, with excellent perks and benefits!


Position Overview

We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.


Key Responsibilities:

1. System Architecture & Design

● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.

● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.

● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.


2. Perception & AI Integration

● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.

● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.

● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.


3. Embedded & Real-Time Systems

● Design high-performance embedded software stacks for real-time robotic control and autonomy.

● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.

● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.


4. Robotics Simulation & Digital Twins

● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.

● Leverage synthetic data generation (Omniverse Replicator) for training AI models.

● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.


5. Navigation & Motion Planning

● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.

● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.

● Implement reinforcement learning-based policies using Isaac Gym.


6. Performance Optimization & Scalability

● Ensure low-latency AI inference and real-time execution of robotics applications.

● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.

● Develop benchmarking and profiling tools to measure software performance on edge AI devices.


Required Qualifications:

● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.

● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.

● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.

● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.

● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.

● Strong background in robotic perception, planning, and real-time control.

● Experience with cloud-edge AI deployment and scalable architectures.


Preferred Qualifications

● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym

● Knowledge of robot kinematics, control systems, and reinforcement learning

● Expertise in distributed computing, containerization (Docker), and cloud robotics

● Familiarity with automotive, industrial automation, or warehouse robotics

● Experience designing architectures for autonomous systems or multi-robot systems.

● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics

● Experience with microservices or service-oriented architecture (SOA)

● Knowledge of machine learning and AI integration within robotic systems

● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)

Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai, Navi Mumbai
5 - 40 yrs
₹8.5L - ₹75L / yr
Microservices
Architecture
API
NoSQL Databases
MongoDB
+33 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the crossroads of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for consumers and the enterprise space.


Mon-Fri, In office role with excellent perks and benefits!


Key Responsibilities:

1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.

2. Build and implement scalable and robust microservices and integrate API gateways.

3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).

4. Implement real-time data pipelines using Kafka (see the Kafka sketch after this list).

5. Collaborate with front-end developers to ensure seamless integration of backend services.

6. Write clean, reusable, and efficient code following best practices, including design patterns.

7. Troubleshoot, debug, and enhance existing systems for improved performance.
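
As a hedged illustration of the Kafka pipeline item above, here is a minimal producer/consumer sketch using the kafka-python library; the broker address, topic, and event payload are placeholder assumptions.

# Minimal Kafka produce/consume sketch with kafka-python; broker and topic are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"
TOPIC = "telemetry-events"

def produce_event(event: dict) -> None:
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, event)
    producer.flush()

def consume_events() -> None:
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)  # each value is the original event dict

if __name__ == "__main__":
    produce_event({"device_id": "headset-01", "fps": 72})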


Mandatory Skills:

1. Proficiency in at least one backend technology: Node.js, Python, or Java.


2. Strong experience in:

i. Microservices architecture,

ii. API gateways,

iii. NoSQL databases (e.g., MongoDB, DynamoDB),

iv. Kafka

v. Data structures (e.g., arrays, linked lists, trees).


3. Frameworks:

i. If Java : Spring framework for backend development.

ii. If Python: FastAPI/Django frameworks for AI applications.

iii. If Node: Express.js for Node.js development.


Good to Have Skills:

1. Experience with Kubernetes for container orchestration.

2. Familiarity with in-memory databases like Redis or Memcached.

3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.

Variyas Labs Pvt. Ltd.
Greater Noida
1 - 3 yrs
₹4L - ₹7L / yr
Node.js
React.js
Python
HTML/CSS
  • 1+ years of experience as a Full Stack Developer using Node.js and React.js.
  • A strong sense of ownership—you care about business impact, not just code.
  • Experience working in a fast-paced, high-growth environment.
  • Exceptional communication skills in both formal and informal settings.
  • A team player with a strong work ethic, who’s in it for the long run.


NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Noida
4 - 8 yrs
₹2L - ₹10L / yr
Machine Learning (ML)
Data Science
Azure OpenAI
Python
pandas
+11 more

Job Title : Sr. Data Scientist

Experience : 5+ Years

Location : Noida (Hybrid – 3 Days in Office)

Shift Timing : 2 PM to 11 PM

Availability : Immediate


Job Description :

We are seeking a Senior Data Scientist to develop and implement machine learning models, predictive analytics, and data-driven solutions.

The role involves data analysis, dashboard development (Looker Studio), NLP, Generative AI (LLMs, Prompt Engineering), and statistical modeling.

Strong expertise in Python (Pandas, NumPy), Cloud Data Science (AWS SageMaker, Azure OpenAI), Agile (Jira, Confluence), and stakeholder collaboration is essential.
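
As a hedged illustration of the modeling stack described above, here is a minimal Pandas + scikit-learn sketch; the CSV path, feature names, and target column are placeholder assumptions.

# Minimal predictive-modeling sketch with pandas + scikit-learn.
# The dataset path, feature names, and target are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_events.csv")          # placeholder dataset
features = ["tenure_months", "monthly_spend", "support_tickets"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Evaluate with ROC-AUC on the held-out split.
probs = model.predict_proba(X_test)[:, 1]
print(f"ROC-AUC: {roc_auc_score(y_test, probs):.3f}")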


Mandatory skills : Machine Learning, Cloud Data Science (AWS SageMaker, Azure OpenAI), Python (Pandas, NumPy), Data Visualization (Looker Studio), NLP & Generative AI (LLMs, Prompt Engineering), Statistical Modeling, Agile (Jira, Confluence), and strong stakeholder communication.
