
50+ Remote Python Jobs in India

Apply to 50+ Remote Python Jobs on CutShort.io. Find your next job, effortlessly. Browse Python Jobs and apply today!

Palcode.ai

Posted by Team Palcode
Remote only
1 - 3 yrs
₹4L - ₹9L / yr
Python
FastAPI
Amazon Web Services (AWS)
API

At Palcode.ai, we're on a mission to fix the massive inefficiencies in pre-construction. Think about it: in a $10 trillion industry, estimators still spend weeks analyzing bids, project managers struggle with scattered data, and costly mistakes slip through complex contracts. We're fixing this with purpose-built AI agents that work. Our platform cuts pre-construction workflows from weeks to hours. It's not just about AI; it's about bringing real, measurable impact to an industry ready for change. We are backed by names like AWS for Startups, Upekkha Accelerator, and Microsoft for Startups.



Why Palcode.ai


Tackle Complex Problems: Build AI that reads between the lines of construction bids, spots hidden risks in contracts, and makes sense of fragmented project data

High-Impact Code: Your code won't sit in a backlog – it goes straight to estimators and project managers who need it yesterday

Tech Challenges That Matter: Design systems that process thousands of construction documents, handle real-time pricing data, and make intelligent decisions

Build & Own: Shape our entire tech stack, from data processing pipelines to AI model deployment

Quick Impact: Small team, huge responsibility. Your solutions directly impact project decisions worth millions

Learn & Grow: Master the intersection of AI, cloud architecture, and construction tech while working with founders who've built and scaled construction software


Your Role:

  • Design and build our core AI services and APIs using Python (see the sketch after this list)
  • Create reliable, scalable backend systems that handle complex data
  • Help set up cloud infrastructure and deployment pipelines
  • Collaborate with our AI team to integrate machine learning models
  • Write clean, tested, production-ready code
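
To give a flavour of the first responsibility above, here is a minimal, hypothetical FastAPI sketch of the kind of Python API this role involves. It is illustrative only and not Palcode.ai's actual service; all names and the keyword-based "analysis" are placeholders.

# Minimal illustrative sketch (hypothetical names, placeholder logic).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class BidDocument(BaseModel):
    project_id: str
    text: str

class RiskSummary(BaseModel):
    project_id: str
    flagged_terms: list[str]

@app.post("/analyze-bid", response_model=RiskSummary)
async def analyze_bid(doc: BidDocument) -> RiskSummary:
    # Stand-in for a real AI analysis step: flag a few risk-related phrases.
    keywords = ["penalty", "liquidated damages", "delay"]
    flagged = [k for k in keywords if k in doc.text.lower()]
    return RiskSummary(project_id=doc.project_id, flagged_terms=flagged)

Run locally with, for example, uvicorn main:app --reload (assuming the file is named main.py).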


You'll fit right in if:

  • You have at least 1 year of hands-on Python development experience
  • You're comfortable with full-stack development and cloud services
  • You write clean, maintainable code and follow good engineering practices
  • You're curious about AI/ML and eager to learn new technologies
  • You enjoy fast-paced startup environments and take ownership of your work


How we will set you up for success

  • You will work closely with the Founding team to understand what we are building.
  • You will be given comprehensive training about the tech stack, with an opportunity to avail virtual training as well.
  • You will be involved in a monthly one-on-one with the founders to discuss feedback
  • A unique opportunity to learn from the best - we are Gold partners of AWS, Razorpay, and Microsoft Startup programs, giving you access to experienced people to discuss and brainstorm ideas with.
  • You’ll have a lot of creative freedom to execute new ideas. As long as you can convince us, and you’re confident in your skills, we’re here to back you in your execution.


Location: Bangalore, Remote


Compensation: Competitive salary + Meaningful equity


If you get excited about solving hard problems that have real-world impact, we should talk.


All the best!!

CryptoXpress

Posted by Aishwarya Anantharaman
Remote only
1 - 2 yrs
₹6L - ₹8L / yr
React.js
React Native
Python
Flask
NodeJS (Node.js)
+13 more

Company

Crypto made easy 🚀 

We are the bridge between your crypto world and everyday life; trade pairs, book flights and hotels, and purchase gift cards with your favourite currencies. All in one best-in-class digital experience. It's not rocket science.

🔗Apply link at the bottom of this post — don’t miss it!


Why Join?

By joining CryptoXpress, you'll be at the cutting edge of merging digital currency with real-world services and products. We offer a stimulating work environment where innovation and creativity are highly valued. This remote role provides the flexibility to work from any location, promoting a healthy work-life balance. We are dedicated to fostering growth and learning, offering ample opportunities for professional development in the rapidly expanding fields of AI, blockchain technology, cryptocurrency, digital marketing and e-commerce.


Role Description

We are seeking an Application Developer for a full-time remote position at CryptoXpress. In this role, you will be responsible for developing and maintaining state-of-the-art mobile and web applications that integrate seamlessly with our blockchain and API technologies. The ideal candidate will bring a passion for creating exceptional user experiences, a deep understanding of React Native and JavaScript, and experience in building responsive and scalable applications.


Job Requirements:


  • Exposure and hands-on experience in mobile application development.
  • Significant experience working with React web and mobile along with tools like Flux, Flow, Redux, etc.
  • In-depth knowledge of JavaScript, CSS, HTML, and functional programming.
  • Strong knowledge of React fundamentals, including Virtual DOM, component lifecycle, and component state.
  • Comprehensive understanding of the full mobile app development lifecycle, including prototyping.
  • Proficiency in type checking, unit testing, Typescript, PropTypes, and code debugging.
  • Experience working with REST APIs, document request models, offline storage, and third-party libraries.
  • Solid understanding of user interface design, responsive design, and web technologies.
  • Familiarity with React Native tools such as Jest, Enzyme, and ESLint.
  • Basic knowledge of blockchain technology.


Essential Skill Set:


  • React Native & ReactJS
  • Python (Flask)
  • Node.js, Next.js
  • Web3.js / Ethers.js integration experience
  • MongoDB, Strapi, Firebase
  • API design and integration
  • In-app analytics / messaging tools (e.g., Firebase Messaging)
  • Wallet integrations or crypto payment gateways



How to Apply:

Interested candidates must complete the application form at 

https://forms.gle/J1giXJeg993fZViX6


Join us and help shape the future of social media marketing in the cryptocurrency space!

💡Pro Tip: Tips for Application Success

  • Show your enthusiasm for crypto, travel, and digital innovation
  • Mention any self-learning initiatives or personal crypto experiments
  • Be honest about what you don’t know — we value growth mindsets
  • Explore CryptoXpress before applying — take 2 minutes to download and try the app so you understand what we’re building


Hiring for MNC


Agency job
via Spes Manning Solution by srushti patil
Remote only
5 - 10 yrs
₹18L - ₹20L / yr
Scala
Akka
Spark
Python

Job Description:


Interviews will be scheduled within two days.


We are seeking a highly skilled Scala Developer to join our team on an immediate basis. The ideal candidate will work remotely and collaborate with a US-based client, so excellent communication skills are essential.


Key Responsibilities:


  • Develop scalable and high-performance applications using Scala.
  • Collaborate with cross-functional teams to understand requirements and deliver quality solutions.
  • Write clean, maintainable, and testable code.
  • Optimize application performance and troubleshoot issues.
  • Participate in code reviews and ensure adherence to best practices.


Required Skills:


  • Strong experience in Scala development.
  • Solid understanding of functional programming principles.
  • Experience with frameworks like Akka, Play, or Spark is a plus.
  • Good knowledge of REST APIs, microservices architecture, and concurrency.
  • Familiarity with CI/CD, Git, and Agile methodologies.


Roles & Responsibilities


  • Develop and maintain scalable backend services using Scala.
  • Design and integrate RESTful APIs and microservices.
  • Collaborate with cross-functional teams to deliver technical solutions.
  • Write clean, efficient, and testable code.
  • Participate in code reviews and ensure code quality.
  • Troubleshoot issues and optimize performance.
  • Stay updated on Scala and backend development best practices.



Immediate joiners preferred.

Zazmic
Remote only
9 - 12 yrs
₹10L - ₹15L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Amazon Web Services (AWS)
CI/CD
+5 more

Title: Senior Software Engineer – Python (Remote: Africa, India, Portugal)


Experience: 9 to 12 Years


INR : 40 LPA - 50 LPA


Location Requirement: Candidates must be based in Africa, India, or Portugal. Applicants outside these regions will not be considered.


Must-Have Qualifications:

  • 8+ years in software development with expertise in Python
  • Kubernetes experience is important
  • Strong understanding of async frameworks such as asyncio (see the sketch after this list)
  • Experience with FastAPI, Flask, or Django for microservices
  • Proficiency with Docker and Kubernetes/AWS ECS
  • Familiarity with AWS, Azure, or GCP and IaC tools (CDK, Terraform)
  • Knowledge of SQL and NoSQL databases (PostgreSQL, Cassandra, DynamoDB)
  • Exposure to GenAI tools and LLM APIs (e.g., LangChain)
  • CI/CD and DevOps best practices
  • Strong communication and mentorship skills
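
As a small, hedged illustration of the asyncio requirement above (not Zazmic's codebase; the URLs and helper names are made up), the sketch below fetches several endpoints concurrently using only the standard library:

# Illustrative asyncio sketch: run blocking HTTP calls concurrently in threads.
import asyncio
import json
from urllib.request import urlopen

async def fetch_json(url: str) -> dict:
    # urllib is blocking, so run it in a worker thread to keep the event loop free.
    def _get() -> dict:
        with urlopen(url) as resp:
            return json.load(resp)
    return await asyncio.to_thread(_get)

async def main() -> None:
    urls = [
        "https://httpbin.org/json",
        "https://httpbin.org/uuid",
    ]
    results = await asyncio.gather(*(fetch_json(u) for u in urls))
    for url, payload in zip(urls, results):
        print(url, "->", list(payload))

if __name__ == "__main__":
    asyncio.run(main())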


Zazmic Inc

Agency job
Remote only
5 - 8 yrs
₹10L - ₹15L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Databricks
Python
SQL
+4 more

Title: Data Engineer II (Remote – India/Portugal)

Exp: 4-8 Years

CTC: up to 30 LPA


Required Skills & Experience:

  • 4+ years in data engineering or backend software development
  • AI/ML experience is important
  • Expert in SQL and data modeling
  • Strong Python, Java, or Scala coding skills
  • Experience with Snowflake, Databricks, AWS (S3, Lambda)
  • Background in relational and NoSQL databases (e.g., Postgres)
  • Familiar with Linux shell and systems administration
  • Solid grasp of data warehouse concepts and real-time processing
  • Excellent troubleshooting, documentation, and QA mindset


If interested, kindly share your updated CV to 82008 31681

France-based AI-tech startup

Agency job
via Recruit Square by Priyanka choudhary
Remote, Bengaluru (Bangalore)
5 - 9 yrs
₹17L - ₹30L / yr
Python
NodeJS (Node.js)
MongoDB
Firebase
Google Cloud Platform (GCP)
+1 more

As a Senior Backend & Infrastructure Engineer, you will take ownership of backend systems and cloud infrastructure. You'll work closely with our CTO and cross-functional teams (hardware, AI, frontend) to design scalable, fault-tolerant architectures and ensure reliable deployment pipelines.


  1. What You’ll Do :
  • Backend Development: Maintain and evolve our Node.js (TypeScript) and Python backend services with a focus on performance and scalability.
  • Cloud Infrastructure: Manage our infrastructure on GCP and Firebase (Auth, Firestore, Storage, Functions, AppEngine, PubSub, Cloud Tasks). 
  • Database Management: Handle Firestore and other NoSQL DBs. Lead database schema design and migration strategies.
  • Pipelines & Automation: Build robust real-time and batch data pipelines. Automate CI/CD and testing for backend and frontend services.
  • Monitoring & Uptime: Deploy tools for observability (logging, alerts, debugging). Ensure 99.9% uptime of critical services.
  • Dev Environments: Set up and manage developer and staging environments across teams.
  • Quality & Security: Drive code reviews, implement backend best practices, and enforce security standards.
  • Collaboration: Partner with other engineers (AI, frontend, hardware) to integrate backend capabilities seamlessly into our global system.


Must-Haves:

  • 5+ years of experience in backend development and cloud infrastructure.
  • Strong expertise in Node.js (TypeScript) and/or Python.
  • Advanced skills in NoSQL databases (Firestore, MongoDB, DynamoDB...).
  • Deep understanding of cloud platforms, preferably GCP and Firebase.
  • Hands-on experience with CI/CD, DevOps tools, and automation.
  • Solid knowledge of distributed systems and performance tuning.
  • Experience setting up and managing development & staging environments.
  • Proficiency in English and remote communication.


Good to have:

  • Event-driven architecture experience (e.g., Pub/Sub, MQTT).
  • Familiarity with observability tools (Prometheus, Grafana, Google Monitoring).
  • Previous work on large-scale SaaS products.
  • Knowledge of telecommunication protocols (MQTT, WebSockets, SNMP).
  • Experience with edge computing on Nvidia Jetson devices.


What We Offer:

  • Competitive salary for the Indian market (depending on experience).
  • Remote-first culture with async-friendly communication.
  • Autonomy and responsibility from day one.
  • A modern stack and a fast-moving team working on cutting-edge AI and cloud infrastructure.
  • A mission-driven company tackling real-world environmental challenges. 


Certa

Posted by Gyan S
Remote only
3 - 5 yrs
Best in industry
AWS Lambda
Kubernetes
Terraform
Amazon Web Services (AWS)
Python
+7 more

Location: Remote (India only)

About Certa

At Certa, we're revolutionizing process automation for top-tier companies, including Fortune 500 and Fortune 1000 leaders, from the heart of Silicon Valley. Our mission? Simplifying complexity through cutting-edge SaaS solutions. Join our thriving, global team and become a key player in a startup environment that champions innovation, continuous learning, and unlimited growth. We offer a fully remote, flexible workspace that empowers you to excel.


Role Overview

Ready to elevate your DevOps career by shaping the backbone of a fast-growing SaaS platform? As a Senior DevOps Engineer at Certa, you’ll lead the charge in building, automating, and optimizing our cloud infrastructure. Beyond infrastructure management, you’ll actively contribute with a product-focused mindset, understanding customer requirements, collaborating closely with product and engineering teams, and ensuring our AWS-based platform consistently meets user needs and business goals.


What You’ll Do

  • Own SaaS Infrastructure: Design, architect, and maintain robust, scalable AWS infrastructure, enhancing platform stability, security, and performance.
  • Orchestrate with Kubernetes: Utilize your advanced Kubernetes expertise to manage and scale containerized deployments efficiently and reliably.
  • Collaborate on Enterprise Architecture: Align infrastructure strategies with enterprise architectural standards, partnering closely with architects to build integrated solutions.
  • Drive Observability: Implement and evolve sophisticated monitoring and observability solutions (DataDog, ELK Stack, AWS CloudWatch) to proactively detect, troubleshoot, and resolve system anomalies.
  • Lead Automation Initiatives: Champion an automation-first mindset across the organization, streamlining development, deployment, and operational workflows.
  • Implement Infrastructure as Code (IaC): Master Terraform to build repeatable, maintainable cloud infrastructure automation.
  • Optimize CI/CD Pipelines: Refine and manage continuous integration and deployment processes (currently GitHub Actions, transitioning to CircleCI), enhancing efficiency and reliability.
  • Enable GitOps with ArgoCD: Deliver seamless GitOps-driven application deployments, ensuring accuracy and consistency in Kubernetes environments.
  • Advocate for Best Practices: Continuously promote and enforce industry-standard DevOps practices, ensuring consistent, secure, and efficient operational outcomes.
  • Innovate and Improve: Constantly evaluate and enhance current DevOps processes, tooling, and methodologies to maintain cutting-edge efficiency.
  • Product Mindset: Actively engage with product and engineering teams, bringing infrastructure expertise to product discussions, understanding customer needs, and helping prioritize infrastructure improvements that directly benefit users and business objectives.


What You Bring

  1. Hands-On Experience: 3-5 years in DevOps roles, ideally within fast-paced SaaS environments.
  2. Kubernetes Mastery: Advanced knowledge and practical experience managing Kubernetes clusters and container orchestration.
  3. AWS Excellence: Comprehensive expertise across AWS services, infrastructure management, and security.
  4. IaC Competence: Demonstrated skill in Terraform for infrastructure automation and management.
  5. CI/CD Acumen: Proven proficiency managing pipelines with GitHub Actions; familiarity with CircleCI highly advantageous.
  6. GitOps Knowledge: Experience with ArgoCD for effective continuous deployment and operations.
  7. Observability Skills: Strong capabilities deploying and managing monitoring solutions such as DataDog, ELK, and AWS CloudWatch.
  8. Python Automation: Solid scripting and automation skills using Python.
  9. Architectural Awareness: Understanding of enterprise architecture frameworks and alignment practices.
  10. Proactive Problem-Solving: Exceptional analytical and troubleshooting skills, adept at swiftly addressing complex technical challenges.
  11. Effective Communication: Strong interpersonal and collaborative skills, essential for remote, distributed teamwork.
  12. Product Focus: Ability and willingness to understand customer requirements, prioritize tasks that enhance product value, and proactively suggest infrastructure improvements driven by user needs.
  13. Startup Mindset (Bonus): Prior experience or enthusiasm for dynamic startup cultures is a distinct advantage.


Why Join Us

  • Compensation: Top-tier salary and exceptional benefits.
  • Work-Life Flexibility: Fully remote, flexible scheduling.
  • Growth Opportunities: Accelerate your career in a company poised for significant growth.
  • Innovative Culture: Engineering-centric, innovation-driven work environment.
  • Team Events: Annual offsites and quarterly Hackerhouse.
  • Wellness & Family: Comprehensive healthcare and parental leave.
  • Workspace: Premium workstation setup allowance, providing the tech you need to succeed.
Awign Enterprises

Posted by Pramit Puranik
Remote only
3 - 6 yrs
₹12L - ₹16L / yr
Go Programming (Golang)
Java
Python
Linux/Unix

Duration: 6 months with possible extension

Location: Remote 

Notice Period: Immediate Joiner Preferred

Experience: 4-6 Years


Requirements:

  • B Tech/M Tech in Computer Science or equivalent from a reputed college with a minimum of 4 – 6 years of experience in a Product Development Company
  • Sound knowledge and application of algorithms and data structures with space and time complexities
  • Strong design skills involving data modeling and low-level class design
  • Good knowledge of object-oriented programming and design patterns
  • Proficiency in Python, Java, and Golang
  • Follow industry coding standards and be responsible for writing maintainable/scalable/efficient code to solve business problems
  • Hands-on experience of working with Databases and the Linux/Unix platform
  • Follow SDLC in an agile environment and collaborate with multiple cross-functional teams to drive deliveries
  • Strong technical aptitude and good knowledge of CS fundamentals


What will you get to do here?

  • Coming up with best practices to help the team achieve their technical tasks and continually thrive in improving the technology of the product/team.
  • Driving the adoption of best practices and regular participation in code reviews, design reviews, and architecture discussions.
  • Experiment with new & relevant technologies and tools, and drive adoption while measuring yourself on the impact you can create.
  • Implementation of long-term technology vision for your team.
  • Creating architectures & designs for new solutions around existing/new areas
  • Decide on technology & tool choices for your team & be responsible for them.
MyOperator - VoiceTree Technologies

Posted by Vijay Muthu
Remote only
8 - 12 yrs
₹14L - ₹16L / yr
Test Automation (QA)
Automated testing
Selenium
Java
API
+10 more

About MyOperator:

MyOperator is India's leading cloud communications provider, offering cutting-edge solutions to over 10,000 businesses across diverse industries. From Cloud Call Center solutions to IVR, Toll-free Numbers, Enterprise Mobility, WhatsApp Business Solutions, and Heyo Phone, we provide comprehensive SaaS platforms backed by exceptional customer service.

Job Summary: We are looking for a skilled Automation QA Lead to join our team, with 8 to 14 years of QA and automation experience and skills in JavaScript, Node.js, Selenium, and Perl/Python. We want people who work well in teams, think out of the box, work in new tech areas, and thrive in ambiguity.


Requirements

  • Automate with various technologies (e.g., Selenium WebDriver, Python), tools (e.g., Jenkins, Kubernetes), solutions, and processes to support scalable and repeatable practices (see the sketch after this list).
  • Collaborate in all aspects of the automation development process from requirement gathering through shift left develop-and-test cycles.  
  • Design web pages using these components, customize and validate if upgradable and extensible
  • Support continuous improvement processes, analyzing problems and recommending actions for effective resolution.
  • Help coordinate technical leadership within Architecture, Development, QA, Operations and Release Management teams to enable effective automation.
  • Enforce software engineering standard methodologies and work with the Engineering Management team to forecast, plan and drive team efficiency.
  • Influence and cultivate innovation within engineering groups.
  • Mentor fellow Quality Engineers, evangelizing standard methodologies of automation and technical processes
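
For illustration only, here is a minimal Python + Selenium WebDriver smoke test of the kind referenced in the first bullet above. The target URL and locator are hypothetical, and a real suite would sit inside a framework and CI pipeline (e.g., pytest on Jenkins):

# Hypothetical Selenium smoke test; assumes a Chrome driver is available.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def test_homepage_title() -> None:
    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com")
        # Wait until the page's main heading is present before asserting.
        WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.TAG_NAME, "h1"))
        )
        assert "Example" in driver.title
    finally:
        driver.quit()

if __name__ == "__main__":
    test_homepage_title()
    print("homepage smoke test passed")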


Qualifications

  • 8 to 14 years’ experience in functional QA and automation of web-based and mobile-based applications
  • Should have experience in mobile automation (iOS, Android)
  • Hands-on experience with Automation test strategy, test planning, script development and execution. 
  • Experience in performance testing is a plus. 
  • Exposure to SDLC practices of Scrum Agile.
  • Excellent verbal and written communication skills.
  • BS/MS in Computer Science, Computer Engineering or equivalent domain
  • Ability to adjust to competing priorities and allocate your time as vital to getting the job done


Pattern Agentix

Posted by jaime benchimol
Remote only
1 - 10 yrs
₹8L - ₹13L / yr
Retrieval Augmented Generation (RAG)
Python
Generative AI
Large Language Models (LLM) tuning
Prompt engineering

Pattern Agentix (patternagentix.com) is seeking a computational biologist to assume the role of Lead AI Researcher, creating advanced multi-agent AI systems that leverage cutting-edge AI research and Retrieval-Augmented Generation (RAG) techniques. The ideal candidate should have a strong academic and research background in AI, demonstrated through published research papers and open-source contributions (e.g., on GitHub).


Exposure to or some background in bioinformatics is a compulsory requirement. The candidate will apply computational techniques, mathematical models, and computer science skills to analyze and interpret complex biological data.



Required Skills & Experience


Master’s or Ph.D. in AI, Machine Learning, Computer Science, or a related field with some exposure to bioinformatics. 


Strong AI research background, demonstrated through peer-reviewed publications in top-tier AI/ML conferences or journals (e.g., NeurIPS, ICML, AAAI, CVPR, ACL, etc.).


Proficiency in Python and experience with AI/ML frameworks (e.g., PyTorch, TensorFlow).


Experience with multi-agent AI systems and their architectural design.



Project Scope


The project involves developing a sophisticated multi-agent system that automates hypothesis generation in biomedical research using large biomedical research datasets (a rough sketch of the retrieval step in such a pipeline appears below).
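
A toy, hedged sketch of the retrieval step in such a RAG pipeline is shown below. It is not Pattern Agentix's system; the abstracts and query are invented, and TF-IDF stands in for a real embedding model and vector database:

# Toy retrieval-then-prompt sketch using TF-IDF similarity (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "Gene X overexpression correlates with tumor progression in lung cancer.",
    "A randomized trial of drug Y shows reduced inflammation markers.",
    "Knockdown of gene X alters immune cell infiltration in murine models.",
]
query = "What is the role of gene X in tumor biology?"

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(abstracts)
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, doc_matrix)[0]
top = sorted(range(len(abstracts)), key=lambda i: scores[i], reverse=True)[:2]

context = "\n".join(abstracts[i] for i in top)
prompt = (
    "Using only the context below, propose a testable hypothesis.\n\n"
    f"Context:\n{context}\n\nQuestion: {query}"
)
print(prompt)  # in a real system this prompt would be sent to an LLM agent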


We are open on compensation models, but compensation will be aligned to local norms. We would consider part-time or full-time.


 


 

Remote only
2 - 5 yrs
₹5L - ₹8L / yr
Python
NumPy
PyTorch
pandas
Data Visualization
+5 more

Python Developer

We are looking for an enthusiastic and skilled Python Developer with a passion for AI-based application development to join our growing technology team. This position offers the opportunity to work at the intersection of software engineering and data analytics, contributing to innovative AI-driven solutions that drive business impact. If you have a strong foundation in Python, a flair for problem-solving, and an eagerness to build intelligent systems, we would love to meet you!

Key Responsibilities

• Develop and deploy AI-focused applications using Python and associated frameworks.

• Collaborate with Developers, Product Owners, and Business Analysts to design and implement machine learning pipelines.

• Create interactive dashboards and data visualizations for actionable insights.

• Automate data collection, transformation, and processing tasks.

• Utilize SQL for data extraction, manipulation, and database management.

• Apply statistical methods and algorithms to derive insights from large datasets (a small sketch follows below).
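
As a small, hypothetical illustration of this kind of task (the column names and values are invented), a pandas/NumPy aggregation might look like:

# Tiny illustrative data-analysis sketch with pandas and NumPy.
import numpy as np
import pandas as pd

# In practice this might come from SQL; here we build a tiny frame inline.
df = pd.DataFrame(
    {
        "region": ["north", "south", "north", "south", "north"],
        "invoice_amount": [120.0, 85.5, 230.0, 99.0, 150.0],
    }
)

summary = df.groupby("region")["invoice_amount"].agg(["count", "mean", "sum"])
overall_std = np.std(df["invoice_amount"].to_numpy(), ddof=1)

print(summary)
print(f"overall std dev: {overall_std:.2f}")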

Required Skills and Qualifications

• 2–3 years of experience as a Python Developer, with a strong portfolio of relevant projects.

• Bachelor’s degree in Computer Science, Data Science, or a related technical field.

• In-depth knowledge of Python, including frameworks and libraries such as NumPy, Pandas, SciPy, and PyTorch.

• Proficiency in front-end technologies like HTML, CSS, and JavaScript.

• Familiarity with SQL and NoSQL databases and their best practices.

• Excellent communication and team-building skills.

• Strong problem-solving abilities with a focus on innovation and self-learning.

• Knowledge of cloud platforms such as AWS is a plus.

Additional Requirements

This opportunity enhances your work-life balance with an allowance for remote work.


To be successful your computer hardware and internet must meet these minimum requirements:

1. Laptop or Desktop:
  • Operating System: Windows
  • Screen Size: 14 inches
  • Screen Resolution: FHD (1920×1080)
  • Processor: i5 or higher
  • RAM: Minimum 8 GB (must)
  • Type: Windows laptop
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher


About ARDEM

ARDEM is a leading Business Process Outsourcing and Business Process Automation service provider. For over twenty years, ARDEM has successfully delivered business process outsourcing and business process automation services to our clients in the USA and Canada. We are growing rapidly. We are constantly innovating to become a better service provider for our customers. We continuously strive for excellence to become the best Business Process Outsourcing and Business Process Automation company.

Deqode

Posted by Roshni Maji
Remote only
5 - 7 yrs
₹12L - ₹16L / yr
Python
Google Cloud Platform (GCP)
SQL
PySpark
Data Transformation Tool (DBT)
+2 more

Role: GCP Data Engineer

Notice Period: Immediate Joiners

Experience: 5+ years

Location: Remote

Company: Deqode


About Deqode

At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.


Key Responsibilities

  • Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
  • Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
  • Implement clean, reusable transformations using DBT and Databricks.
  • Design and schedule workflows using Apache Airflow (see the sketch after this list).
  • Collaborate with data scientists and analysts to ensure downstream data usability.
  • Optimize pipelines and systems for performance and cost-efficiency.
  • Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
  • Manage and troubleshoot data workflows in Linux environments.
  • Apply data governance and access control via Unity Catalog or similar tools.
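
A rough, hypothetical sketch of such an Airflow DAG follows (Airflow 2.x style; the DAG id, schedule, and task bodies are placeholders, not Deqode's actual pipelines):

# Illustrative Airflow 2.x DAG with three placeholder tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pull raw files from Cloud Storage")  # placeholder for GCS client code

def transform() -> None:
    print("run PySpark / DBT transformations")  # placeholder

def load() -> None:
    print("load curated tables into BigQuery")  # placeholder

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3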


Required Skills & Experience

  • Strong hands-on experience with PySpark, Spark SQL, and Databricks.
  • Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
  • Proficiency in Python for scripting and automation.
  • Expertise in SQL and data modeling.
  • Experience with DBT for data transformations.
  • Working knowledge of Airflow for workflow orchestration.
  • Comfortable with Linux-based systems for deployment and troubleshooting.
  • Familiar with Git for version control and collaborative development.
  • Understanding of data pipeline optimization, monitoring, and debugging.
Deltek
Posted by shwetha V
Remote only
8 - 12 yrs
Best in industry
Python
.NET
Apache Airflow
React.js
JavaScript
+4 more

Title - Principal Software Engineer

Company Summary :

As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

Business Summary :

The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

Principal Software Engineer

Position Responsibilities :

  • Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
  • Develop scalable, performant APIs for Deltek products
  • Accountability for the successful implementation of the requirements by the team.
  • Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
  • Undertake analysis, design, coding and testing activities of complex modules
  • Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
  • Participate in code reviews and provide mentorship to junior developers.
  • Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React, and suggest optimisations based on them
  • Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
  • Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
  • Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.

Qualifications :

  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 8-10 years of experience with sound programming skills in Python, the .Net platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (like PostgreSQL)
  • Experience in backend development and Apache Airflow (or equivalent framework).
  • Build APIs and optimize SQL queries with performance considerations.
  • Experience with Agile Development
  • Experience in writing and maintaining unit tests and using testing frameworks is desirable
  • Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
  • Strong desire to continually improve knowledge and skills through personal development activities and apply their knowledge and skills to continuous software improvement.
  • The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
  • Strong problem-solving and debugging skills.
  • Ability to work in an Agile environment and collaborate with cross-functional teams.
  • Familiarity with version control systems like Git.
  • Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.

Deltek
Posted by shwetha V
Remote only
4 - 7 yrs
Best in industry
Python
.NET
Java
Apache Airflow
TypeScript
+6 more

Title - Sr Software Engineer

Company Summary :


As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com


Business Summary :


The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

External Job Title :


Sr Software Engineer

Position Responsibilities :


  • Develop and manage integrations with third-party services and APIs using industry-standard protocols like OAuth2 for secure authentication and authorization.
  • Develop scalable, performant APIs for Deltek products
  • Accountability for the successful implementation of the requirements by the team.
  • Troubleshoot, debug, and optimize code and workflows for better performance and scalability.
  • Undertake analysis, design, coding and testing activities of complex modules
  • Support the company’s development processes and development guidelines including code reviews, coding style and unit testing requirements.
  • Participate in code reviews and provide mentorship to junior developers.
  • Stay up-to-date with emerging technologies and best practices in Python development, AWS, and frontend frameworks like React.
  • Adopt industry best practices in all your projects - TDD, CI/CD, Infrastructure as Code, linting
  • Pragmatic enough to deliver an MVP, but aspirational enough to think about how it will work with millions of users and adapt to new challenges
  • Readiness to hit the ground running – you may not know how to solve everything right off the bat, but you will put in the time and effort to understand so that you can design architecture of complex features with multiple components.


Qualifications :


  • A college degree in Computer Science, Software Engineering, Information Science or a related field is required 
  • Minimum 4-6 years of experience with sound programming skills in Python, the .Net platform (VB & C#), TypeScript/JavaScript, frontend technologies like React.js/Ember.js, and SQL databases (like PostgreSQL)
  • Experience in backend development and Apache Airflow (or equivalent framework).
  • Build APIs and optimize SQL queries with performance considerations.
  • Experience with Agile Development
  • Experience in writing and maintaining unit tests and using testing frameworks is desirable
  • Exposure to Amazon Web Services (AWS) technologies, Terraform, Docker is a plus
  • Strong desire to continually improve knowledge and skills through personal development activities and apply their knowledge and skills to continuous software improvement.
  • The ability to work under tight deadlines, tolerate ambiguity and work effectively in an environment with multiple competing priorities.
  • Strong problem-solving and debugging skills.
  • Ability to work in an Agile environment and collaborate with cross-functional teams.
  • Familiarity with version control systems like Git.
  • Excellent communication skills and the ability to work effectively in a remote or hybrid team setting.
ZeMoSo Technologies

Agency job
via Devseekerz by Sakthi Ganesh
Remote only
4 - 12 yrs
₹22L - ₹36L / yr
Python
Data Analytics
Data Science
Machine Learning (ML)

● Candidate should have Hands-on development experience as Data Analyst and/or ML Engineer.

● Candidate must have Coding experience in Python.

● Candidate should have Good Experience with ML models and ML algorithms.

● Need Experience with statistical modelling of large data sets.

● Looking for Immediate joiners or max. 30 days of Notice Period candidates.

● Candidates based out of these locations - Bangalore, Pune, Hyderabad, Mumbai - will be preferred.


What You will do:

● Play the role of Data Analyst / ML Engineer

● Collection, cleanup, exploration and visualization of data

● Perform statistical analysis on data and build ML models

● Implement ML models using some of the popular ML algorithms

● Use Excel to perform analytics on large amounts of data

● Understand, model and build to bring actionable business intelligence out of data that is available in different formats

● Work with data engineers to design, build, test and monitor data pipelines for ongoing business operations

 

Basic Qualifications:

● Experience: 4+ years.

● Hands-on development experience playing the role of Data Analyst and/or ML Engineer.

● Experience in working with excel for data analytics

● Experience with statistical modelling of large data sets

● Experience with ML models and ML algorithms

● Coding experience in Python

 

Nice to have Qualifications:

● Experience with wide variety of tools used in ML

● Experience with Deep learning

 

Benefits:

● Competitive salary.

● Hybrid work model.

● Learning and gaining experience rapidly.

● Reimbursement for basic working set up at home.

● Insurance (including a top up insurance for COVID).

Client located in Bangalore

Agency job
Remote only
4 - 12 yrs
₹30L - ₹60L / yr
Large Language Models (LLM)
Deep Learning
Machine Learning (ML)
Python
Healthcare
+2 more

Experience:

  • Junior Level: 4+ years
  • Senior Level: 8+ years

Work Mode: Remote

About the Role:

We are seeking a highly skilled and motivated Data Scientist with deep expertise in Machine Learning (ML), Deep Learning, and Large Language Models (LLMs) to join our forward-thinking AI & Data Science team. This is a unique opportunity to contribute to real-world impact in the healthcare industry, transforming the way patients and providers interact with health data through Generative AI and NLP-driven solutions.

Key Responsibilities:

  • LLM Development & Fine-Tuning: Fine-tune and customize LLMs (e.g., GPT, LLaMA2, Mistral) for use cases such as text classification, NER, summarization, Q&A, and sentiment analysis (a brief sketch follows this list). Experience with other transformer-based models (e.g., BERT) is a plus.
  • Data Engineering & Pipeline Design: Collaborate with data engineering teams to build scalable, high-quality data pipelines for training/fine-tuning LLMs on structured and unstructured healthcare datasets.
  • Experimentation & Evaluation: Design rigorous model evaluation and testing frameworks (e.g., with tools like TruLens) to assess performance and optimize model outcomes.
  • Deployment & MLOps Integration: Work closely with MLOps teams to ensure seamless integration of models into production environments on cloud platforms (AWS, Azure, GCP).
  • Predictive Modeling in Healthcare: Apply ML/LLM techniques to build predictive models for use cases in oncology (e.g., survival analysis, risk prediction, RWE generation).
  • Cross-functional Collaboration: Engage with domain experts, product managers, and clinical teams to translate healthcare challenges into actionable AI solutions.
  • Mentorship & Knowledge Sharing: Mentor junior team members and contribute to the growth of the team’s technical expertise.
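
Purely as an illustration of the NLP tasks mentioned above (default Hugging Face models, an invented non-PHI note, and not this team's production setup):

# Illustrative sketch: summarization and NER with Hugging Face pipelines.
from transformers import pipeline

note = (
    "Patient reports persistent cough for three weeks. Chest imaging shows a "
    "small nodule in the right lower lobe. Plan: follow-up CT in three months."
)

summarizer = pipeline("summarization")
ner = pipeline("ner", aggregation_strategy="simple")

print(summarizer(note, max_length=30, min_length=5)[0]["summary_text"])
for entity in ner(note):
    print(entity["entity_group"], "->", entity["word"])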

Qualifications:

  • Master’s or Doctoral degree in Computer Science, Data Science, Artificial Intelligence, or related field.
  • 5+ years of hands-on experience in machine learning and deep learning, with at least 12 months of direct work on LLMs.
  • Strong coding skills in Python, with experience in libraries like HuggingFace Transformers, spaCy, NLTK, TensorFlow, or PyTorch.
  • Experience with prompt engineering, RAG pipelines, and evaluation techniques in real-world NLP deployments.
  • Hands-on experience in deploying models on cloud platforms (AWS, Azure, or GCP).
  • Familiarity with the healthcare domain and working on Real World Evidence (RWE) datasets is highly desirable.

Preferred Skills:

  • Strong understanding of healthcare data regulations (HIPAA, PHI handling, etc.)
  • Prior experience in speech and text-based AI applications
  • Excellent communication and stakeholder engagement skills
  • A passion for impactful innovation in the healthcare space


Client based in Bangalore

Agency job
Remote only
8 - 12 yrs
₹24L - ₹30L / yr
Real World Evidence
RWE Analyst
Healthcare
Large Language Models (LLM)
SQL
+10 more

Real-World Evidence (RWE) Analyst

Summary:

As an experienced Real-World Evidence (RWE) Analyst, you will leverage our cutting-edge healthcare data platform (accessing over 60 million lives in Asia, with ambitious growth plans across Africa and the Middle East) to deliver impactful clinical insights to our pharmaceutical clients. You will be involved in the full project lifecycle, from designing analyses to execution and delivery, within our agile data science team. This is an exciting opportunity to contribute significantly to a growing early-stage company focused on improving precision medicine and optimizing patient care for diverse populations.

Responsibilities:

·      Contribute to the design and execution of retrospective and prospective real-world research, including epidemiological and patient outcomes studies.

·      Actively participate in problem-solving discussions by clearly defining issues and proposing effective solutions.

·      Manage the day-to-day progress of assigned workstreams, ensuring seamless collaboration with the data engineering team on analytical requests.

·      Provide timely and clear updates on project status to management and leadership.

·      Conduct in-depth quantitative and qualitative analyses, driven by project objectives and your intellectual curiosity.

·      Ensure the quality and accuracy of analytical outputs, and contextualize findings by reviewing relevant published research.

·      Synthesize complex findings into clear and compelling presentations and written reports (e.g., slides, documents).

·      Contribute to the development of standards and best practices for future RWE analyses.

Requirements:

·      Undergraduate or post-graduate degree (MS or PhD preferred) in a quantitative analytical discipline such as Epidemiology, (Bio)statistics, Data Science, Engineering, Econometrics, or Operations Research.

·      8+ years of relevant work experience demonstrating:

o  Strong analytical and problem-solving capabilities.

o  Experience conducting research relevant to the pharmaceutical/biotech industry.

·      Proficiency in technical skills including SQL and at least one programming language (R, Python, or similar).

·      Solid understanding of the healthcare/medical and pharmaceutical industries.

·      Proven experience in managing workstream or project management activities.

·      Excellent written and verbal communication, and strong interpersonal skills with the ability to build collaborative partnerships.

·      Exceptional attention to detail.

·      Proficiency in Microsoft Office Suite (Excel, PowerPoint, Word).

Other Desirable Skills:

·      Demonstrated dedication to teamwork and the ability to collaborate effectively across different functions.

·      A strong desire to contribute to the growth and development of the RWE analytics function.

·      A proactive and innovative mindset with an entrepreneurial spirit, eager to take on a key role in a dynamic, growing company.

CD Edverse

Posted by Ashish Yadav
Remote only
1 - 5 yrs
₹10L - ₹15L / yr
Python
LangChain
Machine Learning (ML)
Natural Language Processing (NLP)
AI Agents

Join CD Edverse, an innovative EdTech app, as an AI Specialist! Develop a deep research tool to generate comprehensive courses and enhance AI mentors. Must have strong Python, NLP, and API integration skills. Be part of transforming education! Apply now.

Deltek
Posted by Puja Rana
Remote only
4 - 6 yrs
₹12L - ₹18L / yr
Python

Position Responsibilities :

  • Work with product managers to understand the business workflows/requirements, identify needs gaps, and propose relevant technical solutions
  • Design, Implement & tune changes to the product that work within the time tracking/project management environment 
  • Be understanding and sensitive to customer requirements to be able to offer alternative solutions
  • Keep in pace with the product releases
  • Work within Deltek-Replicon's software development process, expectations and quality initiatives
  • Work to accurately evaluate risk and estimate software development tasks
  • Strive to continually improve technical and developmental skills

Qualifications :

  • Bachelor of Computer Science, Computer Engineering, or related field.
  • 4+ years of software development experience (Core: Python v2.7 or higher).
  • Strong Data structures, algorithm design, problem-solving, and Quantitative analysis skills.
  • Knowledge of how to use microservices and APIs in code.
  • TDD unit test framework knowledge (preferably Python).
  • Strong and well-versed with Git basic and advanced concepts and their respective commands and should be able to handle merge conflicts.
  • Must have basic knowledge of web development technologies and should have worked on any web development framework.
  • SQL queries working knowledge.
  • Basic operating knowledge in some kind of project management tool like Jira.
  • Good to have:- Knowledge of EmberJs, C#, and .Net framework.

Agivant Technologies


Agency job
via Vidpro Consultancy Services by ashik thahir
Remote only
5 - 10 yrs
₹18L - ₹25L / yr
Python
SQL
Airflow
Snowflake
Elastic Search
+3 more

Experience: 5-8 Years

Work Mode: Remote

Job Type: Fulltime

Mandatory Skills: Python, SQL, Snowflake, Airflow, ETL, Data Pipelines, Elastic Search, and AWS.


Role Overview:

We are looking for a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.


Responsibilities:

  • Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
  • Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
  • Build and optimize data pipelines using a variety of technologies, including Elastic Search, AWS S3, Snowflake, and NFS.
  • Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
  • Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
  • Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
  • Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum and emerging technologies in data engineering.
  • Contribute to the development and enhancement of our data warehouse architecture

Required Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
  • At least 3+ years of experience with Snowflake data warehousing technologies.
  • At least 3+ years of experience creating and maintaining Airflow ETL pipelines.
  • Minimum 3+ years of professional-level experience with Python for data manipulation and automation.
  • Working experience with Elastic Search and its application in data pipelines.
  • Proficiency in SQL and experience with data modelling techniques.
  • Strong understanding of cloud-based data storage solutions such as AWS S3.
  • Experience working with NFS and other file storage systems.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.


Incubyte

Posted by Sarika Shitole
Remote only
3 - 8 yrs
Best in industry
Python
React.js

About Us

We are a company where the ‘HOW’ of building software is just as important as the ‘WHAT.’ We partner with large organizations to modernize legacy codebases and collaborate with startups to launch MVPs, scale, or act as extensions of their teams. Guided by Software Craftsmanship values and eXtreme Programming Practices, we deliver high-quality, reliable software solutions tailored to our clients' needs.


We strive to:

  • Bring our clients' dreams to life by being their trusted engineering partners, crafting innovative software solutions.
  • Challenge offshore development stereotypes by delivering exceptional quality, and proving the value of craftsmanship.
  • Empower clients to deliver value quickly and frequently to their end users.
  • Ensure long-term success for our clients by building reliable, sustainable, and impactful solutions.
  • Raise the bar of software craft by setting a new standard for the community.

Job Description

This is a remote position.

Our Core Values


  • Quality with Pragmatism: We aim for excellence with a focus on practical solutions.  
  • Extreme Ownership: We own our work and its outcomes fully.  
  • Proactive Collaboration: Teamwork elevates us all.  
  • Pursuit of Mastery: Continuous growth drives us.  
  • Effective Feedback: Honest, constructive feedback fosters improvement.  
  • Client Success: Our clients’ success is our success. 


Experience Level


This role is ideal for engineers with 3+ years of hands-on software development experience, particularly in Python and ReactJS at scale.


Role Overview

If you’re a Software Craftsperson who takes pride in clean, test-driven code and believes in Extreme Programming principles, we’d love to meet you. At Incubyte, we’re a DevOps organization where developers own the entire release cycle, meaning you’ll get hands-on experience across programming, cloud infrastructure, client communication, and everything in between. Ready to level up your craft and join a team that’s as quality-obsessed as you are? Read on!   


What You'll Do

  • Write Tests First: Start by writing tests to ensure code quality 
  • Clean Code: Produce self-explanatory, clean code with predictable results 
  • Frequent Releases: Make frequent, small releases 
  • Pair Programming: Work in pairs for better results 
  • Peer Reviews: Conduct peer code reviews for continuous improvement 
  • Product Team: Collaborate in a product team to build and rapidly roll out new features and fixes 
  • Full Stack Ownership: Handle everything from the front end to the back end, including infrastructure and DevOps pipelines 
  • Never Stop Learning: Commit to continuous learning and improvement  





Requirements

What We're Looking For

  • Proficiency in some or all of the following: ReactJS, JavaScript, Object-Oriented Programming in JS
  • 3+ years of Object-Oriented Programming with Python or equivalent
  • 3+ years of experience working with relational (SQL) databases
  • 3+ years of experience using Git to contribute code as part of a team of Software Craftspeople




Benefits

What We Offer

  • Dedicated Learning & Development Budget: Fuel your growth with a budget dedicated solely to learning.
  • Conference Talks Sponsorship: Amplify your voice! If you’re speaking at a conference, we’ll fully sponsor and support your talk.
  • Cutting-Edge Projects: Work on exciting projects with the latest AI technologies
  • Employee-Friendly Leave Policy: Recharge with ample leave options designed for a healthy work-life balance.
  • Comprehensive Medical & Term Insurance: Full coverage for you and your family’s peace of mind.
  • And More: Extra perks to support your well-being and professional growth.

Work Environment 

  • Remote-First Culture: At Incubyte, we thrive on a culture of structured flexibility — while you have control over where and how you work, everyone commits to a consistent rhythm that supports their team during core working hours for smooth collaboration and timely project delivery. By striking the perfect balance between freedom and responsibility, we enable ourselves to deliver high-quality standards our customers recognize us by. With asynchronous tools and push for active participation, we foster a vibrant, hands-on environment where each team member’s engagement and contributions drive impactful results.
  • Work-In-Person: Twice a year, we come together for two-week sprints to collaborate in person, foster stronger team bonds, and align on goals. Additionally, we host an annual retreat to recharge and connect as a team. All travel expenses are covered.
  • Proactive Collaboration: Collaboration is central to our work. Through daily pair programming sessions, we focus on mentorship, continuous learning, and shared problem-solving. This hands-on approach keeps us innovative and aligned as a team.


OIP Insurtech

Posted by Katarina Vasic
Remote, Hyderabad
4 - 10 yrs
₹30L - ₹50L / yr
Python
Data extraction
Natural Language Processing (NLP)
TensorFlow
Large Language Models (LLM)
+1 more

What We’re Looking For


Proven experience as a Machine Learning Engineer, Data Scientist, or similar role


Expertise in applying machine learning algorithms, deep learning, and data mining techniques in an enterprise environment


Strong proficiency in Python (or other languages) and familiarity with libraries such as Scikit-learn, TensorFlow, PyTorch, or similar.


Experience working with natural language processing (NLP) or computer vision is highly desirable.


Understanding of and experience with MLOps, including model development, deployment, monitoring, and maintenance.


Experience with cloud platforms (like AWS, Google Cloud, or Azure) and knowledge of deploying machine learning models at scale.


Familiarity with data architecture, data engineering, and data pipeline tools.


Familiarity with containerization technologies such as Docker, and orchestration systems like Kubernetes.


Knowledge of the insurance sector is beneficial but not required.


Bachelor's/Master's degree in Computer Science, Data Science, Mathematics, or a related field.


What You’ll Be Doing

Algorithm Development:

Design and implement advanced machine learning algorithms tailored for our datasets.


Model Creation:

Build, train, and refine machine learning models for business integration.


Collaboration:

Partner with product managers, developers, and data scientists to align machine learning solutions with business goals.


Industry Innovation:

Stay updated with Insurtech trends and ensure our solutions remain at the forefront.


Validation:

Test algorithms for accuracy and efficiency, collaborating with the QA team.


Documentation:

Maintain clear records of algorithms and models for team reference.


Professional Growth:

Engage in continuous learning and mentor junior team members.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
4 - 8 yrs
₹20L - ₹35L / yr
Python
AWS
Amazon EC2
PostgreSQL
Service company preferred

Mandatory (Experience 1) - Must have a minimum of 4+ years of experience in backend software development.

Mandatory (Experience 2) - Must have 4+ years of experience in backend development using Python (highly preferred), Java, or Node.js.

Mandatory (Experience 3) - Must have experience with cloud platforms like AWS (highly preferred), GCP, or Azure.

Mandatory (Experience 4) - Must have experience with any of these databases - MySQL / PostgreSQL / Postgres / Oracle / SQL Server / DB2 / SQL / MongoDB / Ne

Read more
CLOUDSUFI

at CLOUDSUFI

3 recruiters
Ayushi Dwivedi
Posted by Ayushi Dwivedi
Remote only
6 - 13 yrs
₹35L - ₹45L / yr
Google Cloud Platform (GCP)
Machine Learning (ML)
Generative AI
Python
MLOps
+1 more

AI Architect


Location and Work Requirements

-      Position is based in KSA or UAE

-      Must be eligible to work abroad without restrictions

-      Regular travel within the region required


Key Responsibilities

-      Minimum 7+ years of experience in the Data & Analytics domain and a minimum of 2 years as an AI Architect

-      Drive technical solution design engagements and implementations

-      Support customer implementations across various deployment modes (Public SaaS, Single-Tenant SaaS, and Self-Managed Kubernetes)

-      Provide advanced technical support, including deployment troubleshooting and coordinating with customer AI Architect and product development teams when needed

-      Guide customers in implementing generative AI solutions, including LLM integration, vector database management, and prompt engineering

-      Coordinate and oversee platform installations and configuration work

-      Assist customers with platform integration, including API implementation and custom model deployment

-      Establish and promote best practices for AI governance and MLOps

-      Proactively identify and address potential technical challenges before they impact customer success


Required Technical Skills

-      Strong programming skills in Python with experience in data processing libraries (Pandas, NumPy)

-      Proficiency in SQL and experience with various database technologies including MongoDB

-      Container technologies: Docker (build, modify, deploy) and Kubernetes (kubectl, helm)

-      Version control systems (Git) and CI/CD practices

-      Strong networking fundamentals (TCP/IP, SSH, SSL/TLS)

-      Shell scripting (Linux/Unix environments)

-      Experience working in on-prem, air-gapped environments

-      Experience with cloud platforms (AWS, Azure, GCP)


Required AI/ML Skills

-      Deep expertise in both predictive machine learning and generative AI technologies

-      Proven experience implementing and operationalizing large language models (LLMs)

-      Strong knowledge of vector databases, embedding technologies, and similarity search concepts

-      Advanced understanding of prompt engineering, LLM evaluation, and AI governance methods

-      Practical experience with machine learning deployment and production operations

-      Understanding of AI safety considerations and risk mitigation strategies



Required Qualities

-      Excellent English communication skills with the ability to explain complex technical concepts. Arabic is advantageous.

-      Strong consultative approach to understanding and solving business problems

-      Proven ability to build trust through proactive customer engagement

-      Strong problem-solving abilities and attention to detail

-      Ability to work independently and as part of a distributed team

-      Willingness to travel within the Middle East & Africa region as needed 

Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
6 - 9 yrs
Best in industry
Python
Java
JavaScript
Locust
Gatling
+14 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for a Lead SDET with 8-10 years of experience to play a pivotal role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest models and solutions. This is an exciting opportunity to work on cutting-edge performance testing strategies and drive impactful initiatives across the organisation.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Review application architecture and suggest improvements to enhance scalability
  • Leverage AI at appropriate layers to improve efficiency and drive positive business outcomes
  • Drive performance testing initiatives across the organization and ensure seamless execution
  • Automate the capturing of performance metrics and generate performance trend reports
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Collaborate with developers and architects to enhance frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Ensure high availability and reliability of applications and services


Requirements:

  • 6-9 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc. (a minimal Locust sketch follows this list)
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimizing frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively
  • Experience in increasing application/service availability from 99.9% (three 9s) to 99.99% or higher (four/five 9s)
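
For context on the tooling named above, here is a minimal, hypothetical Locust sketch; the host, endpoint paths, and run parameters are placeholders for illustration, not actual HighLevel APIs or targets.

# locustfile.py -- a minimal load-test sketch using Locust (one of the tools listed above).
# The endpoints and host below are placeholders.
from locust import HttpUser, task, between


class ApiUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between requests

    @task(3)
    def list_contacts(self):
        self.client.get("/api/contacts")  # hypothetical read-heavy endpoint

    @task(1)
    def health_check(self):
        self.client.get("/health")


# Example headless run against an assumed staging host:
#   locust -f locustfile.py --headless --host https://staging.example.com --users 100 --spawn-rate 10 --run-time 5m
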


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Eman Khan
Posted by Eman Khan
Remote, Delhi
4 - 7 yrs
Best in industry
Python
Java
Locust
Gatling
K6
+10 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website: https://www.gohighlevel.com/

YouTube Channel: https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post: https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

HighLevel Inc. is looking for a SDET III with 5-6 years of experience to play a crucial role in ensuring the quality, performance, and scalability of our products. We are seeking engineers who thrive in a fast-paced startup environment, enjoy problem-solving, and stay updated with the latest models and solutions. This is a great opportunity to work on cutting-edge performance testing strategies and contribute to the success of our products.


Responsibilities:

  • Implement performance, scalability, and reliability testing strategies
  • Capture and analyze key performance metrics to identify bottlenecks
  • Work closely with development, DevOps, and infrastructure teams to optimize system performance
  • Develop test strategies based on customer behavior to ensure high-performing applications
  • Automate the capturing of performance metrics and generate performance trend reports
  • Collaborate with developers and architects to optimize frontend and API performance
  • Conduct root cause analysis of performance issues using logs and monitoring tools
  • Research, evaluate, and conduct PoCs for new tools and solutions
  • Ensure high availability and reliability of applications and services


Requirements:

  • 4-7 years of hands-on experience in Performance, Reliability, and Scalability testing
  • Strong skills in capturing, analyzing, and optimizing performance metrics
  • Expertise in performance testing tools such as Locust, Gatling, k6, etc.
  • Experience working with cloud platforms (Google Cloud, AWS, Azure) and setting up performance testing environments
  • Knowledge of CI/CD deployments and integrating performance testing into pipelines
  • Proficiency in scripting languages (Python, Java, JavaScript) for test automation
  • Hands-on experience with monitoring and observability tools (New Relic, AppDynamics, Prometheus, etc.)
  • Strong knowledge of JVM monitoring, thread analysis, and RESTful services
  • Experience in optimizing frontend performance and API performance
  • Ability to deploy applications in Kubernetes and troubleshoot environment issues
  • Excellent problem-solving skills and the ability to troubleshoot customer issues effectively


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Nikita Sinha
Posted by Nikita Sinha
Remote, Delhi
4 - 6 yrs
Upto ₹34L / yr (Varies)
Prometheus
Grafana
ELK
Kubernetes
Terraform
+6 more

We are looking for a Site Reliability Engineer to join our team and help ensure the availability, performance, and scalability of our critical systems. You will work closely with development and operations teams to automate processes, enhance system reliability, and improve observability.


Requirements:

  • Experience: 4+ years in Site Reliability Engineering, DevOps, or Cloud Infrastructure roles
  • Cloud Expertise: Hands-on experience with GCP and AWS
  • Infrastructure as Code (IaC): Terraform, Helm, or equivalent tools
  • Containerisation & Orchestration: Docker, Kubernetes (GKE)
  • Observability: Experience with Prometheus, Grafana, ELK, OpenTelemetry, or similar monitoring/logging tools
  • Programming/Scripting: Proficiency in Python, Bash, or Shell scripting. Basic understanding of API parsing and JSON manipulation
  • CI/CD Pipelines: Hands-on experience with Jenkins, GitHub Actions, ArgoCD, or similar tools
  • Incident Management: Experience with on-call rotations, SLOs, SLIs, SLAs, Escalation Policies, and incident resolution
  • Databases: Experience monitoring MongoDB, Redis, Elasticsearch, queue-based systems, etc.


Responsibilities:

  • Develop and improve observability using monitoring, logging, tracing, and alerting tools (Prometheus, Grafana, ELK, OpenTelemetry, etc.); a minimal instrumentation sketch follows this list
  • Optimize system performance, troubleshoot incidents, and conduct post-mortems/RCA to prevent future issues
  • Collaborate with developers to enhance application reliability, scalability, and performance
  • Drive cost optimisation efforts in cloud environments.
  • Monitor multiple databases (MongoDB, Redis, Elasticsearch, queue-based systems, etc.)
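
As a small illustration of the observability work above, here is a hypothetical Python sketch using the prometheus_client library; the metric names and the simulated workload are made up for the example and are not part of the role.

# metrics_sketch.py -- exposes Counter and Histogram metrics for Prometheus to scrape.
# Metric names and the fake workload are illustrative only.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("worker_requests_total", "Requests processed", ["status"])
LATENCY = Histogram("worker_request_seconds", "Request processing time in seconds")


def handle_request() -> None:
    with LATENCY.time():                       # record how long the "work" takes
        time.sleep(random.uniform(0.01, 0.2))  # stand-in for real work
    status = "ok" if random.random() > 0.05 else "error"
    REQUESTS.labels(status=status).inc()


if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        handle_request()
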
Read more
CLOUDSUFI

at CLOUDSUFI

3 recruiters
Ayushi Dwivedi
Posted by Ayushi Dwivedi
Remote only
7 - 16 yrs
₹40L - ₹50L / yr
Generative AI
Machine Learning (ML)
Google Cloud Platform (GCP)
MLOps
Python
+3 more

Role - AI Architect

Location - Noida/Remote

Mode - Hybrid - 2 days WFO


As an AI Architect at CLOUDSUFI, you will play a key role in driving our AI strategy for customers in the Oil & Gas, Energy, Manufacturing, Retail, Healthcare, and Fintech sectors. You will be responsible for delivering large-scale AI transformation programs for multinational organizations, preferably Fortune 500 companies. You will also lead a team of 10-25 Data Scientists to ensure successful project execution.


Required Experience

● Minimum 12+ years of experience in the Data & Analytics domain and a minimum of 3 years as an AI Architect

● Master’s or Ph.D. in a discipline such as Computer Science, Statistics or Applied Mathematics with an emphasis or thesis work on one or more of the following: deep learning, machine learning, Generative AI and optimization.

● Must have experience articulating and presenting business transformation journeys using AI / Gen AI technology to C-level executives

● Proven experience in delivering large-scale AI and GenAI transformation programs for multinational organizations

● Strong understanding of AI and GenAI algorithms and techniques

● Must have hands-on experience in open-source software development and cloud-native technologies, especially on the GCP tech stack

● Proficiency in Python and prominent ML packages. Proficiency in neural networks is desirable, though not essential

● Experience leading and managing teams of Data Scientists, Data Engineers and Data Analysts

● Ability to work independently and as part of a team


Additional Skills (Preferred):

● Experience in the Oil & Gas, Energy, Manufacturing, Retail, Healthcare, or Fintech sectors

● Knowledge of cloud platforms (AWS, Azure, GCP)

● GCP Professional Cloud Architect and GCP Professional Machine Learning Engineer Certification

● Experience with AI frameworks and tools (TensorFlow, PyTorch, Keras)

Read more
The Blue Owls Solutions

at The Blue Owls Solutions

2 candid answers
Apoorvo Chakraborty
Posted by Apoorvo Chakraborty
Remote only
2 - 5 yrs
₹10L - ₹15L / yr
Python
FastAPI
Generative AI
Cloud Computing

Job Type: Full-time

Location: Remote


Company Description

The Blue Owls Solutions specializes in delivering cutting-edge Generative AI Solutions, AI-Powered Software Development, and comprehensive Data Analytics and Engineering services. Our expertise in End-To-End ML/AI Development ensures that clients benefit from scalable and efficient AI-driven solutions tailored to their unique business needs. We create intelligent voice and text agents, chatbots, and process automation solutions, and our data analytics services provide actionable insights for strategic decision-making. Our mission is to bridge the gap between AI innovation and adoption, delivering value-driven, outcome-based solutions that empower our clients to achieve their business goals.


Role Description

We're seeking an enthusiastic Backend Developer who thrives on solving interesting challenges and building reliable, efficient applications. While basic competency in frontend (React) is sufficient, strong backend skills (Python, FastAPI, SQL, pandas) and cloud-native awareness are essential. The ideal candidate enjoys learning new tech stacks and solving problems independently.


Required Skills (In order of importance)

  • Strong proficiency in Python backend development with FastAPI.
  • Familiarity with data analysis using pandas, numpy, and SQL.
  • Familiarity with cloud-native concepts and containerization (Docker).
  • Basic React skills for frontend integration.
  • Excellent problem-solving skills, adaptability, and quick learning abilities.
  • Experience with version control systems (e.g., Git)


Preferred Qualifications:

  • 3+ years of experience as a Backend Engineer
  • Experience with PostgreSQL or other relational databases.
  • Azure Cloud Experience.
  • Experience writing clean, maintainable, and testable code.
  • Experience in AI/ML development is a plus


Why Join Us?

  • Collaborative, remote-first environment.
  • Opportunities for rapid career growth and learning.
  • Competitive Pay.
  • Engaging projects focused on practical problem-solving.
Read more
MyOperator - VoiceTree Technologies

at MyOperator - VoiceTree Technologies

1 video
2 recruiters
Vijay Muthu
Posted by Vijay Muthu
Remote only
2 - 4 yrs
₹5L - ₹7L / yr
Java
Python
Selenium
pytest
Cucumber
+14 more

About Us:

Heyo & MyOperator are India’s largest conversational platforms, delivering Call + WhatsApp engagement solutions to over 40,000+ businesses. Trusted by brands like Astrotalk, Lenskart, and Caratlane, we power customer engagement at scale. We support a hybrid work model, foster a collaborative environment, and offer fast-track growth opportunities.


Job Overview:

We are looking for a skilled Quality Analyst with 2-4 years of experience in software quality assurance. The ideal candidate should have a strong understanding of testing methodologies, automation tools, and defect tracking to ensure high-quality software products. This is a fully remote role.


Key Responsibilities:

● Develop and execute test plans, test cases, and test scripts for software products.

● Conduct manual and automated testing to ensure reliability and performance.

● Identify, document, and collaborate with developers to resolve defects and issues.

● Report testing progress and results to stakeholders and management.

● Improve automation testing processes for efficiency and accuracy.

● Stay updated with the latest QA trends, tools, and best practices.


Required Skills:

● 2-4 years of experience in software quality assurance.

● Strong understanding of testing methodologies and automated testing.

● Proficiency in Selenium, Rest Assured, Java, and API Testing (mandatory).

● Familiarity with Appium, JMeter, TestNG, defect tracking, and version control tools.

● Strong problem-solving, analytical, and debugging skills.

● Excellent communication and collaboration abilities.

● Detail-oriented with a commitment to delivering high-quality results.


Why Join Us?

● Fully remote work with flexible hours.

● Exposure to industry-leading technologies and practices.

● Collaborative team culture with growth opportunities.

● Work with top brands and innovative projects.

Read more
Galvix

at Galvix

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Remote only
4 - 7 yrs
Upto ₹20L / yr (Varies)
Python
Java
Selenium
Play Framework
Cypress

Responsibilities:

  • Test Planning & Execution: Develop and execute test plans, test cases, and test scripts to verify software functionality, performance, and scalability.
  • Collaboration: Work closely with cross-functional teams (developers, product managers, designers) to understand requirements and provide input on testability and quality aspects.
  • Testing: Conduct both manual and automated testing to validate software functionality and identify potential issues.
  • Regression Testing: Perform regression testing to ensure the stability of new features and enhancements.
  • Bug Identification: Identify, isolate, and document bugs, issues, and defects, and collaborate with the development team to resolve them.
  • Code Reviews: Participate in code reviews and contribute to improving software quality and testing processes.
  • Industry Awareness: Stay updated with industry best practices and emerging trends in QA and testing methodologies.

Requirements:

  • Educational Qualification: Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Experience: 4+ years as a QA Lead Engineer or similar role, preferably in a SaaS or software development environment.
  • QA Expertise: Strong knowledge of software QA methodologies, tools, and processes.
  • Test Planning & Execution: Experience in test planning, test case development, and execution.
  • Testing Frameworks: Familiarity with manual and automated testing frameworks and tools.
  • Automation Skills: Proficiency in at least one programming or scripting language (e.g., Python, JavaScript) for test automation.
  • Technical Knowledge: Solid understanding of web technologies, APIs, and databases.
  • Problem-Solving: Excellent analytical and problem-solving skills with keen attention to detail.
  • Communication: Strong communication skills and ability to work effectively in a remote team environment.

Preferred Qualifications:

  • Test Automation Frameworks: Experience with tools like Playwright, Selenium, or Cypress.
  • Performance Testing: Knowledge of performance testing and load testing methodologies.
  • Agile Methodologies: Familiarity with agile development methodologies (e.g., Scrum) and experience in an Agile/DevOps environment.

Note:

  • This is a remote-only position for candidates based in India.


Read more
OpenIAM

at OpenIAM

4 candid answers
Nikita Sinha
Posted by Nikita Sinha
Remote only
4 - 8 yrs
Upto ₹35L / yr (Varies)
Identity management
Java
Python
Okta
SailPoint
+2 more

Key Responsibilities:


IAM Solution Implementation: Lead and execute OpenIAM deployments at enterprise clients, including integration with directories, databases, applications, and cloud platforms.

Identity Governance & Administration (IGA): Implement access reviews, role-based access control (RBAC), and identity lifecycle management to help clients enforce security policies and regulatory compliance.

Consultative Engagement: Work closely with clients to capture requirements, understand business challenges, and design IAM & Identity Governance solutions that align with security and compliance needs.

Architecture & Design: Develop IAM and IGA architectures tailored to customer environments, leveraging best practices from previous IAM implementations.

Configuration & Customization: Configure OpenIAM components, develop custom workflows, and implement automation for identity lifecycle management and governance processes.

Customer Collaboration: Guide clients through workshops, requirement sessions, and technical discussions, ensuring a smooth implementation process.

Technical Troubleshooting: Diagnose and resolve issues related to authentication, authorization, provisioning, governance, and access controls.

Documentation: Create high-quality documentation, including design documents, implementation guides, and customer-facing reports.

Mentorship & Best Practices: Share IAM & IGA best practices with clients and internal teams, mentoring junior engineers when needed.


Required Skills & Experience:


• 4+ years of hands-on IAM experience, implementing solutions from major vendors such as Okta, SailPoint, Saviynt, ForgeRock, Oracle IAM, Ping Identity, or similar.

• Strong understanding of IAM and Identity Governance concepts, including:

Access certification and review processes

Role-based access control (RBAC) and attribute-based access control (ABAC)

Identity lifecycle management and policy enforcement

Separation of duties (SoD) controls and compliance

• Experience working with LDAP directories (OpenLDAP, Active Directory) and database systems (PostgreSQL, MySQL, or similar).

• Proficiency in Linux administration, shell scripting, and troubleshooting IAM-related issues in Linux environments.

• Hands-on experience with Java, JavaScript, and Python for custom development, scripting, or integrations.

• Knowledge of REST APIs, SCIM, SAML, OIDC, and FIDO2.

• Strong problem-solving skills and ability to work independently in a fast-paced consulting environment.

• Excellent communication and interpersonal skills, with the ability to work directly with clients in a consultative manner.

• Strong documentation skills to produce high-quality technical reports and client deliverables.


Preferred Qualifications:


• Prior experience deploying IAM & IGA solutions in cloud environments (AWS, Azure, GCP).

• Knowledge of Kubernetes and containerized applications.

• Experience integrating IAM with enterprise applications such as ServiceNow, Workday, Salesforce, or SAP.

• Previous consulting experience working with enterprise customers.

Read more
Logistic Infotech
Remote only
3 - 6 yrs
₹10L - ₹15L / yr
Artificial Intelligence (AI)
TensorFlow
Generative AI
Chatbot
Retrieval Augmented Generation (RAG)
+3 more

Role: Lead AI Engineer

Exp: 3-6 Years

CTC: 35.00-40.00 LPA

Work Mode: WFH


Mandatory Criteria (cannot be overlooked during screening):

• Excellent communication skills are needed, as the company also deals with US clients

• 3+ years in AI development, with experience in multi-agent systems, logistics, or related fields.

• Proven experience in conducting A/B testing and beta testing for AI systems.

• Hands-on experience with CrewAI and LangChain tools.

• Should have hands-on experience working with end-to-end chatbot development, specifically with Agentic and RAG-based chatbots. It is essential that the candidate has been involved in the entire lifecycle of chatbot creation, from design to deployment.

• Should have practical experience with LLM application deployment.

• Proficiency in Python and machine learning frameworks (e.g., TensorFlow, PyTorch).

• Experience in setting up monitoring dashboards with tools like Grafana, Tableau, or similar.

• Proficiency with cloud platforms (AWS, Azure, GCP)



Read more
Adesso India

Adesso India

Agency job
via HashRoot by Deepak S
Remote only
4 - 15 yrs
₹8L - ₹25L / yr
Python
SQL
MongoDB
BigQuery
Java

Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Comprised of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities:

Development and maintenance of data pipelines and automation scripts with Python.

Creation of data queries and optimization of database processes with SQL.

Use of bash scripts for system administration, automation and deployment processes.

Database and cloud technologies.

Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake).

Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:

Composer (Airflow): Orchestration of data pipelines for ETL processes (a minimal DAG sketch follows this section).

Cloud Functions: Development of serverless functions for data processing and automation.

Cloud Scheduler: Planning and automation of recurring cloud jobs.

Cloud Secret Manager: Secure storage and management of sensitive access data and API keys.

BigQuery: Processing, analyzing and querying large amounts of data in the cloud.

Cloud Storage: Storage and management of structured and unstructured data.

Cloud monitoring: monitoring the performance and stability of cloud-based applications.

Data visualization and reporting.

Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI.
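
To illustrate the Composer (Airflow) orchestration mentioned above, here is a minimal, hypothetical DAG sketch in Python; the DAG id, task logic, and target systems (BigQuery, Exasol, Snowflake) are placeholders for the example, not the client's actual pipeline.

# etl_sketch.py -- a minimal Airflow DAG (Airflow 2.4+ assumed); ids and logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stand-in for pulling raw data, e.g. from Cloud Storage or an API.
    return [{"id": 1, "amount": 42.0}]


def load(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    # Stand-in for writing to BigQuery / Exasol / Snowflake.
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
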


Requirements:

Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.

Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.

Experience of a minimum of 2 years with at least one hyperscaler, ideally GCP.

Combined with cloud storage technologies, cloud monitoring, and cloud secret management.

Excellent communication skills to effectively collaborate with team members and stakeholders.


Nice-to-Have:

 

Knowledge of agile methodologies and working in cross-functional, collaborative teams.

Skills & Requirements

SQL, BigQuery, GCP, Python, MongoDB, Exasol, Snowflake, Bash scripting, Airflow, Cloud Functions, Cloud Scheduler, Cloud Secret Manager, Cloud Storage, Cloud Monitoring, ETL, Data Pipelines, Power BI, Database Optimization, Cloud-Based BI Solutions, Data Processing, Data Automation, Agile Methodologies, Cross-Functional Collaboration.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
3 - 6 yrs
₹30L - ₹40L / yr
Artificial Intelligence (AI)
Python
CrewAI
LangChain
A/B Testing
+4 more

Excellent communication skills are needed, as the company also deals with US clients

• 3+ years in AI development, with experience in multi-agent systems, logistics, or related fields.

• Proven experience in conducting A/B testing and beta testing for AI systems.

• Hands-on experience with CrewAI and LangChain tools.

• Should have hands-on experience working with end-to-end chatbot development, specifically with Agentic and RAG-based chatbots. It is essential that the candidate has been involved in the entire lifecycle of chatbot creation, from design to deployment.

• Should have practical experience with LLM application deployment.


• Proficiency in Python and machine learning frameworks (e.g., TensorFlow, PyTorch).


• Experience in setting up monitoring dashboards with tools like Grafana, Tableau, or similar.


• Proficiency with cloud platforms (AWS, Azure, GCP)

Read more
Remote only
5 - 10 yrs
₹8L - ₹10L / yr
React.js
Django
Python
PostgreSQL
Cloud Computing
+1 more

What You'll Do:

  • Design, architect, and build the core AI-powered application from the ground up.
  • You will join a small team with an unparalleled 'fire in the belly' to close deliverables and a "hungry for more" attitude.
  • Collaborate with visionary founders to define the product roadmap and bring innovative ideas to life.
  • Leverage your expertise in Azure Cloud, full-stack development, and automation to develop scalable, secure, and high-performing solutions.
  • Take ownership of the technical stack, system integrations, and deployment pipelines.
  • Establish best development, testing, and deployment practices to ensure the product's success in a competitive market.

Who You Are:

  • Proficient in full-stack technologies, including React.js, Node.js, Python/Django, and similar frameworks.
  • Tech leader who thrives in startup environments and is fully hands-on in coding and working under minimal supervision.
  • Highly results-driven (very important).
  • 8+ years of proven experience in the technology product space, including building and scaling applications.
  • Strong knowledge of Cloud platforms, Azure Cloud architecture, and automation tools.
  • Experienced in developing and deploying enterprise-grade applications.
  • Passionate about AI and its potential to transform the enterprise landscape.
  • Eager to solve complex problems and create products with tangible business impact.

Why Join Us?:

  • Be part of a founding team, playing a critical role in shaping a cutting-edge AI product.
  • Collaborate with industry leaders and visionaries who are passionate about innovation.
  • This is an opportunity to grow alongside the company and lead a high-impact engineering team.
  • Equity and ownership in the product’s success.

We want to hear from you if you’re ready to make a real difference, push boundaries, and build something extraordinary!

Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Remote, Bengaluru (Bangalore)
5 - 11 yrs
₹5L - ₹25L / yr
Python
Django
Flask
API
  • Experience in Python
  • Experience in any framework like Django or Flask.
  • Primary and secondary skills: Python, OOP, and data structures.
  • Good understanding of REST APIs (see the sketch after this list).
  • Familiarity with event-driven programming in Python 
  • Good analytical and troubleshooting skills
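
As a small illustration of the Flask and REST API skills listed above, here is a hypothetical minimal endpoint sketch; the /tasks resource and its fields are made up for the example.

# app.py -- a minimal Flask REST sketch; the /tasks resource is illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)
TASKS = []  # in-memory store, purely for illustration


@app.get("/tasks")
def list_tasks():
    return jsonify(TASKS)


@app.post("/tasks")
def create_task():
    payload = request.get_json(force=True)
    task = {"id": len(TASKS) + 1, "title": payload.get("title", "")}
    TASKS.append(task)
    return jsonify(task), 201


if __name__ == "__main__":
    app.run(debug=True)  # local development server only
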


Read more
Adesso India
Remote only
4 - 15 yrs
₹10L - ₹27L / yr
Python
Google Cloud Platform (GCP)
SQL
MongoDB
Java

Immediate Joiners Preferred. Notice Period - Immediate to 30 Days


Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


About Us

adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.


Job Description

We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.


Responsibilities

  • Development and maintenance of data pipelines and automation scripts with Python
  • Creation of data queries and optimization of database processes with SQL
  • Use of bash scripts for system administration, automation and deployment processes
  • Database and cloud technologies
  • Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
  • Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular
  • Composer (Airflow): Orchestration of data pipelines for ETL processes
  • Cloud Functions: Development of serverless functions for data processing and automation
  • Cloud Scheduler: Planning and automation of recurring cloud jobs
  • Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
  • BigQuery: Processing, analyzing and querying large amounts of data in the cloud
  • Cloud Storage: Storage and management of structured and unstructured data
  • Cloud monitoring: monitoring the performance and stability of cloud-based applications
  • Data visualization and reporting
  • Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI


Requirements

  • Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python and MongoDB or SQL.
  • Strong knowledge of database design, querying, and optimization with SQL and MongoDB and designing ETL and orchestration of data pipelines.
  • Experience of a minimum of 2 years with at least one hyperscaler, ideally GCP
  • Combined with cloud storage technologies, cloud monitoring and cloud secret management
  • Excellent communication skills to effectively collaborate with team members and stakeholders.

Nice-to-Have:

  • Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Read more
IT Outsourcing

IT Outsourcing

Agency job
via Wee4 Tech Solutions by Wee TechSolutions
Remote only
5 - 11 yrs
₹8L - ₹12L / yr
Python
NumPy
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
+2 more

Dear Professionals!


We are hiring a GenAI/ML Developer!


Key Skills & Qualifications

  • Strong proficiency in Python, with a focus on GenAI best practices and frameworks.
  • Expertise in machine learning algorithms, data modeling, and model evaluation.
  • Experience with NLP techniques, computer vision, or generative AI.
  • Deep knowledge of LLMs, prompt engineering, and GenAI technologies.
  • Proficiency in data analysis tools like Pandas and NumPy.
  • Hands-on experience with vector databases such as Weaviate or Pinecone.
  • Familiarity with cloud platforms (AWS, Azure, GCP) for AI deployment.
  • Strong problem-solving skills and critical-thinking abilities.
  • Experience with AI model fairness, bias detection, and adversarial testing.
  • Excellent communication skills to translate business needs into technical solutions.


Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science, AI, or a related field.
  • Experience with MLOps practices for model deployment and maintenance.
  • Strong understanding of data pipelines, APIs, and cloud infrastructure.
  • Advanced degree in Computer Science, Machine Learning, or a related field (preferred).


Read more
Nyteco

at Nyteco

2 candid answers
1 video
Simran Thind
Posted by Simran Thind
Remote only
2 - 4 yrs
₹15L - ₹20L / yr
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
Python
+2 more

Elevate our quality assurance through automation


At Jules AI we're on a mission to revolutionize the recycled materials industry with cutting-edge technology solutions. As an Automation Engineer within our QA team, you will play a crucial role in enhancing our product development lifecycle by implementing and optimizing automated testing frameworks. Your work will ensure our software products are reliable, efficient, and meet the highest quality standards before reaching our clients.


What You Will Do


  • Develop Automated Testing Frameworks: Design, build, and maintain automated testing frameworks and systems across various platforms and technologies (see the sketch after this list).
  • Collaborate on Test Planning: Work closely with QA analysts and engineers to understand system requirements and features, translating these into detailed, scalable, and robust automated tests.
  • Continuous Integration and Deployment (CI/CD): Integrate automation tests with CI/CD pipelines, ensuring continuous testing and feedback in the software development lifecycle.
  • Performance and Scalability Testing: Implement automated scripts to test performance and scalability of our software products, identifying bottlenecks and optimization opportunities.
  • Bug Detection and Reporting: Utilize automated tests to detect and document bugs and issues within software products, collaborating with development teams to ensure timely resolutions.
  • Tool and Technology Evaluation: Stay informed on the latest trends and tools in automated testing, recommending and implementing new technologies to improve testing efficiency and effectiveness.
  • Quality Metrics and Reporting: Monitor, analyze, and report on quality metrics generated from automated tests to inform quality improvements and decision-making.
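
To ground the automation-framework work above, here is a minimal, hypothetical browser check using Selenium with pytest (both named in the skills below); the target URL and assertion are placeholders, and the fixture shows the kind of building block a larger framework would organise and run in CI.

# test_homepage.py -- a minimal Selenium + pytest sketch; the URL and assertion are placeholders.
import pytest
from selenium import webdriver


@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")  # run without a visible browser, e.g. in CI
    drv = webdriver.Chrome(options=options)  # Selenium 4+ manages the driver binary itself
    yield drv
    drv.quit()


def test_homepage_title(driver):
    driver.get("https://example.com")
    assert "Example" in driver.title
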


Who We Are Looking For

Skills and Qualifications:

  • Proven Experience: 3+ years of previous experience in QA automation or a similar role, with a strong portfolio of successful automation projects.
  • Technical Proficiency: Expertise in automation tools (e.g., Selenium, Playwright, TestComplete, Appium) and programming languages (e.g., Python, Java, JavaScript).
  • Understanding of QA Methodologies: Deep knowledge of QA methodologies, tools, and processes, with the ability to apply this knowledge to automation practices.
  • Problem-Solving Skills: Excellent analytical and problem-solving skills, with a keen attention to detail.
  • Collaboration and Communication: Strong communication skills and the ability to work collaboratively within the QA team and across departments.
  • Agile Environment Experience: Experience working in an Agile/Scrum development process, with an understanding of its impact on QA and automation practices.


What we offer

Work closely with a global team helping bring automation and technological intelligence to the recycling world.


You can also expect:

  • As a global company, we treasure and encourage diversity, perspective, interest, and representation through inclusivity. The more we have, the better the solution.
  • Connect and work with leading minds from the recycling industry and be part of a growing, energetic global team, across time zones, regions, offices and screens.
  • Exposure to developments and tools within your field ensures evolution in your career and skill building.
  • We adopt a Bring Your Own Device policy and encourage flexibility and freedom in how you work through competitive compensation and yearly appraisals
  • Health insurance coverage, paid vacation days and flexible work hours help you maintain a work-life balance
  • Have the opportunity to network and collaborate in a diverse community.


Are You Ready for the Challenge?

  • Become a key player in our mission to transform the recycling industry through technological excellence. If you're passionate about quality assurance, automation, and making a difference, we'd love to hear from you.


Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
8 - 13 yrs
₹70L - ₹90L / yr
Data engineering
Apache Spark
Apache Kafka
Java
Python
+6 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Remote only
3 - 10 yrs
$3K - $4K / yr
Machine Learning (ML)
Python
Django
React.js
PostgreSQL

We’re Building the Future of Immigration Tech

We are developing a high-performance, AI-driven immigration platform that automates visa assessments and guidance for high-skilled immigrants. Our focus is on speed, accuracy, and scalability—not a flashy UI, but a powerful decision-making engine that delivers real value.

We need top-tier engineers who build for performance over aesthetics. If you love AI, automation, and disrupting old systems, this is for you.

🛠 Open Roles

1️⃣ AI/ML Engineer (Visa Assessment AI)

  • Develop a cutting-edge AI model for visa eligibility assessments.
  • Use NLP to process immigration laws, policies, and case precedents (a hedged sketch follows the tech stack below).
  • Optimize for accuracy, efficiency, and scale (real-time processing).

2️⃣ Full-Stack Developer (Lean & Scalable Web App)

  • Build a high-performance, no-frills web app (React/Next.js preferred).
  • Integrate the AI model seamlessly into a secure and scalable backend (Python/Django or Node.js).
  • Implement fast data retrieval for applicant evaluations.

🔍 Who We’re Looking For

✔ AI/ML Engineer: Strong experience in NLP, AI automation, and structured data processing. Experience with TensorFlow/PyTorch/OpenAI APIs is a plus.

✔ Full-Stack Developer: Expertise in React (Next.js preferred), Python/Django, or Node.js. Must prioritize performance & security.

✔ Both: You’re a problem-solver, performance-obsessed, and thrive in lean environments.

💻 Tech Stack (Recommended, Open to Suggestions)

  • AI/ML: Python (FastAPI, TensorFlow, OpenAI APIs, Hugging Face NLP)
  • Frontend: React, Next.js (for speed & SEO)
  • Backend: Python/Django or Node.js (for performance & scalability)
  • Database: PostgreSQL or Firebase
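
Purely as an illustration of how the Hugging Face NLP tooling listed above might be applied to eligibility signals, here is a hedged zero-shot classification sketch; the profile text, labels, and model choice are assumptions for the example, not the product's actual assessment engine.

# assess_sketch.py -- illustrative only; labels, text, and model are assumptions, not the real engine.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

profile = (
    "Senior machine learning engineer with 8 years of experience, "
    "two patents, and publications in top-tier conferences."
)
labels = ["extraordinary ability", "skilled worker", "student"]

result = classifier(profile, labels)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
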


Read more
Remote only
4 - 6 yrs
₹10L - ₹20L / yr
Python
Automation
Selenium

Location: Remote / Hybrid (Silicon Valley)

Job Type: Full-Time

Experience: 4+ years


About Altimate AI


At Altimate AI, we’re revolutionizing enterprise data operations with agentic AI—intelligent AI teammates that seamlessly integrate into existing workflows, helping data teams build pipelines, automate documentation, optimize infrastructure, and accelerate delivery.

Backed by top-tier investors and led by Silicon Valley veterans, we’re on a mission to automate and streamline data workflows, allowing data professionals to focus on innovation rather than repetitive tasks.


Role Overview


We are looking for an SDET (Software Development Engineer in Test) with expertise in Python, automation, data, and AI to help ensure the reliability, performance, and scalability of our AI-powered data solutions. You will work closely with engineering and data science teams to develop test automation frameworks, validate complex AI-driven data pipelines, and integrate testing into CI/CD workflows.


Key Responsibilities


✅ Develop and maintain automation frameworks for testing AI-driven data applications

✅ Design, implement, and execute automated test strategies for data pipelines and infrastructure

✅ Validate AI-driven insights and data transformations to ensure accuracy and reliability (see the sketch after this list)

✅ Integrate automated tests into CI/CD pipelines for continuous testing and deployment

✅ Collaborate with engineering and data science teams to improve software quality

✅ Identify performance bottlenecks and optimize automated testing approaches

✅ Ensure data integrity and compliance with industry best practices
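
To make the data-validation responsibilities above concrete, here is a minimal, hypothetical check written with pandas and pytest; the column names and rules are made up and stand in for the output of a real ETL step.

# test_pipeline_output.py -- hypothetical data checks; columns and rules are illustrative only.
import pandas as pd
import pytest


@pytest.fixture
def transformed() -> pd.DataFrame:
    # Stand-in for the output of an ETL step (e.g. a dbt model or Spark job).
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "amount": [10.0, 25.5, 7.25],
            "currency": ["USD", "USD", "EUR"],
        }
    )


def test_no_duplicate_keys(transformed):
    assert transformed["order_id"].is_unique


def test_amounts_are_positive(transformed):
    assert (transformed["amount"] > 0).all()


def test_currency_codes_are_known(transformed):
    assert set(transformed["currency"]).issubset({"USD", "EUR", "INR"})
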


Required Skills & Experience


Strong Python programming skills with experience in test automation (PyTest, Selenium, or similar frameworks)

Hands-on experience with data testing – validating ETL pipelines, SQL queries, and AI-generated outputs

Proficiency in modern data stacks (SQL, Snowflake, dbt, Spark, Kafka, etc.)

Experience with CI/CD tools like Jenkins, GitHub Actions, or GitLab CI

Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes)

Excellent problem-solving and analytical skills

Strong communication skills to work effectively with cross-functional teams


Nice-to-Have (Bonus Points)


⭐ Prior experience in a fast-paced startup environment

⭐ Knowledge of machine learning model validation and AI-driven testing approaches

⭐ Experience with performance testing and security testing for AI applications


Why Join Altimate AI?


Cutting-Edge AI & Automation – Work with next-gen AI-driven data automation technologies

High-Impact Role – Be part of an early-stage, fast-growing startup shaping the future of enterprise AI

Competitive Salary + Equity – Own a meaningful stake in a company with massive potential

Collaborative Culture – Work with top-tier engineers, AI researchers, and data experts

⚡ Opportunity for Growth – Play a key role in scaling AI-powered data operations

Read more
Pullse
Suhail Joo
Posted by Suhail Joo
Remote only
1 - 8 yrs
₹6L - ₹10L / yr
Python
Large Language Models (LLM)
FastAPI
Artificial Intelligence (AI)
Generative AI

Are you passionate about building scalable AI-driven systems and leveraging technologies like RAG, prompt engineering, and multi-agentic architectures? Do you have a strong foundation in Python and FastAPI, with experience in integrating AI solutions using the CrewAI framework and Weaviate DB? If so, Pullse is the place for you!

We are looking for an AI Developer with at least one year of experience in AI-driven solutions to join our team. This role involves designing, developing, and optimizing AI-powered backend services using Python, FastAPI, and integrating AI capabilities for advanced tasks.


About Pullse

Pullse is a cutting-edge SaaS startup on a mission to revolutionize customer support with AI-driven solutions. Our platform centralizes support channels, streamlines workflows, and enhances customer experiences with automation. We believe in empowering our team with freedom, transparency, and the opportunity to make a meaningful impact.


The Role

As an AI Developer, you will play a crucial role in designing, developing, and optimizing our AI-powered backend services, ensuring high performance, scalability, and reliability. Your primary responsibilities will include:

  • API Development: Design, build, and maintain scalable APIs using Python and FastAPI (see the sketch after this list).
  • AI Integration: Integrate AI technologies like RAG (Retrieval-Augmented Generation) and CrewAI for advanced data processing and prompt engineering to enhance AI model interactions.
  • Multi-Agent Systems: Develop and implement multi-agentic architectures to simulate complex interactions and decision-making processes.
  • Vector Databases: Work with Weaviate DB to store and query dense vector representations for efficient similarity searches and AI model outputs.
  • Real-Time Functionality: Implement real-time updates using WebSockets or similar technologies.
  • Database Management: Develop and optimize database schemas and queries using PostgreSQL.
  • Collaboration: Collaborate with cross-functional teams to design and implement new features.
  • Code Quality: Ensure high code quality, security, and performance optimization.
  • Testing: Participate in code reviews and contribute to improving engineering processes.
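
As a small sketch of the FastAPI work described above, here is a hypothetical minimal endpoint; the /tickets resource and its fields are placeholders, not Pullse's actual API.

# main.py -- minimal FastAPI sketch; the /tickets resource is illustrative only.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Ticket(BaseModel):
    subject: str
    body: str


TICKETS: list[dict] = []  # in-memory store, purely for illustration


@app.post("/tickets", status_code=201)
async def create_ticket(ticket: Ticket):
    record = {"id": len(TICKETS) + 1, "subject": ticket.subject, "body": ticket.body}
    TICKETS.append(record)
    return record


@app.get("/tickets")
async def list_tickets():
    return TICKETS

# Run locally with, for example:  uvicorn main:app --reload
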


Our Tech Stack

  • Backend: Python, FastAPI
  • Database: PostgreSQL
  • AI Focus: RAG, Prompt Engineering, Multi-Agent Systems, Weaviate DB, CrewAI
  • Real-Time Communication: WebSockets


Who You Are

  • Experience: At least one year of experience in backend development with Python and FastAPI.
  • AI Expertise: Strong experience with AI technologies, including RAG, prompt engineering, and multi-agentic architectures. Familiarity with the CrewAI framework and Weaviate DB is a plus.
  • Database Skills: Familiarity with PostgreSQL and database design principles.
  • Problem-Solver: Analytical thinker with a knack for solving complex challenges.
  • Team Player: Excellent communication skills and a collaborative mindset.
  • Learner: Passion for learning and staying updated with the latest AI technologies.


What We Offer

  • Competitive Salary: Up to INR 10 LPA.
  • Equity: Additional equity options to share in our growth journey.
  • Growth Opportunities: Be part of an early-stage startup where your work directly impacts the product and company.
  • Flexibility: Work remotely with a supportive and dynamic team.
  • AI Focus: Opportunity to work on cutting-edge AI projects and contribute to the future of customer support.


Join us in redefining customer support with AI! 🚀


Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹70L - ₹80L / yr
Java
Go Programming (Golang)
NodeJS (Node.js)
Python
Apache Kafka
+7 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹50L - ₹70L / yr
Java
Data engineering
NodeJS (Node.js)
Python
Go Programming (Golang)
+5 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Dyrect
Yogesh Miharia
Posted by Yogesh Miharia
Remote only
3 - 5 yrs
₹6L - ₹13L / yr
NodeJS (Node.js)
JavaScript
HTML/CSS
MongoDB
Python

Key Responsibilities:

  • Develop and maintain both front-end and back-end components of web applications.
  • Collaborate with product managers, designers, and other developers to build user-friendly features.
  • Write clean, maintainable, and efficient code that adheres to coding standards and best practices.
  • Build reusable code and libraries for future use.
  • Optimize applications for maximum speed and scalability.
  • Implement responsive design to ensure consistent user experience across all devices.
  • Work with databases (SQL/NoSQL) and integrate with third-party services and APIs.
  • Troubleshoot, debug, and optimize application performance.
  • Participate in code reviews, ensuring code quality and consistency across the team.
  • Stay updated on the latest industry trends and best practices in full-stack development.
  • Contribute to an agile development process, attending sprints, standups, and retrospectives.

Required Skills and Qualifications:

  • Proven experience as a Full Stack Developer or similar role.
  • Proficiency in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks (React.js, Next.js, Angular, or Vue.js).
  • Strong experience in back-end technologies such as Node.js, Python, Ruby, Java, or PHP.
  • Familiarity with database technologies (e.g., MySQL, PostgreSQL, MongoDB).
  • Experience with version control systems, particularly Git.
  • Knowledge of RESTful API design and integration.
  • Familiarity with cloud platforms like AWS, Azure, or Google Cloud is a plus.
  • Experience with microservices-based architecture is a plus.
  • Strong problem-solving skills and attention to detail.
  • Ability to work independently as well as part of a team.
  • Excellent communication skills, both verbal and written.
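
As a small illustration of the REST and NoSQL items above, here is a hedged sketch in Python (one of the listed stacks) of a resource API backed by MongoDB. The products collection, field names, and connection string are assumptions made for the example, not part of the role description.

    # Minimal sketch: a small REST resource backed by MongoDB via FastAPI and pymongo.
    # Connection string, database, and collection names are assumed for illustration.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["shopdb"]

    app = FastAPI()

    class Product(BaseModel):
        name: str
        price: float

    @app.post("/products", status_code=201)
    def create_product(product: Product):
        # Store the validated payload as a document (Pydantic v2; on v1 use product.dict()).
        db.products.insert_one(product.model_dump())
        return {"status": "created"}

    @app.get("/products/{name}")
    def get_product(name: str):
        # Exclude Mongo's internal _id so the document serializes cleanly to JSON.
        doc = db.products.find_one({"name": name}, {"_id": 0})
        if doc is None:
            raise HTTPException(status_code=404, detail="Product not found")
        return doc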


Read more
Adesso India

Agency job
via Hashroot by Sruthy R
Remote only
6 - 20 yrs
₹12L - ₹25L / yr
Java
Spring Boot
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
+9 more

Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".

Only applications received via email will be reviewed. Applications through other channels will not be considered.


Overview

Adesso India specialises in optimization of core business processes for organizations. Our focus is on providing state-of-the-art solutions that streamline operations and elevate productivity to new heights.

Composed of a team of industry experts and experienced technology professionals, we ensure that our software development and implementations are reliable, robust, and seamlessly integrated with the latest technologies. By leveraging our extensive knowledge and skills, we empower businesses to achieve their objectives efficiently and effectively.


Job description

The client’s department DPS, Digital People Solutions, offers a sophisticated portfolio of IT applications, providing a strong foundation for professional and efficient People & Organization (P&O) and Business Management, both globally and locally, for a well-known German company listed on the DAX-40 index, which includes the 40 largest and most liquid companies on the Frankfurt Stock Exchange.

We are seeking talented Java Application Developers to join our dynamic DPS team. In this role, you will design and implement change requests for existing applications or develop new projects using Jakarta EE (Java Enterprise Technologies) and Angular for the frontend. Your responsibilities will include end-to-end process mapping within the HR application landscape, analyzing developed functionalities, and addressing potential issues.

As an experienced and skilled full-stack developer in our dynamic, international, cross-functional team, you will be responsible for the design, development, and deployment of modern, high-quality software solutions and applications.


Responsibilities:

Design, develop, and maintain the application.

Write clean, efficient, and reusable code.

Implement new features and functionality based on business requirements.

Participate in system and application architecture discussions.

Create technical designs and specifications for new features or enhancements.

Write and execute unit tests to ensure code quality.

Debug and resolve technical issues and software defects.

Conduct code reviews to ensure adherence to best practices.

Identify and fix vulnerabilities to ensure application integrity.

Work with other developers to ensure seamless integration of backend and frontend elements.

Collaborate with DevOps teams for deployment and scaling.
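
Since the responsibilities above call for unit testing and code quality, here is a small hedged sketch of the style of test expected. It uses Python and pytest for consistency with the rest of this page, while the role itself centres on Java/Jakarta EE; the apply_discount function is a hypothetical example, not part of the client's codebase.

    # A minimal pytest sketch; apply_discount is a hypothetical function used only for illustration.
    import pytest

    def apply_discount(price: float, percent: float) -> float:
        # Reject out-of-range discounts instead of silently clamping them.
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount_happy_path():
        assert apply_discount(200.0, 25) == 150.0

    def test_apply_discount_rejects_invalid_percent():
        with pytest.raises(ValueError):
            apply_discount(200.0, 150)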


Requirements:

Bachelor’s degree in computer science or information technology, or a related field.

Proven experience as a skilled full-stack developer. Experience in the Utilities / Energy domain is appreciated.

Strong experience with Java (Spring Boot), AWS / Azure or GCP, GitLab, and Angular and/or React. Additional technologies like Python, Go, Kotlin, Rust, or similar are welcome.

Excellent problem-solving and debugging skills. 

Strong communication and collaboration abilities to work effectively in a team environment.


Skills & Requirements

Java, Spring Boot, Jakarta EE, Angular, React, AWS, Azure, GCP, GitLab, Python, Go, Kotlin, Rust, Full-stack Development, Unit Testing, Debugging, Code Review, DevOps, Software Architecture, Microservices, HR Applications, Cloud Computing, Frontend Development, Backend Development, System Integration, Technical Design, Deployment, Problem-Solving, Communication, Collaboration.



Read more
Remote only
7 - 12 yrs
₹25L - ₹40L / yr
Spark
Java
Apache Kafka
Big Data
Apache Hive
+5 more

Job Title: Big Data Engineer (Java Spark Developer – Java Spark experience is a must)

Location: Chennai, Hyderabad, Pune, Bangalore (Bengaluru) / NCR Delhi

Client: Premium Tier 1 Company

Payroll: Direct Client

Employment Type: Full time / Perm

Experience: 7+ years

 

Job Description:

We are looking for skilled Big Data Engineers with 7+ years of experience using Java Spark on Big Data / legacy platforms who can join immediately. The desired candidate should have experience designing, developing, and optimizing real-time and batch data pipelines in enterprise-scale Big Data environments. You will work on building scalable, high-performance data processing solutions, integrating real-time data streams, and building reliable data platforms. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.

 

Key Responsibilities:

·      Develop data pipelines using Java Spark and Kafka.

·      Optimize and maintain real-time data pipelines and messaging systems.

·      Collaborate with cross-functional teams to deliver scalable data solutions.

·      Troubleshoot and resolve issues in Java Spark and Kafka applications.
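
The pipeline responsibilities above can be illustrated with a short sketch. It uses PySpark (the Python API) for consistency with the rest of this page, whereas the role itself requires Java Spark; the Kafka topic, broker address, and schema are assumptions, and the spark-sql-kafka connector package must be available on the cluster.

    # Minimal PySpark Structured Streaming sketch: read JSON order events from Kafka
    # and print them to the console. Topic, broker, and schema are assumed for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType

    spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

    schema = StructType([
        StructField("order_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    raw = (
        spark.readStream
        .format("kafka")  # requires the spark-sql-kafka connector on the classpath
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "orders")
        .load()
    )

    # Kafka delivers bytes; cast the value to a string and parse the JSON payload.
    orders = raw.select(from_json(col("value").cast("string"), schema).alias("o")).select("o.*")

    query = orders.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()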

 

Qualifications:

·      Experience in Java Spark is a must

·      Knowledge and hands-on experience using distributed computing, real-time data streaming, and big data technologies

·      Strong problem-solving and performance optimization skills

·      Looking for immediate joiners

 

If interested, please share your resume along with the following details

1)    Notice Period

2)    Current CTC

3)    Expected CTC

4)    Have Experience in Java Spark - Y / N (this is a must)

5)    Any offers in hand

 

Thanks & Regards,

LION & ELEPHANTS CONSULTANCY PVT LTD TEAM

SINGAPORE | INDIA

 

Read more
Springboard

at Springboard

Kakali Sharma
Posted by Kakali Sharma
Remote only
2 - 6 yrs
₹20L - ₹30L / yr
skill iconPython
skill iconReact.js
skill iconVue.js
skill iconAngularJS (1.x)
skill iconPostgreSQL
+5 more

Job Description

The Opportunity

The Springboard engineering team is looking for software engineers with strong backend & frontend technical expertise. In this role, you would be responsible for building exciting features aimed at improving our student experience and expanding our student base, using the latest technologies like GenAI, as relevant. You would also contribute to making our platform more robust, flexible and scalable. This is a great opportunity to create a meaningful impact as well as grow in your career.

We are looking for engineers with different levels of experience and expertise. Depending on your proficiency levels, you will join our team as a Software Engineer II, Senior Software Engineer or Lead Software Engineer.

Responsibilities

  • Design and develop features for the Springboard platform, which enriches the learning experience of thousands through human-guided learning at scale
  • Own quality and reliability of the product by getting hands on with code and design reviews, debugging complex issues and so on
  • Contribute to the platform architecture through redesign of complex features based on evolving business needs
  • Influence and establish best engineering practices through solid design decisions, processes and tools
  • Provide technical mentoring to team members

You

  • You have experience with web application development, on both, backend and frontend.
  • You have a solid understanding of software design principles and best practices.
  • You have hands-on experience in:
  • Coding and debugging complex systems, with frontend integration.
  • Code review, responsible for production deployments.
  • Building scalable and fault-tolerant applications.
  • Re-architecting / re-designing complex systems / features (i.e. managing technical debt).
  • Defining and following best practices for frontend and backend systems.
  • You have excellent problem solving skills and are comfortable handling ambiguity.
  • You are able to analyze various alternatives and reach optimal decisions. 
  • You are willing to challenge the status quo, express your opinion and drive change.
  • You are able to plan reasonably complex pieces of work and can handle changing priorities, unknowns and challenges with support. You want to contribute to the platform roadmap, aligning with the organization's priorities and goals.
  • You enjoy mentoring others and helping them solve challenging problems.
  • You have excellent written and verbal communication skills with the ability to present complex technical information in a clear and concise manner. You are able to communicate with various stakeholders to understand their requirements.
  • You are a proponent of quality - building best practices, introducing new processes and improvements to make the team more efficient.

Non-negotiables

Must have

  • Expertise in Backend development (Python & Django experience preferred)
  • Expertise in Frontend development (AngularJS / ReactJS / VueJS experience preferred)
  • Experience working with SQL databases
  • Experience building multiple significant features for web applications

Good to have

  • Experience with Google Cloud Platform (or any cloud platform)
  • Experience working with any Learning Management System (LMS), such as Canvas
  • Experience working with GenAI ecosystem, including usage of AI tools such as code completion
  • Experience with CI/CD pipelines and applications deployed on Kubernetes
  • Experience with refactoring (redesigning complex systems / features, breaking monolith into services)
  • Experience working with NoSQL databases
  • Experience with Web performance optimization, SEO, Gatsby and FE Analytics
  • Delivery skills, specifically planning open-ended projects
  • Mentoring skills
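
As a small illustration of the backend items above (a hedged sketch, not Springboard's actual code): a Django model plus a view that returns published courses as JSON. The Course model, field names, and URL wiring are assumptions; in a real project the model would live in an installed app with migrations.

    # Hypothetical Django model and view; names and fields are illustrative only.
    from django.db import models
    from django.http import JsonResponse

    class Course(models.Model):
        title = models.CharField(max_length=200)
        published = models.BooleanField(default=False)

    def published_courses(request):
        # Query only published courses and return their titles as JSON.
        titles = list(
            Course.objects.filter(published=True).values_list("title", flat=True)
        )
        return JsonResponse({"courses": titles})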

Expectations

  • Able to work with open-ended problems and come up with efficient solutions
  • Able to communicate effectively with business stakeholders to clarify requirements for small to medium tasks and own end to end delivery
  • Able to communicate estimations, plan deviations and blockers in an efficient and timely manner to all project stakeholders
Read more