GPU computing Jobs in Hyderabad

11+ GPU computing Jobs in Hyderabad | GPU computing Job openings in Hyderabad

Apply to 11+ GPU computing Jobs in Hyderabad on CutShort.io. Explore the latest GPU computing Job opportunities across top companies like Google, Amazon & Adobe.

VMax eSolutions India Pvt Ltd
Hyderabad
10 - 15 yrs
₹35L - ₹45L / yr
Generative AI
PEFT (Parameter-Efficient Fine-Tuning)
Voice processing
Artificial Intelligence (AI)
GPU computing
+3 more

We are seeking an experienced AI Architect to design, build, and scale production-ready AI voice conversation agents deployed locally (on-prem / edge / private cloud) and optimized for GPU-accelerated, high-throughput environments.

You will own the end-to-end architecture of real-time voice systems, including speech recognition, LLM orchestration, dialog management, speech synthesis, and low-latency streaming pipelines—designed for reliability, scalability, and cost efficiency.

This role is highly hands-on and strategic, bridging research, engineering, and production infrastructure.


Key Responsibilities

Architecture & System Design

  • Design low-latency, real-time voice agent architectures for local/on-prem deployment
  • Define scalable architectures for ASR → LLM → TTS pipelines
  • Optimize systems for GPU utilization, concurrency, and throughput
  • Architect fault-tolerant, production-grade voice systems (HA, monitoring, recovery)

Voice & Conversational AI

  • Design and integrate:
      • Automatic Speech Recognition (ASR)
      • Natural Language Understanding / LLMs
      • Dialogue management & conversation state
      • Text-to-Speech (TTS)
  • Build streaming voice pipelines with sub-second response times
  • Enable multi-turn, interruptible, natural conversations
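
To make the shape of such a pipeline concrete, here is a minimal, illustrative asyncio sketch of a streaming ASR → LLM → TTS loop; every function in it is a hypothetical placeholder rather than any specific vendor's API:

```python
# Illustrative skeleton only: transcribe_stream, generate_reply and synthesize
# are hypothetical stand-ins for real ASR / LLM / TTS backends.
import asyncio


async def transcribe_stream(audio_chunks):
    """Hypothetical streaming ASR: yields transcripts as audio arrives."""
    async for chunk in audio_chunks:
        yield f"transcript for {len(chunk)} bytes of audio"


async def generate_reply(transcript):
    """Hypothetical LLM call: yields response tokens as they are produced."""
    for token in ("Sure,", " I", " can", " help."):
        await asyncio.sleep(0)  # stands in for network / inference latency
        yield token


async def synthesize(token):
    """Hypothetical TTS call: returns an audio frame for a text token."""
    return token.encode()


async def voice_agent(audio_chunks, play_audio):
    """Stream ASR -> LLM -> TTS frame by frame, keeping latency low."""
    async for transcript in transcribe_stream(audio_chunks):
        async for token in generate_reply(transcript):
            frame = await synthesize(token)
            # In a real agent, barge-in detection would cancel this playback.
            await play_audio(frame)
```

Real systems add voice activity detection, barge-in handling, and backpressure between stages, but the staged, streaming structure stays the same.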

Model & Inference Engineering

  • Deploy and optimize local LLMs and speech models (quantization, batching, caching)
  • Select and fine-tune open-source models for voice use cases
  • Implement efficient inference using TensorRT, ONNX, CUDA, vLLM, Triton, or similar
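
For instance, a local quantized model can be served with continuous batching through vLLM's offline API; the sketch below is only an assumption-laden example (the model path, quantization mode, and sampling settings are placeholders, not a recommendation):

```python
# Minimal sketch of batched local inference with vLLM's offline API.
from vllm import LLM, SamplingParams

llm = LLM(
    model="path/or/hf-id-of-your-model",  # placeholder: local path or HF model ID
    quantization="awq",                   # assumes an AWQ-quantized checkpoint
    gpu_memory_utilization=0.90,
)
params = SamplingParams(temperature=0.2, max_tokens=128)

# vLLM batches these prompts internally (continuous batching).
prompts = ["Summarize the caller's request.", "Draft a polite follow-up."]
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```

The same model could instead be exported to TensorRT or served behind Triton; the trade-off is usually between ease of iteration and peak throughput.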

Infrastructure & Production

  • Design GPU-based inference clusters (bare metal or Kubernetes)
  • Implement autoscaling, load balancing, and GPU scheduling
  • Establish monitoring, logging, and performance metrics for voice agents
  • Ensure security, privacy, and data isolation for local deployments

Leadership & Collaboration

  • Set architectural standards and best practices
  • Mentor ML and platform engineers
  • Collaborate with product, infra, and applied research teams
  • Drive decisions from prototype → production → scale

Required Qualifications

Technical Skills

  • 7+ years in software / ML systems engineering
  • 3+ years designing production AI systems
  • Strong experience with real-time voice or conversational AI systems
  • Deep understanding of LLMs, ASR, and TTS pipelines
  • Hands-on experience with GPU inference optimization
  • Strong Python and/or C++ background
  • Experience with Linux, Docker, Kubernetes

AI & ML Expertise

  • Experience deploying open-source LLMs locally
  • Knowledge of model optimization:
      • Quantization
      • Batching
      • Streaming inference
  • Familiarity with voice models (e.g., Whisper-like ASR, neural TTS)

Systems & Scaling

  • Experience with high-QPS, low-latency systems
  • Knowledge of distributed systems and microservices
  • Understanding of edge or on-prem AI deployments

Preferred Qualifications

  • Experience building AI voice agents or call automation systems
  • Background in speech processing or audio ML
  • Experience with telephony, WebRTC, SIP, or streaming audio
  • Familiarity with Triton Inference Server / vLLM
  • Prior experience as Tech Lead or Principal Engineer

What We Offer

  • Opportunity to architect state-of-the-art AI voice systems
  • Work on real-world, high-scale production deployments
  • Competitive compensation and equity (if applicable)
  • High ownership and technical influence
  • Collaboration with top-tier AI and infrastructure talent
Versatile Commerce LLP

Posted by Vaishnavi Munrri
Hyderabad
5 - 7 yrs
₹12L - ₹18L / yr
Saviynt
SailPoint
Cyber Security
SOD
SOAP
+4 more

Position Overview:

We are seeking a highly skilled and motivated Identity and Access Management (IAM) Developer/IAM Engineer to join our dynamic team. The successful candidate will be responsible for designing, implementing, and maintaining robust IAM solutions that ensure secure access to our systems and applications. This role involves collaborating with various teams to understand their requirements and deliver scalable and efficient IAM systems.

Key Responsibilities:

  • The technical developer must understand and implement customizations based on the specifications in business requirement documents or sprint tasks.
  • They should review the solution with the business analyst before presenting it to external stakeholders or deploying it to any production environment.
  • All customizations, fixes, and tests must be correctly documented with all related scenarios and evidence, and stored in the corresponding folders/tools.
  • All deployments must adhere to application factory guidelines and be validated by team members.
  • Developers are responsible for production system deployments using the ‘4 eyes principle’, ensuring dual verification.
  • Accurately and methodically complete all required information in predefined templates for cutover plans, system testing, UAT, change requests, and application team communications as instructed.
  • Comprehend existing template content, make necessary updates, ensure team alignment, and address issues before submitting to the application team for review.
  • Maintain frequent communication with team members to report progress, ensure alignment, and promptly address any issues or blockers encountered.
  • Prepare and internally review test data for demo/UAT sessions with the application team, ensure sessions are recorded and structured logically, and assess all feedback received for potential integration.
  • Assess feasibility, develop configurations, prepare test data, conduct demo and UAT sessions, create production deployment documents, submit change requests, execute go-live, and facilitate handover to Operations.

Required Qualifications:

  • A minimum of 2 years of hands-on experience with the identity product (Saviynt EIC 23.x/24.x)
  • Comprehensive understanding of Saviynt IGA architecture, with practical experience in application onboarding, workflow implementation, Segregation of Duties (SOD), certifications, and custom JAR development
  • Experience working in agile (Scrum) environments and the ability to follow the existing protocols and ceremonies that are part of the day-to-day work
  • Working knowledge of JSON
  • Ability to build SQL queries when required (MySQL 8.0 as the backend)
  • Knowledge of APIs (SOAP, REST)
  • Able to use API tools such as Postman or SOAP UI
  • Basic knowledge of directory services and applications such as Active Directory, Azure AD, and Exchange (online/on-prem)
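
As a small illustration of the API skills listed above, the sketch below calls a REST endpoint from Python with the requests library; the base URL, endpoint paths, and payload are hypothetical placeholders, not Saviynt's documented API contract:

```python
# Illustrative only: the tenant URL, paths and fields below are hypothetical.
import requests

BASE_URL = "https://example-tenant.example.com/ECM/api/v5"  # placeholder tenant

# Hypothetical token request; consult the vendor's REST API guide for real paths.
resp = requests.post(
    f"{BASE_URL}/login",
    json={"username": "svc_account", "password": "***"},
    timeout=30,
)
resp.raise_for_status()
token = resp.json().get("access_token")

# Hypothetical follow-up call using the bearer token.
users = requests.get(
    f"{BASE_URL}/users",
    headers={"Authorization": f"Bearer {token}"},
    params={"max": 10},
    timeout=30,
)
print(users.status_code, users.json())
```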
Xebia IT Architects

Posted by Vijay S
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Chennai, Bhopal, Jaipur
10 - 15 yrs
₹30L - ₹40L / yr
Spark
Google Cloud Platform (GCP)
Python
Apache Airflow
PySpark
+1 more

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.


  • Shift: 2 PM - 11 PM
  • Work Mode: Hybrid (3 days a week) across Xebia locations
  • Notice Period: Immediate joiners or those with a notice period of up to 30 days


Key Responsibilities:

  • Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
  • Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers.
  • Ensure data integrity, consistency, and availability across all systems.
  • Collaborate with data engineers, analysts, and stakeholders to optimize performance.
  • Document standards and best practices for data engineering workflows.
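
A minimal sketch of how the Raw → Silver → Gold promotion described above could be expressed as an Airflow DAG (Airflow 2.4+ syntax); the transform function is a placeholder, and a real pipeline would call Databricks jobs or PySpark/BigQuery transformations instead of printing:

```python
# Illustrative Raw -> Silver -> Gold DAG skeleton; task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def promote(layer_from: str, layer_to: str) -> None:
    # Placeholder: in practice this would trigger a Databricks job or a
    # PySpark/BigQuery transformation for the given layer.
    print(f"Promoting data: {layer_from} -> {layer_to}")


with DAG(
    dag_id="raw_silver_gold_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    raw_to_silver = PythonOperator(
        task_id="raw_to_silver",
        python_callable=promote,
        op_kwargs={"layer_from": "raw", "layer_to": "silver"},
    )
    silver_to_gold = PythonOperator(
        task_id="silver_to_gold",
        python_callable=promote,
        op_kwargs={"layer_from": "silver", "layer_to": "gold"},
    )

    raw_to_silver >> silver_to_gold
```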

Required Experience:


  • 7-8 years of experience in data engineering, architecture, and pipeline development.
  • Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
  • Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
  • Understanding of Data Lake table formats (Delta, Iceberg, etc.).
  • Proficiency in Python for scripting and automation.
  • Strong problem-solving skills and collaborative mindset.


⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.


Looking forward to your response!


Best regards,

Vijay S

Assistant Manager - TAG

https://www.linkedin.com/in/vijay-selvarajan/

Cornertree

Posted by Deepesh Shrimal
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 10 yrs
₹15L - ₹30L / yr
Cassandra
PySpark
Data engineering
Big Data
Hadoop
+3 more

Skills:

Experience with Cassandra, including installing, configuring, and monitoring a Cassandra cluster.

Experience with Cassandra data modeling and CQL scripting. Experience with DataStax Enterprise Graph.

Experience with both Windows and Linux operating systems. Knowledge of the Microsoft .NET Framework (C#, .NET Core).

Ability to perform effectively in a team-oriented environment.
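
A small, illustrative sketch of Cassandra data modeling and CQL from Python using the DataStax driver; the contact point, keyspace, and table are placeholders chosen to show a query-driven primary key, not a recommended schema:

```python
# Illustrative CQL usage with the DataStax Python driver (cassandra-driver).
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # placeholder contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")

# Partition key chosen for the expected query pattern (lookups by user_id).
session.execute("""
    CREATE TABLE IF NOT EXISTS events_by_user (
        user_id text,
        event_time timestamp,
        payload text,
        PRIMARY KEY (user_id, event_time)
    ) WITH CLUSTERING ORDER BY (event_time DESC)
""")

session.execute(
    "INSERT INTO events_by_user (user_id, event_time, payload) "
    "VALUES (%s, toTimestamp(now()), %s)",
    ("u123", "login"),
)
for row in session.execute(
    "SELECT * FROM events_by_user WHERE user_id = %s LIMIT 10", ("u123",)
):
    print(row.user_id, row.event_time, row.payload)

cluster.shutdown()
```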

Roboyo
Posted by Nikitha Kandaswamy
Hyderabad
4 - 6 yrs
₹12L - ₹18L / yr
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
Automation
+1 more

Looking for a mid-level to senior tester.

The scope is manual + automation, with a focus on automation testing using Selenium.
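
For context, a minimal Selenium sketch in Python of the kind of automated check this role involves; the URL and locators are hypothetical placeholders for the application under test:

```python
# Illustrative Selenium check; URL, element IDs and credentials are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()  # assumes a local Chrome / Selenium Manager setup
try:
    driver.get("https://example.com/login")  # placeholder URL
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Explicit waits instead of sleeps keep the test stable.
    WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.ID, "dashboard"))
    )
    assert "Dashboard" in driver.title
finally:
    driver.quit()
```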


This is an immediate requirement. Kindly apply only if you can join within a week.

Inovar Tech
Posted by Neelima Andugula
Hyderabad
2 - 5 yrs
₹5L - ₹10L / yr
Docker
Kubernetes
DevOps
PowerShell
GitLab
+2 more

Position: Senior DevOps Engineer (Azure Cloud Infra & Application Deployments)

Location: Hyderabad

Hiring a Senior DevOps Engineer with 2 to 5 years of experience.


Primary Responsibilities

  • Strong programming experience in PowerShell and batch scripts.
  • Strong expertise in Azure DevOps, GitLab, CI/CD, Jenkins, GitHub Actions, and Azure infrastructure.
  • Strong experience in configuring infrastructure and deploying applications using Kubernetes, Docker & Helm charts, App Services, serverless, SQL databases, cloud services, and container deployments.
  • Continuous integration, deployment, and version control (Git/ADO).
  • Strong experience in managing and configuring RBAC, managed identities, and security best practices for cloud environments.
  • Strong verbal and written communication skills.
  • Experience with agile development processes.
  • Good analytical skills.

Additional Responsibilities

  • Familiarity with various design and architecture patterns.
  • Work with modern frameworks and design patterns.
  • Experience with cloud applications (Azure/AWS). Should have experience in developing solutions and plugins and should have used XRM Toolbox/Toolkit.
  • Experience with Customer Portal, FetchXML, Power Apps, and Power Automate is good to have.
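
As one small illustration of the Kubernetes deployment work above, the sketch below checks a Deployment rollout with the official Kubernetes Python client; the namespace and deployment name are placeholders, and in practice this kind of gate would usually live inside an Azure DevOps or GitLab pipeline step:

```python
# Illustrative rollout check with the official Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when in-cluster
apps = client.AppsV1Api()

# Placeholder deployment name and namespace.
dep = apps.read_namespaced_deployment(name="web-api", namespace="staging")
desired = dep.spec.replicas or 0
ready = dep.status.ready_replicas or 0
print(f"{dep.metadata.name}: {ready}/{desired} replicas ready")

if ready < desired:
    # A CI/CD gate might fail the pipeline here instead of just logging.
    print("Rollout not complete yet")
```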


Phenom People

Posted by Srivatsav Chilukoori
Hyderabad
3 - 6 yrs
₹10L - ₹18L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Python
Deep Learning
+4 more

JOB TITLE - Product Development Engineer - Machine Learning
● Work Location: Hyderabad
● Full-time
 

Company Description

Phenom People is the leader in Talent Experience Marketing (TXM for short). We’re an early-stage startup on a mission to fundamentally transform how companies acquire talent. As a category creator, our goals are two-fold: to educate talent acquisition and HR leaders on the benefits of TXM and to help solve their recruiting pain points.
 

Job Responsibilities:

  • Design and implement machine learning, information extraction, probabilistic matching algorithms and models
  • Research and develop innovative, scalable and dynamic solutions to hard problems
  • Work closely with Machine Learning Scientists (PhDs), ML engineers, data scientists and data engineers to address challenges head-on.
  • Use the latest advances in NLP, data science and machine learning to enhance our products and create new experiences
  • Scale the machine learning algorithms that power our platform to support our growing customer base and increasing data volume
  • Be a valued contributor in shaping the future of our products and services
  • You will be part of our Data Science & Algorithms team and collaborate with product management and other team members
  • Be part of a fast-paced, fun-focused, agile team

Job Requirement:

  • 4+ years of industry experience
  • Ph.D./MS/B.Tech in computer science, information systems, or similar technical field
  • Strong mathematics, statistics, and data analytics skills
  • Solid coding and engineering skills, preferably in machine learning (not mandatory)
  • Proficient in Java, Python, and Scala
  • Industry experience building and productionizing end-to-end systems
  • Knowledge of Information Extraction, NLP algorithms coupled with Deep Learning
  • Experience with data processing and storage frameworks like Hadoop, Spark, Kafka etc.


Position Summary

We’re looking for a Machine Learning Engineer to join our team at Phenom. We expect candidates to fulfil the points below.

  • Building accurate machine learning models is the main goal of a machine learning engineer
  • Linear Algebra, Applied Statistics and Probability
  • Building Data Models
  • Strong knowledge of NLP
  • Good understanding of multithreaded and object-oriented software development
  • Mathematics, Mathematics and Mathematics
  • Collaborate with Data Engineers to prepare data models required for machine learning models
  • Collaborate with other product team members to apply state-of-the-art AI methods, including dialogue systems, natural language processing, information retrieval, and recommendation systems
  • Build large-scale software systems and numerical computation topics
  • Use predictive analytics and data mining to solve complex problems and drive business decisions
  • Should be able to design the accurate ML end-to-end architecture including the data flows, algorithm scalability, and applicability
  • Tackle situations where both the problem and the solution are unknown
  • Solve analytical problems, and effectively communicate methodologies and results to the customers
  • Adept at translating business needs into technical requirements and translating data into actionable insights
  • Work closely with internal stakeholders such as business teams, product managers, engineering teams, and customer success teams.
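
As a toy illustration of the information retrieval and recommendation work listed above, the sketch below ranks job descriptions against a candidate profile using TF-IDF and cosine similarity with scikit-learn; all of the text data is made up, and production systems would use far richer NLP models:

```python
# Toy candidate-to-job matching with TF-IDF vectors and cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

jobs = [
    "Machine learning engineer: NLP, information extraction, Python, Spark",
    "Frontend developer: React, TypeScript, CSS",
]
candidate = ["Data scientist with NLP and PySpark experience"]

vectorizer = TfidfVectorizer(stop_words="english")
job_vecs = vectorizer.fit_transform(jobs)
cand_vec = vectorizer.transform(candidate)

# Higher score = closer textual match between candidate profile and job.
scores = cosine_similarity(cand_vec, job_vecs)[0]
for job, score in sorted(zip(jobs, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {job}")
```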

Benefits

  • Competitive salary for a startup
  • Gain experience rapidly
  • Work directly with the executive team
  • Fast-paced work environment

 

About Phenom People

At PhenomPeople, we believe candidates (Job seekers) are consumers. That’s why we’re bringing e-commerce experience to the job search, with a view to convert candidates into applicants. The Intelligent Career Site™ platform delivers the most relevant and personalized job search yet, with a career site optimized for mobile and desktop interfaces designed to integrate with any ATS, tailored content selection like Glassdoor reviews, YouTube videos and LinkedIn connections based on candidate search habits and an integrated real-time recruiting analytics dashboard.

 

 Use Company career sites to reach candidates and encourage them to convert. The Intelligent Career Site™ offers a single platform to serve candidates a modern e-commerce experience from anywhere on the globe and on any device.

 We track every visitor that comes to the Company career site. Through fingerprinting technology, candidates are tracked from the first visit and served jobs and content based on their location, click-stream, behavior on site, browser and device to give each visitor the most relevant experience.

 Like consumers, candidates research companies and read reviews before they apply for a job. Through our understanding of the candidate journey, we are able to personalize their experience and deliver relevant content from sources such as corporate career sites, Glassdoor, YouTube and LinkedIn.

 We give you clear visibility into the Company's candidate pipeline. By tracking up to 450 data points, we build profiles for every career site visitor based on their site visit behavior, social footprint and any other relevant data available on the open web.

 Gain a better understanding of Company’s recruiting spending and where candidates convert or drop off from Company’s career site. The real-time analytics dashboard offers companies actionable insights on optimizing source spending and the candidate experience.

 

Kindly explore the company Phenom (https://www.phenom.com/)
YouTube - https://www.youtube.com/c/PhenomPeople
LinkedIn - https://www.linkedin.com/company/phenompeople/

Phenom | Talent Experience Management - https://www.phenom.com/

Product
Agency job
via Purple Hirez by Aditya K
Hyderabad
1 - 5 yrs
₹3L - ₹12L / yr
Magento
HTML/CSS
PHP
  • Must possess a clear understanding of the fundamentals and concepts of Magento 1/2 and PHP.
  • Must have strong experience in Magento extension development.
  • Write well-engineered source code that complies with accepted web standards.
  • Strong experience with Magento best practices, including experience developing custom extensions and extending third-party extensions.
  • Thorough functional and code-level knowledge of all Magento products and all relevant commerce technologies.

 

 

Saras Analytics Private Limited
Posted by Bhavani Thanga
Hyderabad
4 - 10 yrs
₹6L - ₹10L / yr
Technical Writing
Technical Writer
RESTful APIs
Databases
SQL

Job Description

Role: Technical Writer – SaaS Product

 

About Saras Analytics 

 

We are a passionate group of engineers, analysts, data scientists, and domain experts building products and offering services to accelerate the adoption of data as a strategic asset.

 

Our Products

 

Daton is our ETL platform for analysts and developers. Daton replicates data from SaaS platforms such as Google Analytics, Salesforce, Amazon, and Facebook Ads to cloud data warehouses like Amazon Redshift, Google BigQuery, and Snowflake. Daton consolidates data from a variety of data sources into a data warehouse within minutes and allows analysts to focus on generating insights rather than worrying about building and maintaining a data pipeline.


 

Halo is our eCommerce-focused Business Intelligence and Analytics platform. Halo helps eCommerce businesses save time and money by automating the replication, transformation, preparation, and quality control of data for reporting and analysis. Halo transforms silos of eCommerce data into actionable data sets and useful reports that help you grow your business. Visit https://sarasanalytics.com/

 


 

Responsibilities:

  • Develop technical documentation detailing product features
  • Create product manuals, detailed how-to guides, and FAQs
  • Develop and maintain technical documentation for both external usage and internal guidance across various teams
  • Set up new processes and improve existing technical documentation processes
  • Convert product support questions into how-to guides
  • Work closely with product managers, product support, and engineering teams to gather insights for documentation

 

Eligibility:

-  2-6 years of experience as a Technical Writer and in product documentation.

-  Familiarity with third-party API integration analysis.

-  Excellent logical, analytical and communication skills to interact with the Development team.

 

 

Requirements

  • 2-6 years of experience as a Technical Writer for SaaS products
  • Engineering background in IT or Computer Science
  • Excellent written and oral communication skills
  • Familiarity with APIs, databases, and SQL
  • Hands-on experience with documentation and automation tools
  • Hands-on experience with version control tools
  • Ability to grasp complex technical concepts and make them easily understandable through documentation, while paying attention to detail
  • Prior work experience in the e-commerce domain is a plus

Careator Technologies Pvt Ltd
Posted by Pranita Panda
Hyderabad
3 - 8 yrs
₹8L - ₹15L / yr
Pega
Pega PRPC
Pega Developer
CSA
CSSA
  • Bachelor’s degree in Software Engineering or Computer Science.
  • Proven work experience as a Pega Developer.
  • Advanced knowledge of Pega PRPC 6.x/7.x/8.x.
  • Familiarity with J2EE, XML, Java, JSP, and EJB.
  • Familiarity with Scrum and Agile methodology.
  • Knowledge of coding languages including AngularJS, Java, and jQuery.
  • Knowledge of web technologies including JavaScript, CSS, HTML5, and XML.
  • Excellent project management skills.
  • Ability to troubleshoot complex software issues.
  • Mandatory Certifications: CSA, CSSA
  • Experience/Certifications in any Pega framework is a value-add
  • Healthcare Domain Experience
Leading Logistics-tech Platform

Agency job
via Unnati by Shishira Srinivasan
Mumbai, Delhi, Hyderabad, Bengaluru (Bangalore)
5 - 10 yrs
₹12L - ₹15L / yr
Sourcing
Sourcing management
Work with a new-age, reliable logistics platform aiming to disrupt on-time delivery with ultimate efficiency!
 
Our client is a leading intra-city delivery solutions provider that focuses on sorting out the largely unorganised logistics space in the country. It is also an aggregator of inter-city mini trucks and large transport vehicles for the Retail, Ecommerce and FMCG sectors. Their app is a platform used by their clients and truck owners, providing GPS-enabled vehicles, 24x7 support, economical pricing and multi-capacity loaders. Truckers can use their location and choose their transport jobs, while the companies get to pick the drivers as per their ratings.
 
With a fleet of over 44,000 trucks and clients like Britannia, Bisleri, Amazon, Flipkart, Metro Cash & Carry, Gati, Delhivery and more, the 5-year-old platform has raised over $20Mn across multiple funding rounds. Founded and led by IIT-KGP alumni, the company has operations in major cities across the country and is looking to make inroads into other sectors and verticals.
 
As a Regional Sourcing Manager, you will be responsible for owning all aspects of sourcing of a region for the company. You will be working closely with Regional Head and National Sourcing Manager towards creating a high performing team which can deliver targets for a high growth company.

What you will do:

  • Day-to-day sourcing for the region (Maharashtra and Gujarat)
  • Working closely with business development team and operations team to ensure the client expectations are properly met in terms of time and quality of sourcing
  • Building and developing the required team to achieve the targets
  • Driving efficiencies to maintain sourcing service levels and expanding margins of the region
  • Building and implementing processes to increase the overall efficiency and reliability of the region
  • Participating in cross-functional projects for high priority initiatives and bringing process and data-driven focused approach to support the organization's goals
  • Driving change management to ensure efficient implementations of any new strategic change
  • Collaborating closely with stakeholders like Operations, Finance, Product, Sales and New initiatives with the objective to drive stakeholder success
  • Ensuring excellent customer and partner experience

 

Desired Candidate Profile

What you need to have:

  • Graduates from Tier 1/2 Engineering colleges/Post Graduate from Tier 1/2 Management colleges with good academic performance
  • 4-8 years of frontline Operation/sourcing/channel management experience or P&L Management
  • Strong leadership qualities to lead a large geographically spread team
  • Ability to manage conflicts and good stakeholder management skills
  • Multi-tasking capabilities and inter-departmental coordination
  • Product-first mindset and relevant experience is a useful plus
  • Experience in channel management in the B2C space will be very useful
  • A strong team-sport background is a plus (showcasing long-term commitment and rigor)