Quantiphi
https://quantiphi.com
About
Quantiphi is an award-winning AI-first digital engineering company driven by the desire to reimagine and realize transformational opportunities at the heart of the business. Since its inception in 2013, Quantiphi has solved the toughest and most complex business problems by combining deep industry experience, disciplined cloud and data-engineering practices, and cutting-edge artificial intelligence research to achieve accelerated and quantifiable business results.
Locations: Bengaluru, Mumbai, and Trivandrum
Jobs at Quantiphi
We are looking for a hands-on Generative AI Engineer to design, build, and deploy enterprise-grade GenAI platform capabilities across multiple business units.
This role focuses on developing scalable, reusable AI components across the full stack—covering RAG systems, agent orchestration, LLM infrastructure, and GenAIOps—on GCP (primary) and Azure.
This is a core engineering role, not a research or client-facing position.
Key Responsibilities
- Design and build production-ready GenAI systems and platform components
- Develop and deploy RAG pipelines (data ingestion, embeddings, retrieval, APIs)
- Implement agent-based architectures (orchestration, routing, memory, workflows)
- Build and manage LLM infrastructure (model routing, caching, rate limiting, observability)
- Develop scalable APIs and services for AI capabilities
- Implement GenAIOps/MLOps practices (prompt management, evaluation, monitoring, deployment)
- Work with GCP services (Vertex AI, BigQuery, Cloud Run, GKE, Pub/Sub) to deploy solutions
- Ensure AI safety, governance, and compliance (PII protection, guardrails, auditability)
- Collaborate with cross-functional teams to deliver reusable, enterprise-grade solutions
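The RAG pipeline stages listed above (data ingestion, embeddings, retrieval, prompt assembly) can be sketched end to end. This is a toy illustration only: it substitutes a bag-of-words vector for a real embedding model and an in-memory list for a vector store, and the document texts are invented for the example.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a term-frequency bag of words.
    # A production pipeline would call an embedding model (e.g. on Vertex AI).
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyRAG:
    """Ingest -> embed -> retrieve -> assemble prompt, in miniature."""

    def __init__(self):
        self.store = []  # (chunk, vector) pairs; stands in for a vector store

    def ingest(self, chunks):
        for chunk in chunks:
            self.store.append((chunk, embed(chunk)))

    def retrieve(self, query: str, k: int = 2):
        q = embed(query)
        ranked = sorted(self.store, key=lambda cv: cosine(q, cv[1]), reverse=True)
        return [chunk for chunk, _ in ranked[:k]]

    def build_prompt(self, query: str) -> str:
        context = "\n".join(self.retrieve(query))
        return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

rag = TinyRAG()
rag.ingest([
    "Vertex AI hosts and serves foundation models.",
    "BigQuery is a serverless data warehouse.",
    "Cloud Run deploys containers without managing servers.",
])
print(rag.retrieve("Which service is a data warehouse?", k=1)[0])
```

The same four stages scale up by swapping each piece: a document loader behind `ingest`, an embedding model behind `embed`, a vector database behind `store`, and an LLM call on the assembled prompt.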
Required Skills & Experience
- Strong hands-on experience in Generative AI and RAG systems (production level)
- Experience building multi-agent or agentic AI systems
- Proficiency in Python and backend/API development
- Hands-on experience with GCP AI/ML ecosystem (Vertex AI, BigQuery, etc.)
- Solid understanding of LLM infrastructure and orchestration layers
- Experience with CI/CD pipelines and Infrastructure as Code (Terraform)
- Knowledge of GenAIOps/MLOps practices and model lifecycle management
- Understanding of AI safety, governance, and compliance
Nice to Have
- Experience with LangChain, LlamaIndex, or similar frameworks
- Familiarity with RAG evaluation tools (RAGAS, DeepEval)
- Experience combining knowledge graphs with RAG
- Experience in multi-cloud environments (GCP + Azure)
- Exposure to BFSI/regulated domains
What We’re Looking For
- Engineers who have built and deployed real-world GenAI systems at scale
- Strong backend and systems-thinking mindset
- Ability to work in fast-paced, enterprise environments
Role Type
- Individual Contributor (IC)
- Platform Engineering / Backend-heavy GenAI role
Build, deploy, and maintain production-grade AI/ML solutions for Fortune 500 enterprise clients on Google Cloud Platform. Hands-on role focused on shipping scalable AI systems across GenAI, agentic workflows, traditional ML, and computer vision.
Key Responsibilities:
Generative AI & Agentic Systems
- Design and build GenAI applications (RAG, agentic workflows, multi-agent systems)
- Develop intelligent systems with memory, planning, and reasoning capabilities
- Implement prompt engineering, context optimization, and evaluation frameworks
- Build observable and reliable multi-agent architectures
Traditional ML & Computer Vision
- Develop ML pipelines (forecasting, recommendation, classification, regression)
- Build production-grade computer vision solutions (document AI, image analysis)
- Perform feature engineering, model optimization, and benchmarking
MLOps & Production Engineering
- Own end-to-end ML lifecycle (CI/CD, testing, versioning, deployment)
- Build scalable APIs, microservices, and data pipelines
- Monitor models, detect drift, and implement A/B testing frameworks
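Drift detection of the kind mentioned above is often done with a Population Stability Index over a feature or score distribution. A minimal sketch, assuming simple equal-width binning and the common (but illustrative, not standardized) 0.2 alert threshold:

```python
import math

def psi(expected, actual, bins=5):
    """Population Stability Index between two numeric samples.
    Rule of thumb (illustrative): PSI > 0.2 suggests meaningful drift."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def frac(sample):
        counts = [0] * bins
        for x in sample:
            counts[sum(x > e for e in edges)] += 1
        n = len(sample)
        return [max(c / n, 1e-6) for c in counts]  # floor avoids log(0)

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1 * i for i in range(100)]          # training-time distribution
shifted = [5.0 + 0.1 * i for i in range(100)]     # serving-time distribution
print(round(psi(baseline, baseline), 4))  # identical samples -> 0.0
print(psi(baseline, shifted) > 0.2)       # shifted sample -> drift flagged
```

In production this check would run on a schedule against serving logs, with the baseline histogram frozen at training time.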
Knowledge Solutions
- Architect knowledge graphs and semantic search systems
- Implement hybrid retrieval (vector + keyword search)
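Merging the vector and keyword result lists in hybrid retrieval is commonly done with Reciprocal Rank Fusion; a minimal sketch (the document IDs and the customary `k=60` constant are illustrative):

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal Rank Fusion: score(d) = sum over lists of 1 / (k + rank).
    Documents appearing high in several rankings float to the top."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["doc3", "doc1", "doc7"]    # semantic-similarity order (assumed)
keyword_hits = ["doc1", "doc9", "doc3"]   # keyword-relevance order (assumed)
print(rrf_fuse([vector_hits, keyword_hits]))  # doc1 and doc3, seen in both lists, rank first
```

RRF needs only ranks, not scores, which is why it is a convenient fusion choice when the vector and keyword engines score on incompatible scales.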
Client Collaboration
- Present technical solutions to enterprise clients
- Collaborate with architects, data engineers, and business teams
Required Skills & Experience
- 3–6 years of hands-on ML Engineering experience
- Strong Python and software engineering fundamentals
- Experience shipping production ML systems on cloud (GCP preferred)
- Experience across GenAI, Traditional ML, Computer Vision
- MLOps experience and RAG-based systems
Preferred
- GCP Professional ML Engineer certification
- Knowledge graphs / semantic search experience
- Experience in regulated industries (Healthcare / BFSI)
- Open-source or technical publications
Role & Responsibilities
- Develop and deliver automation software to build and improve platform functionality
- Ensure reliability, availability, and manageability of applications and cloud platforms
- Champion adoption of Infrastructure as Code (IaC) practices
- Design and build self-service, self-healing, monitoring, and alerting platforms
- Automate development and testing workflows through CI/CD pipelines (Git, Jenkins, SonarQube, Artifactory, Docker containers)
- Build and manage container hosting platforms using Kubernetes
Requirements
- Strong experience deploying and maintaining GCP cloud infrastructure
- Well-versed in service-oriented and cloud-based architecture design patterns
- Knowledge of cloud services including compute, storage, networking, messaging, and automation tools (e.g., CloudFormation/Terraform equivalents)
- Experience with relational and NoSQL databases (Postgres, Cassandra)
- Hands-on experience with automation/configuration tools (Puppet, Chef, Ansible, Terraform)
Additional Skills
- Strong Linux system administration and troubleshooting skills
- Programming/scripting exposure (Bash, Python, Core Java, or Scala)
- CI/CD pipeline experience (Jenkins, Git, Maven, etc.)
- Experience integrating solutions in multi-region environments
- Familiarity with Agile/Scrum/DevOps methodologies
We are seeking a skilled Data Engineer to join the AI Platform Capabilities team supporting the UDP Uplift program.
In this role, you will design, build, and test standardized data and AI platform capabilities across a multi-cloud environment (Azure & GCP).
You will collaborate closely with AI use case teams to develop:
- Scalable data pipelines
- Reusable data products
- Foundational data infrastructure
Your work will support advanced AI solutions such as:
- GenAI
- RAG (Retrieval-Augmented Generation)
- Document Intelligence
Key Responsibilities
- Design and develop scalable ETL/ELT pipelines for AI workloads
- Build and optimize data pipelines for structured & unstructured data
- Enable context processing & vector store integrations
- Support streaming data workflows and batch processing
- Ensure adherence to enterprise data models, governance, and security standards
- Collaborate with DataOps, MLOps, Security, and business teams (LBUs)
- Contribute to data lifecycle management for AI platforms
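The ETL shape described in the responsibilities above can be sketched with an in-memory SQLite database standing in for the warehouse target; the records, fields, and schema here are invented for illustration:

```python
import sqlite3

RAW_EVENTS = [  # extract: stand-in for a source-system export
    {"id": 1, "doc": " Quarterly Report ", "lang": "EN"},
    {"id": 2, "doc": "", "lang": "en"},            # empty text: dropped below
    {"id": 3, "doc": "Claims Policy", "lang": "en"},
]

def transform(rows):
    # Transform: trim whitespace, normalize language codes, drop empty docs.
    for row in rows:
        doc = row["doc"].strip()
        if doc:
            yield (row["id"], doc, row["lang"].lower())

def load(rows, conn):
    # Load: idempotent table creation, then a batched insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS docs (id INTEGER PRIMARY KEY, doc TEXT, lang TEXT)"
    )
    conn.executemany("INSERT INTO docs VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stands in for BigQuery / the warehouse
load(transform(RAW_EVENTS), conn)
print(conn.execute("SELECT COUNT(*) FROM docs").fetchone()[0])  # 2 rows survive cleansing
```

At platform scale the same three functions map onto a loader from the source system, a PySpark or Dataflow transform, and a warehouse sink, typically orchestrated by Cloud Composer.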
Required Skills
- 5–7 years of hands-on experience in Data Engineering
- Strong expertise in Python and advanced SQL
- Experience with GCP and/or Azure cloud-native data services
- Hands-on experience with PySpark / Spark SQL
- Experience building data pipelines for ML/AI workloads
- Understanding of CI/CD, Git, and Agile methodologies
- Knowledge of data quality, governance, and security practices
- Strong collaboration and stakeholder management skills
Nice-to-Have Skills
- Experience with Vector Databases / Vector Stores (for RAG pipelines)
- Familiarity with MLOps / GenAIOps concepts (feature stores, model registries, prompt management)
- Exposure to Knowledge Graphs / Context Stores / Document Intelligence workflows
- Experience with DBT (Data Build Tool)
- Knowledge of Infrastructure-as-Code (Terraform)
- Experience in multi-cloud deployments (Azure + GCP)
- Familiarity with event-driven systems (Kafka, Pub/Sub) & API integrations
Ideal Candidate Profile
- Strong data engineering foundation with AI/ML exposure
- Experience working in multi-cloud environments
- Ability to build production-grade, scalable data systems
- Comfortable working in cross-functional, fast-paced environments
Responsible for developing, enhancing, modifying, and maintaining chatbot applications in the Global Markets environment. The role involves designing, coding, testing, debugging, and documenting conversational AI solutions, along with supporting activities aligned to the corporate systems architecture.
You will work closely with business partners to understand requirements, analyze data, and deliver optimal, market-ready conversational AI and automation solutions.
Key Responsibilities
- Design, develop, test, debug, and maintain chatbot and virtual agent applications
- Collaborate with business stakeholders to define and translate requirements into technical solutions
- Analyze large volumes of conversational data to improve chatbot accuracy and performance
- Develop automation workflows for data handling and refinement
- Train and optimize chatbots using historical chat logs and user-generated content
- Ensure solutions align with enterprise architecture and best practices
- Document solutions, workflows, and technical designs clearly
Required Skills
- Hands-on experience in developing virtual agents (chatbots/voicebots) and Natural Language Processing (NLP)
- Experience with one or more AI/NLP platforms such as:
- Dialogflow, Amazon Lex, Alexa, Rasa, LUIS, Kore.AI
- Microsoft Bot Framework, IBM Watson, Wit.ai, Salesforce Einstein, Converse.ai
- Strong programming knowledge in Python, JavaScript, or Node.js
- Experience training chatbots using historical conversations or large-scale text datasets
- Practical knowledge of:
- Formal syntax and semantics
- Corpus analysis
- Dialogue management
- Strong written communication skills
- Strong problem-solving ability and willingness to learn emerging technologies
Nice-to-Have Skills
- Understanding of conversational UI and voice-based processing (Text-to-Speech, Speech-to-Text)
- Experience building voice apps for Amazon Alexa or Google Home
- Experience with Test-Driven Development (TDD) and Agile methodologies
- Ability to design and implement end-to-end pipelines for AI-based conversational applications
- Experience in text mining, hypothesis generation, and historical data analysis
- Strong knowledge of regular expressions for data cleaning and preprocessing
- Understanding of API integrations, SSO, and token-based authentication
- Experience writing unit test cases as per project standards
- Knowledge of HTTP, REST APIs, sockets, and web services
- Ability to perform keyword and topic extraction from chat logs
- Experience training and tuning topic modeling algorithms such as LDA and NMF
- Understanding of classical Machine Learning algorithms and appropriate evaluation metrics
- Experience with NLP frameworks such as NLTK and spaCy
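Keyword extraction from chat logs, as listed above, can be prototyped with nothing more than a regex tokenizer and frequency counts; the stopword list and sample logs here are illustrative, and a real system would move on to TF-IDF or topic models such as LDA/NMF:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "i", "to", "my", "is", "it", "and", "you", "for", "in"}

def top_keywords(chats, n=3):
    """Crude keyword extraction: regex tokenize, drop stopwords, count."""
    tokens = []
    for chat in chats:
        tokens += [t for t in re.findall(r"[a-z']+", chat.lower())
                   if t not in STOPWORDS]
    return [word for word, _ in Counter(tokens).most_common(n)]

logs = [
    "I can't log in to my trading account",
    "My account balance is wrong",
    "Reset password for trading account",
]
print(top_keywords(logs, n=2))  # 'account' and 'trading' dominate these logs
```

The same token stream feeds directly into a document-term matrix for LDA or NMF once counting alone stops being informative.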
We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.
Key Responsibilities
- Collaborate with business users and stakeholders to understand business processes and data requirements
- Design and implement dimensional data models, including fact and dimension tables
- Identify, design, and implement data transformation and cleansing logic
- Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
- Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
- Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
- Provide high-level design, research, and effort estimates for data integration initiatives
- Provide production support for ETL processes to ensure data availability and SLA adherence
- Analyze and resolve data pipeline and performance issues
- Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
- Translate business requirements into well-defined technical data specifications
- Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
- Define and document BI usage through use cases, prototypes, testing, and deployment
- Support and enhance data governance and data quality processes
- Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
- Train and support business users, IT analysts, and developers
- Lead and collaborate with teams spread across multiple locations
Required Skills & Qualifications
- Bachelor’s degree in Computer Science or a related field, or equivalent work experience
- 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
- Strong expertise in data warehousing concepts, tools, and best practices
- Excellent SQL skills
- Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
- Hands-on experience with Google Cloud Platform (GCP) services, including:
- BigQuery
- Cloud SQL
- Cloud Composer (Airflow)
- Dataflow
- Dataproc
- Cloud Functions
- Google Cloud Storage (GCS)
- Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
- Strong experience integrating data using APIs, XML, JSON, and similar formats
- In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
- Solid understanding of SDLC, Agile, and Scrum methodologies
- Strong problem-solving, multitasking, and organizational skills
- Experience handling large-scale datasets and database design
- Strong verbal and written communication skills
- Experience leading teams across multiple locations
Good to Have
- Experience with SSRS and SSIS
- Exposure to AWS and/or Azure cloud platforms
- Experience working with enterprise BI and analytics tools
Why Join Us
- Opportunity to work on large-scale, enterprise data platforms
- Exposure to modern cloud-native data engineering technologies
- Collaborative environment with strong stakeholder interaction
- Career growth and leadership opportunities
Company Profile
Quantiphi is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.
Some company highlights:
- Quantiphi has seen 2.5x growth YoY since its inception in 2013.
- Winner of the "Machine Learning Partner of the Year" award from Google for two consecutive years - 2017 and 2018.
- Winner of the "Social Impact Partner of the Year" award from Google for 2019.
- Headquartered in Boston, with 700+ data science professionals across different offices.
For more details, visit our website (http://www.quantiphi.com/) or our LinkedIn page (https://www.linkedin.com/company/quantiphi/).
Job Description
Role: Associate Tech Architect / Tech Architect – ReactJS + Python + AWS
Experience Level: 7-13 Years
Work location: Mumbai & Bangalore
We are looking for an experienced full-stack developer (ReactJS and Python) who can help create dynamic software applications for our clients. In this role, you will be responsible for gathering requirements from clients, writing and testing scalable code, and developing front-end and back-end components.
Technologies worked on:
ReactJS, Python, AWS
Requirement Description:
- Full Stack developer with experience in ReactJS, Python, API Gateway, Fargate and ECS
- Experienced with tools such as Git, Maven, and JFrog
- Should have a solid understanding of object-oriented programming (OOP)
- Experienced in performing unit testing and integration testing, with solid experience in Agile-based development
- Expertise in developing enterprise-level web applications and REST and SOAP APIs using MicroServices, with demonstrable production-scale experience
- Demonstrate strong design and programming skills using JSON, Web Services, XML, XSLT, PL/SQL in Unix and Windows environments
- Strong background working with Linux/UNIX environments and strong Shell scripting experience
- Working knowledge of SQL or NoSQL databases
- Understand Architecture Requirements and ensure effective design, development, validation, and support activities
- Understanding of core AWS services, uses, and basic AWS architecture best practices
- Proficiency in developing, deploying, and debugging cloud-based applications using AWS
- Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
- Ability to identify key features of AWS services
- Identify bottlenecks and bugs, and recommend solutions by comparing the advantages and disadvantages of custom development
- Should contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Execute strong collaboration and communication skills within distributed project teams
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
Similar companies
About the company
REConnect Energy is a leading digital energy platform startup focused on climate resilience solutions. Headquartered in Bangalore, India, with offices in London, Gurgaon, and Mumbai, the company has established itself as India's largest tech-enabled service provider in predictive analytics and demand-supply aggregation for the energy sector. REConnect Energy develops AI and Grid Automation software products for renewables and energy utilities, with a core focus on efficient asset and grid management, climate risk mitigation, and real-time asset visibility.
REConnect Energy offers a comprehensive range of services, including predictive analytics for electric utilities, renewable energy forecasting and grid integration, machine learning and AI for energy markets, and an OTC marketplace for clean energy. The company also specializes in environmental markets, renewable energy policies, and energy dispatch and aggregation. Positioned at the forefront of the energy transition, REConnect Energy addresses complex challenges in climate data and analytics, driving innovation in the renewable energy sector.
About the company
Wishup is India’s largest remote work platform (since 2017), connecting global businesses with top remote professionals in roles such as Virtual Assistants, Operations/Admin Managers, Executive Assistants, Project Managers, Bookkeepers, and Accountants. With a stringent 0.1% acceptance rate, each professional is upskilled and managed via our AI-based remote work tool.
Backed by marquee investors (Orios Ventures, Inflection Point Ventures, 500 Startups, and Tracxn Labs), Wishup’s leadership team includes alumni from premier institutes like IIT Madras, IIM Ahmedabad, IIT Kanpur, and DCE.
About the company
Actosoft is a software development and digital marketing company that offers complete IT solutions. We are part of the digitally fluent force that is improving economies, growing businesses, and creating success stories, just like ours.
About the company
Ven Analytics – derived from the Sanskrit word वेण्, meaning "to know". We are a media-focused analytics company, and we build applications and dashboards beyond imagination. Our services and products are laser-focused on solving analytics problems for broadcasters, agencies, and advertisers.
Agile Startup
Our agility helps us cut through hierarchy and rapidly deploy solutions for even the bottommost contributor
Best of Both
A thorough understanding of business and technology sets us apart from our competitors
Young and Vibrant
Being a young organization, we are eager to take up new challenges and make sure we deliver them custom-built, as requested
VISION
Our vision is to become the destination AdTech company for the entire M&E sector
MISSION
We are on a perpetual mission to solve big problems and deliver exponential value to our clients
About the company
At Hunarstreet Technologies Pvt Ltd, we specialize in delivering India’s fastest hiring solutions, tailored to meet the unique needs of businesses across various industries. Our mission is to connect companies with exceptional talent, enabling them to achieve their growth and operational goals swiftly and efficiently.
We achieve an 87% success rate in the relevancy of candidates to the job position and a 62% success rate in closing the positions shared with us.
About the company
Hardwin Software Solutions is a full-service software development company that turns business ideas into enterprise-grade digital solutions. With over 14 years of experience, the company operates out of India, the USA, and the Middle East, serving clients ranging from startups to global enterprises.
Hardwin offers a wide range of services including custom web and mobile app development, cloud-based technology transformation, DevOps & SRE, cybersecurity, managed IT services, AI/ML solutions, IoT integration, and data analytics. Their strength lies in delivering scalable, secure, and tailored software products and systems across industries such as finance, healthcare, e-commerce, real estate, logistics, and travel, making them a one-stop partner for end-to-end digital transformation.
About the company
Mira is building infrastructure for trustless verification of intelligence. Mira’s verification system significantly enhances AI accuracy by mitigating hallucinations and bias, thereby enabling truly reliable and autonomous AI systems.
About the company
We specialize in both existing and emerging technologies, powering your application to perform optimally. Our expertise includes open-source XAMPP & WAMP, Java, PHP, .NET, and more. We also employ several custom and open-source tools for performance optimization and speeding up software deployment. Deploying the latest technologies, we deliver solutions that offer high levels of consistency in quality and performance.
About the company
I2B is a venture studio building AI-first consumer and B2B technology companies. We partner with founders to build scalable enterprise platforms.