

Quantiphi
https://quantiphi.com

About
Quantiphi is an award-winning AI-first digital engineering company driven by the desire to reimagine and realize transformational opportunities at the heart of the business. Since its inception in 2013, Quantiphi has solved the toughest and most complex business problems by combining deep industry experience, disciplined cloud and data-engineering practices, and cutting-edge artificial intelligence research to achieve accelerated and quantifiable business results.
Locations: Bengaluru, Mumbai, and Trivandrum
Jobs at Quantiphi
Role & Responsibilities
- Develop and deliver automation software to build and improve platform functionality
- Ensure reliability, availability, and manageability of applications and cloud platforms
- Champion adoption of Infrastructure as Code (IaC) practices
- Design and build self-service, self-healing, monitoring, and alerting platforms
- Automate development and testing workflows through CI/CD pipelines (Git, Jenkins, SonarQube, Artifactory, Docker containers)
- Build and manage container hosting platforms using Kubernetes
Requirements
- Strong experience deploying and maintaining GCP cloud infrastructure
- Well-versed in service-oriented and cloud-based architecture design patterns
- Knowledge of cloud services including compute, storage, networking, messaging, and automation tools (e.g., CloudFormation/Terraform equivalents)
- Experience with relational and NoSQL databases (Postgres, Cassandra)
- Hands-on experience with automation/configuration tools (Puppet, Chef, Ansible, Terraform)
Additional Skills
- Strong Linux system administration and troubleshooting skills
- Programming/scripting exposure (Bash, Python, Core Java, or Scala)
- CI/CD pipeline experience (Jenkins, Git, Maven, etc.)
- Experience integrating solutions in multi-region environments
- Familiarity with Agile/Scrum/DevOps methodologies
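The self-healing and alerting responsibilities above can be sketched as a small control loop. This is only an illustrative sketch: `check_health` and `restart` are hypothetical stand-ins for real probes and orchestration calls (e.g. a Kubernetes rollout restart), not part of any actual platform.

```python
# Minimal self-healing loop: probe a service, restart on failure,
# and raise an alert after repeated failures. The probe and restart
# callables are hypothetical stand-ins for real infrastructure calls.

def heal(check_health, restart, max_retries=3):
    """Return ("healthy", attempts) or ("alert", attempts)."""
    for attempt in range(1, max_retries + 1):
        if check_health():
            return ("healthy", attempt)
        restart()  # e.g. trigger a `kubectl rollout restart` in practice
    return ("alert", max_retries)

# Simulate a service that recovers after one restart.
state = {"up": False}
status, attempts = heal(
    check_health=lambda: state["up"],
    restart=lambda: state.update(up=True),
)
```

In a real platform the loop would run on a schedule and the alert branch would page an on-call engineer rather than just return.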
We are seeking an experienced Associate Architect to lead technical design and architectural decisions across diverse client projects. This role combines hands-on technical leadership with strategic architectural planning, ensuring scalable, secure, and maintainable enterprise solutions.
Key Responsibilities
- Design end-to-end solutions for complex client requirements across multiple technology stacks
- Define technical standards, best practices, and architectural guidelines
- Conduct architecture reviews and ensure adherence to design principles
- Create and maintain architecture documentation and design artifacts
- Mentor and guide development teams (Java, .NET, Node.js)
- Lead technical discussions and architecture decision-making sessions
- Provide leadership during critical project phases
- Collaborate with clients to understand business needs and constraints
- Ensure solutions meet performance, scalability, and security requirements
- Contribute hands-on to critical components and POCs
- Drive innovation and adoption of emerging technologies
- Enable cross-team knowledge sharing and alignment
- Participate in hiring, technical interviews, and evaluations
- Present architecture solutions to stakeholders and clients
- Support pre-sales and solution design initiatives
Must-Have Skills
Architectural & Leadership Expertise
- Software Architecture: 7+ years designing enterprise applications, microservices, and distributed systems
- Tech Stack Expertise: Strong in at least 2 — Java/Spring, .NET Core/Framework, Node.js
- Design Principles: SOLID, Domain-Driven Design (DDD)
- Architectural Patterns: CQRS, Event Sourcing, Saga
- API Design: REST, GraphQL, versioning, integration patterns
- System Design: Scalability, load balancing, caching, performance optimization
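As a rough illustration of the Event Sourcing pattern listed above: state is never stored directly but is rebuilt by replaying an append-only event log. The account example and event names below are invented for the sketch.

```python
# Toy event-sourcing sketch: current state is derived by folding an
# append-only event log, never mutated in place.

def apply(balance, event):
    kind, amount = event
    if kind == "deposited":
        return balance + amount
    if kind == "withdrawn":
        return balance - amount
    raise ValueError(f"unknown event: {kind}")

def replay(events, initial=0):
    """Fold the event log into the current state."""
    balance = initial
    for event in events:
        balance = apply(balance, event)
    return balance

log = [("deposited", 100), ("withdrawn", 30), ("deposited", 5)]
current = replay(log)  # 75
```

The same replay gives any historical state for free (replay a prefix of the log), which is the main audit benefit the pattern is usually chosen for.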
Technical Leadership
- Proven experience mentoring mid-level developers (4–6 years exp)
- Strong code review and architectural governance experience
- Ability to evaluate tools, frameworks, and make strategic decisions
- Experience collaborating with DevOps, QA, and business teams
Cloud & Infrastructure
- Containerization: Docker (advanced)
- Orchestration: Kubernetes
- Cloud Platforms: AWS / Azure / GCP
- DevOps: CI/CD pipelines, Infrastructure as Code, monitoring & observability
Good-to-Have Skills
Advanced Architecture
- Event-driven architecture (Kafka, RabbitMQ, pub-sub systems)
- Database architecture (Polyglot persistence, sharding, CQRS)
- Security (OAuth2, OIDC, Zero Trust architecture)
- Performance engineering (APM tools, load testing, capacity planning)
Business & Client Engagement
- Translate business requirements into scalable technical solutions
- Experience presenting solutions to client stakeholders
- Technical estimation, capacity planning, and risk assessment
- Consulting experience during client acquisition / pre-sales
We are looking for a Mid-Level Java Developer to design, develop, and maintain scalable microservices for diverse client projects. You will work on high-performance enterprise applications, ensuring reliability and seamless deployment in containerized environments.
Key Responsibilities
- Develop and maintain microservices across multiple domains
- Design and implement robust REST APIs based on business requirements
- Write unit and integration tests to ensure high code quality
- Build scalable and portable solutions for multi-environment deployment
- Collaborate with cross-functional teams and client stakeholders
- Adapt to different tech stacks and domain requirements
- Participate in code reviews and enforce coding standards
- Support deployment and troubleshoot issues in client environments
Must-Have Skills
Core Technical Expertise
- Java: 4+ years (Java 8+) — Streams, Lambda, Concurrency, Collections
- Frameworks: Spring Boot, Spring Framework, Spring Security
- REST APIs: Design, development, versioning
- Design Patterns: Factory, Singleton, Observer, Strategy, Command
- Testing: JUnit 5/TestNG, Mockito, integration testing, TDD
Microservices & Deployment
- Containerization: Docker
- Architecture: Microservices, distributed systems, service decomposition
- Design: Environment-agnostic systems, configuration externalization
- Build Tools: Maven / Gradle
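One of the patterns listed under Core Technical Expertise, Strategy, can be sketched in a few lines. Python is used here only for brevity; the shape maps directly to a Java interface with interchangeable implementations. The pricing functions are invented for the example.

```python
# Strategy pattern sketch: swap pricing behavior without changing the
# caller. The pricing functions are illustrative, not a real API.

def regular_price(amount):
    return amount

def discounted_price(amount):
    # Hypothetical 10% promotional discount.
    return amount * 0.9

def checkout(amount, pricing_strategy):
    # The caller depends only on the strategy's call signature,
    # mirroring a Java interface with multiple implementations.
    return round(pricing_strategy(amount), 2)

price = checkout(200, discounted_price)
```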
Good-to-Have Skills
Advanced Technical
- Kubernetes, Docker orchestration
- Cloud: Alibaba Cloud / Azure / GCP
- Messaging: Apache Kafka / RabbitMQ
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- API Gateways: Kong / Spring Cloud Gateway / Zuul
Development & Operations
- CI/CD: Jenkins, GitLab CI/CD
- Spring Cloud (Eureka, Config Server, Circuit Breaker)
- Monitoring: Micrometer, Prometheus, ELK
- Performance tuning & profiling
- Security best practices
Client-Facing
- Experience in service-based organizations
- Ability to adapt across domains
- Knowledge of industry standards & compliance
We are seeking a skilled Data Engineer to join the AI Platform Capabilities team supporting the UDP Uplift program.
In this role, you will design, build, and test standardized data and AI platform capabilities across a multi-cloud environment (Azure & GCP).
You will collaborate closely with AI use case teams to develop:
- Scalable data pipelines
- Reusable data products
- Foundational data infrastructure
Your work will support advanced AI solutions such as:
- GenAI
- RAG (Retrieval-Augmented Generation)
- Document Intelligence
Key Responsibilities
- Design and develop scalable ETL/ELT pipelines for AI workloads
- Build and optimize data pipelines for structured & unstructured data
- Enable context processing & vector store integrations
- Support streaming data workflows and batch processing
- Ensure adherence to enterprise data models, governance, and security standards
- Collaborate with DataOps, MLOps, Security, and business teams (LBUs)
- Contribute to data lifecycle management for AI platforms
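The ETL/ELT responsibility above boils down to three stage boundaries, sketched here in plain Python with invented sample records; a production pipeline would use PySpark or a managed cloud service instead.

```python
# Minimal ETL sketch: extract records, transform (clean + filter),
# load into a destination. Shows the stage boundaries only; the
# source records and destination are invented for illustration.

def extract():
    # Stand-in for reading from a source system.
    return [
        {"id": 1, "text": " Hello "},
        {"id": 2, "text": ""},
        {"id": 3, "text": "AI "},
    ]

def transform(records):
    cleaned = [{**r, "text": r["text"].strip()} for r in records]
    return [r for r in cleaned if r["text"]]  # drop empty rows

def load(records, destination):
    destination.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)  # 2 rows loaded
```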
Required Skills
- 5–7 years of hands-on experience in Data Engineering
- Strong expertise in Python and advanced SQL
- Experience with GCP and/or Azure cloud-native data services
- Hands-on experience with PySpark / Spark SQL
- Experience building data pipelines for ML/AI workloads
- Understanding of CI/CD, Git, and Agile methodologies
- Knowledge of data quality, governance, and security practices
- Strong collaboration and stakeholder management skills
Nice-to-Have Skills
- Experience with Vector Databases / Vector Stores (for RAG pipelines)
- Familiarity with MLOps / GenAIOps concepts (feature stores, model registries, prompt management)
- Exposure to Knowledge Graphs / Context Stores / Document Intelligence workflows
- Experience with DBT (Data Build Tool)
- Knowledge of Infrastructure-as-Code (Terraform)
- Experience in multi-cloud deployments (Azure + GCP)
- Familiarity with event-driven systems (Kafka, Pub/Sub) & API integrations
Ideal Candidate Profile
- Strong data engineering foundation with AI/ML exposure
- Experience working in multi-cloud environments
- Ability to build production-grade, scalable data systems
- Comfortable working in cross-functional, fast-paced environments
We are looking for a Mid-Level .NET Developer to design, develop, and maintain scalable microservices for enterprise applications. The role involves working on high-performance, reliable systems deployed in containerized environments.
Key Responsibilities:
- Develop and maintain scalable .NET microservices
- Build robust Web APIs with proper validation, error handling, and security
- Write unit and integration tests to ensure code quality
- Design portable and environment-agnostic solutions
- Collaborate with cross-functional teams and client stakeholders
- Optimize performance and implement caching strategies
- Follow security best practices for enterprise applications
- Participate in code reviews and maintain coding standards
- Support deployment and troubleshoot issues in client environments
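The caching responsibility above can be illustrated with a minimal time-to-live cache. The sketch is in Python for brevity; in ASP.NET Core the equivalent role is typically played by `IMemoryCache` or a distributed cache. The keys and values are invented.

```python
# Minimal TTL cache sketch: entries expire after a fixed lifetime and
# are evicted lazily on read. Illustrative only, not production code.
import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self.store.get(key)
        if entry is None:
            return default
        value, expiry = entry
        if time.monotonic() > expiry:
            del self.store[key]  # lazily evict the stale entry
            return default
        return value

cache = TTLCache(ttl_seconds=60)
cache.set("user:42", {"name": "Ada"})
```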
Must-Have Skills:
Core Technical Expertise:
- 4+ years of experience with .NET Core (3.1+) / .NET 5+ and C# (8+)
- Strong hands-on experience with ASP.NET Core Web API & Entity Framework Core
- Experience building REST APIs and middleware
- Strong understanding of SOLID principles, Dependency Injection, Repository pattern
- Experience with unit testing (xUnit / NUnit / MSTest), Moq, integration testing
Microservices & Deployment:
- Hands-on experience with Docker
- Understanding of microservices architecture & distributed systems
- Experience with configuration management (appsettings.json, IConfiguration)
- Knowledge of NuGet and dependency management
Good-to-Have Skills:
Advanced Technical:
- Experience with .NET 6/7/8, Minimal APIs, gRPC, SignalR
- Advanced EF Core, Dapper, database migrations
- Kubernetes and container orchestration
- Cloud platforms: Azure / GCP / Alibaba Cloud
- Message brokers: Azure Service Bus, RabbitMQ, Kafka
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- API Gateways: Azure API Management, Kong
Development & Operations:
- CI/CD tools: Azure DevOps, Jenkins, GitHub Actions
- Monitoring: Application Insights, Serilog, Prometheus
- Security: HTTPS, CORS, input validation, secure coding
- Background services: Hangfire, Quartz.NET
Client-Facing Experience:
- Experience in service-based organizations
- Ability to adapt to multiple domains
- Understanding of industry standards and compliance
Responsible for developing, enhancing, modifying, and maintaining chatbot applications in the Global Markets environment. The role involves designing, coding, testing, debugging, and documenting conversational AI solutions, along with supporting activities aligned to the corporate systems architecture.
You will work closely with business partners to understand requirements, analyze data, and deliver optimal, market-ready conversational AI and automation solutions.
Key Responsibilities
- Design, develop, test, debug, and maintain chatbot and virtual agent applications
- Collaborate with business stakeholders to define and translate requirements into technical solutions
- Analyze large volumes of conversational data to improve chatbot accuracy and performance
- Develop automation workflows for data handling and refinement
- Train and optimize chatbots using historical chat logs and user-generated content
- Ensure solutions align with enterprise architecture and best practices
- Document solutions, workflows, and technical designs clearly
Required Skills
- Hands-on experience in developing virtual agents (chatbots/voicebots) and Natural Language Processing (NLP)
- Experience with one or more AI/NLP platforms such as:
  - Dialogflow, Amazon Lex, Alexa, Rasa, LUIS, Kore.AI
  - Microsoft Bot Framework, IBM Watson, Wit.ai, Salesforce Einstein, Converse.ai
- Strong programming knowledge in Python, JavaScript, or Node.js
- Experience training chatbots using historical conversations or large-scale text datasets
- Practical knowledge of:
  - Formal syntax and semantics
  - Corpus analysis
  - Dialogue management
- Strong written communication skills
- Strong problem-solving ability and willingness to learn emerging technologies
Nice-to-Have Skills
- Understanding of conversational UI and voice-based processing (Text-to-Speech, Speech-to-Text)
- Experience building voice apps for Amazon Alexa or Google Home
- Experience with Test-Driven Development (TDD) and Agile methodologies
- Ability to design and implement end-to-end pipelines for AI-based conversational applications
- Experience in text mining, hypothesis generation, and historical data analysis
- Strong knowledge of regular expressions for data cleaning and preprocessing
- Understanding of API integrations, SSO, and token-based authentication
- Experience writing unit test cases as per project standards
- Knowledge of HTTP, REST APIs, sockets, and web services
- Ability to perform keyword and topic extraction from chat logs
- Experience training and tuning topic modeling algorithms such as LDA and NMF
- Understanding of classical Machine Learning algorithms and appropriate evaluation metrics
- Experience with NLP frameworks such as NLTK and spaCy
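Two of the skills above, regular expressions for cleaning and keyword extraction from chat logs, combine into a small sketch. The stopword list and sample utterances are invented for illustration.

```python
# Sketch of regex-based cleaning plus keyword extraction from chat
# logs: normalize utterances, strip noise, count candidate keywords.
import re
from collections import Counter

STOPWORDS = {"the", "a", "to", "is", "my", "i", "please"}  # toy list

def clean(utterance):
    text = utterance.lower()
    text = re.sub(r"[^a-z\s]", " ", text)   # drop punctuation/digits
    return re.sub(r"\s+", " ", text).strip()

def top_keywords(utterances, n=3):
    counts = Counter()
    for u in utterances:
        counts.update(w for w in clean(u).split() if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

logs = ["Reset my password!", "password reset please", "I forgot my password."]
keywords = top_keywords(logs)  # ["password", "reset", "forgot"]
```

In a real pipeline this frequency step would feed intent design or a topic model (e.g. LDA) rather than be used directly.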

We are hiring an Associate Technical Architect with strong expertise in Azure-based data platforms to design scalable data lakes, data warehouses, and enterprise data pipelines, while working with global teams.
Key Responsibilities
- Design and implement scalable data lake, data warehouse, and lakehouse architectures on Azure
- Build resilient data pipelines using Azure services
- Architect and optimize cloud-based data platforms
- Improve large-scale data processing and query performance
- Collaborate with engineering teams, QA, product managers, and stakeholders
- Communicate technical roadmap, risks, and mitigation strategies
Must-Have Skills:
- 6+ years of experience in Azure Data Engineering / Data Architecture
Azure Data Platform
- Experience with Azure Data Factory
- Hands-on with Azure Databricks and PySpark
- Experience with Azure Data Lake Storage
- Knowledge of Azure Synapse or Azure SQL for data warehousing
Programming & Data Skills
- Strong programming skills in Python and PySpark
- Advanced SQL with query optimization and performance tuning
- Experience building ETL / ELT data pipelines
Data Architecture Knowledge
- Understanding of MPP databases
- Knowledge of partitioning, indexing, and performance optimization
- Experience with data modeling (dimensional, normalized, lakehouse)
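The partitioning point above can be shown with the simplest possible hash-partitioning sketch: rows sharing a key always land in the same partition, which is what lets an MPP engine prune scans and parallelize work. The row shape is invented.

```python
# Hash-partition rows by a key column: every row with the same key
# value is routed to the same bucket, enabling partition pruning and
# parallel processing downstream. Illustrative sketch only.

def partition(rows, key, n_partitions):
    buckets = [[] for _ in range(n_partitions)]
    for row in rows:
        buckets[hash(row[key]) % n_partitions].append(row)
    return buckets

rows = [{"k": "a"}, {"k": "b"}, {"k": "a"}, {"k": "c"}]
buckets = partition(rows, "k", 2)
```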
Cloud Fundamentals
- Azure security, networking, scalability, and disaster recovery
- Experience with on-premise to Azure migrations
Certification (Preferred)
- Azure Data Engineer or Azure Solutions Architect certification
Good-to-Have Skills
- Domain experience in FSI, Retail, or CPG
- Exposure to data governance tools
- Experience with BI tools such as Power BI or Tableau
- Familiarity with Terraform, CI/CD pipelines, or Azure DevOps
- Experience with NoSQL databases such as Cosmos DB or MongoDB
Soft Skills
- Strong problem-solving and analytical thinking
- Good communication and stakeholder management
- Ability to translate technical concepts into business outcomes
- Experience working with global or distributed teams
We are seeking a Senior Machine Learning Engineer to support the development and deployment of advanced AI capabilities within the PHI ecosystem.
This role focuses on the execution of Generative AI tasks, including model integration and agent deployment. The candidate will be responsible for building RAG-based workflows and ensuring AI interactions remain grounded and accurate using Google Cloud AI tools.
Key Responsibilities
1. GenAI Integration
- Develop and maintain integrations with Gemini 1.5 Pro and Flash models
- Use the Google Gen AI SDK for Python to build and manage model integrations
2. Agent Deployment
- Assist in deploying AI agents to Vertex AI Agent Engine
- Work with the Agent Development Kit (ADK) for agent lifecycle management
3. RAG & Embeddings
- Generate and manage text and multimodal embeddings
- Support semantic search and Retrieval-Augmented Generation (RAG) pipelines
4. Testing & Quality
- Run evaluation scripts to verify model output quality
- Ensure models follow grounding and response accuracy guidelines
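The RAG responsibilities above follow one basic shape: embed, rank by similarity, retrieve, then ground the answer in the top hit. The toy sketch below uses a bag-of-words embedding and cosine similarity purely for illustration; the pipeline this role describes would use Vertex AI embedding models and a vector store, and the documents here are invented.

```python
# Toy RAG retrieval sketch: embed documents and a query as bag-of-words
# vectors, rank by cosine similarity, return the best-matching context.
import math
from collections import Counter

def embed(text):
    # Stand-in for a real embedding model (e.g. a Vertex AI text
    # embedding); a Counter of tokens is enough to show the shape.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents):
    q = embed(query)
    return max(documents, key=lambda d: cosine(q, embed(d)))

docs = [
    "Invoices are processed within five business days.",
    "Password resets are handled by the IT helpdesk.",
]
context = retrieve("how are invoices processed", docs)
```

The retrieved `context` would then be injected into the model prompt so the generated answer stays grounded in source material.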
Must-Have Skills
- Strong Python programming
- Experience working with REST APIs
- Hands-on experience with Vertex AI Studio
- Experience working with Gemini APIs
- Understanding of Agentic AI concepts
- Familiarity with ADK CLI
- Experience or understanding of RAG architecture
- Knowledge of embedding generation
Good-to-Have Skills (Foundation):
BigQuery
- Basic SQL knowledge
- Experience with data loading
- Ability to debug and troubleshoot queries
Data Streaming
- Familiarity with Google Pub/Sub
- Understanding of synthetic data generation
Visualization
- Basic reporting and dashboards using Looker Studio
Similar companies
About the company
Project-based businesses transform the world we live in. Deltek innovates and delivers software and solutions that power them to achieve their purpose. Our industry-specific software and information solutions maximize our customers' performance at every stage of the project lifecycle by enabling superior levels of project intelligence, management and collaboration.
Deltek is the recognized global standard for project-based businesses across government contracting and professional services industries, helping more than 30,000 organizations of all sizes deliver on their mission.
With over 4,200 employees worldwide, our team of industry experts is passionately committed to creating exceptional customer experiences.
Jobs: 12
About the company
Founded in 2014 by two passionate individuals during their second year at Christ College, Bangalore, Moshi Moshi is a young, creative, and committed communication company that encourages clients to always "Expect the EXTRA."
Our diverse team of more than 160 people includes art directors, cinematographers, content writers and copywriters, marketers, developers, coders, and our beloved puppy, Momo. We offer a wide range of services, including strategy, brand design, communications, packaging, film and TVCs, PR, and more. At Moshi Moshi, we believe in creating experiences rather than just running a company.
We are among the fastest-growing agencies in the country, with a very strong value system.
Below are five of the nine principles we believe in strongly.
- Communicate clearly: Prioritize clear and open dialogue.
- Do things morally right: Uphold integrity in all endeavors.
- Dream it, do it: Always embrace optimism and a can-do attitude.
- Add logic to your life: Ensure that rationality guides our actions.
- Be that fool: Fearlessly challenge the impossible.
Come find yourself at Moshi Moshi.
Jobs: 146
About the company
At Zenius IT Services, we specialize in delivering top-tier Professional Services for industry-leading platforms such as Avaya, Cisco, Genesys, Amazon Connect, Five9, and NICE inContact.
Our expertise extends to Digital Engineering Solutions powered by AI and Machine Learning, helping businesses drive innovation and achieve excellence.
Jobs: 4
About the company
At Hunarstreet Technologies Pvt Ltd, we specialize in delivering India’s fastest hiring solutions, tailored to meet the unique needs of businesses across various industries. Our mission is to connect companies with exceptional talent, enabling them to achieve their growth and operational goals swiftly and efficiently.
We achieve an 87% success rate in matching candidates relevant to the position and a 62% success rate in closing the positions shared with us.
Jobs: 750
About the company
Founded in 2015, Phi Commerce has created PayPhi, a groundbreaking omni-channel payment processing platform that processes digital payments at the doorstep, online, and in-store across a variety of form factors such as cards, net banking, UPI, Aadhaar, BharatQR, wallets, NEFT, RTGS, and NACH. The company was established to digitize white spaces in payments and go beyond routine payment processing.
Phi Commerce’s PayPhi Digital Enablement suite was developed with the mission of empowering very large, untapped blue-ocean sectors still dominated by offline payment modes such as cash and cheque to accept digital payments.
Custom-built enablers around PayPhi help businesses create, present, and process digital payments. Our solutions take into account businesses’ legacy systems and complex workflows, their use cases, and the stakeholders in the payment ecosystem, such as merchants, consumers, banks, networks, and ancillary players. This uniquely positions us to eliminate friction in the first and last mile of payments and create a sustainable digital payment ecosystem.
Jobs: 3
About the company
Aegion is building the future of workplace automation through intelligent AI agents. We’re creating a new category of workforce solutions that enables companies to hire AI agents to handle specific job functions, starting as interns and evolving into full contributors within organizations.
We develop specialized AI workforce agents that can be hired just like human employees. These agents integrate seamlessly into existing company workflows, have access to relevant company data and tools, and can perform 60-80% of the work typically done by human employees in specific roles.
Jobs: 1
About the company
Mira is building infrastructure for trustless verification of intelligence. Mira’s verification system significantly enhances AI accuracy by mitigating hallucinations and bias, thereby enabling truly reliable and autonomous AI systems.
Jobs: 1
About the company
We are on a mission to assist a billion Indians in their spiritual journey and help them feel happier, more peaceful, and more content.
Our flagship product, Sri Mandir, is the world’s largest app for Hindu devotees. It serves as a digital sanctuary for millions of devotees worldwide. Through our app, users can seamlessly participate in online pujas, make chadhava (offerings), and connect with various temples across the country, bringing divine blessings to their lives.
Jobs: 5