

Quantiphi
https://quantiphi.com
About
Quantiphi is an award-winning AI-first digital engineering company driven by the desire to reimagine and realize transformational opportunities at the heart of the business. Since its inception in 2013, Quantiphi has solved the toughest and most complex business problems by combining deep industry experience, disciplined cloud and data-engineering practices, and cutting-edge artificial intelligence research to achieve accelerated and quantifiable business results.
Locations: Bengaluru, Mumbai, and Trivandrum
Jobs at Quantiphi
Role & Responsibilities
- Design, develop, and deliver automation solutions to enhance platform functionality and reliability.
- Deploy, manage, and maintain Azure cloud infrastructure ensuring high availability, scalability, and security.
- Champion and implement Infrastructure as Code (IaC) practices using Terraform.
- Build and maintain containerized environments using Kubernetes (AKS).
- Develop self-service, self-healing, monitoring, and alerting systems for cloud platforms.
- Automate development, testing, and deployment workflows using CI/CD pipelines.
- Integrate DevOps tools such as Git, Jenkins/Azure DevOps, SonarQube, Artifactory, and Docker to streamline delivery pipelines.
- Ensure platform observability through monitoring, logging, and alerting frameworks.
Requirements
- 6+ years of experience in Cloud / DevOps / Platform Engineering roles.
- Strong hands-on experience with Microsoft Azure cloud infrastructure.
- Experience with Azure services spanning compute, networking, storage, identity, and messaging.
- Expertise in container orchestration using Kubernetes (AKS).
- Strong experience implementing Infrastructure as Code using Terraform.
- Familiarity with cloud-native and microservices architecture patterns.
- Experience with relational and NoSQL databases such as PostgreSQL or Cassandra.
Additional Skills
- Strong Linux administration and troubleshooting skills.
- Programming or scripting experience in Bash, Python, Java, or similar languages.
- Hands-on experience with CI/CD tools such as Jenkins, Git, Maven, or Azure DevOps.
- Experience managing multi-region or high-availability cloud environments.
- Familiarity with Agile / Scrum / DevOps practices and collaboration tools.
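The self-service and self-healing systems described above usually reduce to a check-and-remediate loop around a health probe. A minimal Python sketch (the `check` and `restart` callables are illustrative stand-ins, not part of the listing; in practice they would wrap kubectl, Azure CLI, or monitoring API calls):

```python
def run_with_self_heal(check, restart, max_attempts=3):
    """Probe a service; on failure trigger a remediation action and retry.

    check:   zero-arg callable returning True when the service is healthy
    restart: zero-arg callable performing the remediation (e.g. restart a pod)
    Returns True once the check passes, False if it never does.
    """
    for _ in range(max_attempts):
        if check():
            return True
        restart()       # remediate between probes
    return check()      # final probe after the last remediation
```

A real controller would add backoff between attempts, alerting on repeated failures, and idempotent remediation actions.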

We are hiring an Associate Technical Architect with strong expertise in Azure-based data platforms to design scalable data lakes, data warehouses, and enterprise data pipelines, while working with global teams.
Key Responsibilities
- Design and implement scalable data lake, data warehouse, and lakehouse architectures on Azure
- Build resilient data pipelines using Azure services
- Architect and optimize cloud-based data platforms
- Improve large-scale data processing and query performance
- Collaborate with engineering teams, QA, product managers, and stakeholders
- Communicate technical roadmap, risks, and mitigation strategies
Must-Have Skills
- 6+ years of experience in Azure Data Engineering / Data Architecture
Azure Data Platform
- Experience with Azure Data Factory
- Hands-on with Azure Databricks and PySpark
- Experience with Azure Data Lake Storage
- Knowledge of Azure Synapse or Azure SQL for data warehousing
Programming & Data Skills
- Strong programming skills in Python and PySpark
- Advanced SQL with query optimization and performance tuning
- Experience building ETL / ELT data pipelines
Data Architecture Knowledge
- Understanding of MPP databases
- Knowledge of partitioning, indexing, and performance optimization
- Experience with data modeling (dimensional, normalized, lakehouse)
Cloud Fundamentals
- Azure security, networking, scalability, and disaster recovery
- Experience with on-premises to Azure migrations
Certification (Preferred)
- Azure Data Engineer or Azure Solutions Architect certification
Good-to-Have Skills
- Domain experience in FSI, Retail, or CPG
- Exposure to data governance tools
- Experience with BI tools such as Power BI or Tableau
- Familiarity with Terraform, CI/CD pipelines, or Azure DevOps
- Experience with NoSQL databases such as Cosmos DB or MongoDB
Soft Skills
- Strong problem-solving and analytical thinking
- Good communication and stakeholder management
- Ability to translate technical concepts into business outcomes
- Experience working with global or distributed teams
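Indexing and query-plan inspection, listed under performance optimization above, can be illustrated with Python's stdlib sqlite3. This is a hedged sketch only: the table and index names are invented, and on Azure the same exercise would use EXPLAIN output from Synapse or Databricks SQL instead.

```python
import sqlite3

# In-memory database standing in for a warehouse table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

def query_plan(sql: str) -> str:
    """Return sqlite's query plan as one string, to inspect index usage."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r) for r in rows)

target = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

before = query_plan(target)   # full table scan: no index serves the predicate
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = query_plan(target)    # the same query is now served by the index

print("USING INDEX" in before, "USING INDEX" in after)  # False True
```

The same before/after discipline (plan inspection, then partitioning or indexing) carries over to MPP engines, only with distribution keys and partition pruning in place of B-tree indexes.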
Responsible for developing, enhancing, modifying, and maintaining chatbot applications in the Global Markets environment. The role involves designing, coding, testing, debugging, and documenting conversational AI solutions, along with supporting activities aligned to the corporate systems architecture.
You will work closely with business partners to understand requirements, analyze data, and deliver optimal, market-ready conversational AI and automation solutions.
Key Responsibilities
- Design, develop, test, debug, and maintain chatbot and virtual agent applications
- Collaborate with business stakeholders to define and translate requirements into technical solutions
- Analyze large volumes of conversational data to improve chatbot accuracy and performance
- Develop automation workflows for data handling and refinement
- Train and optimize chatbots using historical chat logs and user-generated content
- Ensure solutions align with enterprise architecture and best practices
- Document solutions, workflows, and technical designs clearly
Required Skills
- Hands-on experience in developing virtual agents (chatbots/voicebots) and Natural Language Processing (NLP)
- Experience with one or more AI/NLP platforms such as:
- Dialogflow, Amazon Lex, Alexa, Rasa, LUIS, Kore.AI
- Microsoft Bot Framework, IBM Watson, Wit.ai, Salesforce Einstein, Converse.ai
- Strong programming knowledge in Python, JavaScript, or Node.js
- Experience training chatbots using historical conversations or large-scale text datasets
- Practical knowledge of:
- Formal syntax and semantics
- Corpus analysis
- Dialogue management
- Strong written communication skills
- Strong problem-solving ability and willingness to learn emerging technologies
Nice-to-Have Skills
- Understanding of conversational UI and voice-based processing (Text-to-Speech, Speech-to-Text)
- Experience building voice apps for Amazon Alexa or Google Home
- Experience with Test-Driven Development (TDD) and Agile methodologies
- Ability to design and implement end-to-end pipelines for AI-based conversational applications
- Experience in text mining, hypothesis generation, and historical data analysis
- Strong knowledge of regular expressions for data cleaning and preprocessing
- Understanding of API integrations, SSO, and token-based authentication
- Experience writing unit test cases as per project standards
- Knowledge of HTTP, REST APIs, sockets, and web services
- Ability to perform keyword and topic extraction from chat logs
- Experience training and tuning topic modeling algorithms such as LDA and NMF
- Understanding of classical Machine Learning algorithms and appropriate evaluation metrics
- Experience with NLP frameworks such as NLTK and spaCy
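Keyword extraction from chat logs, mentioned in the list above, can start as simply as regex tokenization plus frequency counting before reaching for LDA or NMF. A minimal stdlib sketch (the stopword list and sample logs are invented for illustration):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "to", "i", "my", "and", "of", "in", "it", "you"}

def top_keywords(chat_logs, n=3):
    """Tokenize messages with a regex, drop stopwords,
    and return the n most frequent remaining tokens."""
    tokens = []
    for message in chat_logs:
        tokens += re.findall(r"[a-z']+", message.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

logs = [
    "I cannot reset my password",
    "password reset link is not working",
    "how do I reset the password for my account",
]
print(top_keywords(logs))  # 'reset' and 'password' dominate
```

The resulting frequency counts also make a reasonable term-weighting baseline before training topic models on the same corpus.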
We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.
Key Responsibilities
- Collaborate with business users and stakeholders to understand business processes and data requirements
- Design and implement dimensional data models, including fact and dimension tables
- Identify, design, and implement data transformation and cleansing logic
- Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
- Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
- Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
- Provide high-level design, research, and effort estimates for data integration initiatives
- Provide production support for ETL processes to ensure data availability and SLA adherence
- Analyze and resolve data pipeline and performance issues
- Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
- Translate business requirements into well-defined technical data specifications
- Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
- Define and document BI usage through use cases, prototypes, testing, and deployment
- Support and enhance data governance and data quality processes
- Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
- Train and support business users, IT analysts, and developers
- Lead and collaborate with teams spread across multiple locations
Required Skills & Qualifications
- Bachelor’s degree in Computer Science or a related field, or equivalent work experience
- 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
- Strong expertise in data warehousing concepts, tools, and best practices
- Excellent SQL skills
- Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
- Hands-on experience with Google Cloud Platform (GCP) services, including:
- BigQuery
- Cloud SQL
- Cloud Composer (Airflow)
- Dataflow
- Dataproc
- Cloud Functions
- Google Cloud Storage (GCS)
- Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
- Strong experience integrating data using APIs, XML, JSON, and similar formats
- In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
- Solid understanding of SDLC, Agile, and Scrum methodologies
- Strong problem-solving, multitasking, and organizational skills
- Experience handling large-scale datasets and database design
- Strong verbal and written communication skills
- Experience leading teams across multiple locations
Good to Have
- Experience with SSRS and SSIS
- Exposure to AWS and/or Azure cloud platforms
- Experience working with enterprise BI and analytics tools
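The dimensional modeling and ETL work described above centers on fact and dimension tables. A tiny star-schema sketch using Python's stdlib sqlite3 (the schema and data are invented; a production warehouse would live in BigQuery or SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per customer, holding descriptive attributes.
conn.execute(
    "CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT)"
)
# Fact table: one row per sale, referencing the dimension by surrogate key.
conn.execute("CREATE TABLE fact_sales (customer_key INTEGER, amount REAL)")

conn.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                 [(1, "Acme", "EMEA"), (2, "Globex", "APAC")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 100.0), (1, 50.0), (2, 75.0)])

# Typical star-schema query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 75.0), ('EMEA', 150.0)]
```

The same fact/dimension split is what keeps downstream BI reports simple: one join per dimension, with all measures aggregated from the fact table.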
Why Join Us
- Opportunity to work on large-scale, enterprise data platforms
- Exposure to modern cloud-native data engineering technologies
- Collaborative environment with strong stakeholder interaction
- Career growth and leadership opportunities
Company Profile
Quantiphi is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.
Some company highlights:
- Quantiphi has seen 2.5x growth YoY since its inception in 2013.
- Winner of the "Machine Learning Partner of the Year" award from Google for two consecutive years - 2017 and 2018.
- Winner of the "Social Impact Partner of the Year" award from Google for 2019.
- Headquartered in Boston, with 700+ data science professionals across different offices.
For more details, visit our website (http://www.quantiphi.com/) or our LinkedIn page (https://www.linkedin.com/company/quantiphi/).
Job Description
Role: Associate Tech Architect / Tech Architect – ReactJS + Python + AWS
Experience Level: 7-13 Years
Work location: Mumbai & Bangalore
We are looking for an experienced full-stack developer (ReactJS and Python) to build dynamic software applications for our clients. In this role, you will gather requirements from clients, write and test scalable code accordingly, and develop front-end and back-end components.
Technologies worked on:
ReactJS, Python, AWS
Requirement Description:
- Full Stack developer with experience in ReactJS, Python, API Gateway, Fargate and ECS
- Well-experienced in working with tools like Git, Maven, JFrog
- Should have a solid understanding of object-oriented programming (OOP)
- Well-versed in unit testing and integration testing, with solid experience in Agile-based development
- Expertise in developing enterprise-level web applications and REST and SOAP APIs using microservices, with demonstrable production-scale experience
- Demonstrate strong design and programming skills using JSON, Web Services, XML, XSLT, PL/SQL in Unix and Windows environments
- Strong background working with Linux/UNIX environments and strong Shell scripting experience
- Working knowledge of SQL or NoSQL databases
- Understand Architecture Requirements and ensure effective design, development, validation, and support activities
- Understanding of core AWS services, uses, and basic AWS architecture best practices
- Proficiency in developing, deploying, and debugging cloud-based applications using AWS
- Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
- Ability to identify key features of AWS services
- Identify bottlenecks and bugs, and recommend solutions by comparing the advantages and disadvantages of custom development
- Should contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Apply strong collaboration and communication skills within distributed project teams
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
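The back-end half of this role revolves around REST services that return JSON. A minimal Python sketch using only the standard library (the `/health` route is an invented example; production code would sit behind API Gateway on Fargate or ECS, typically with a proper framework):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Minimal JSON endpoint: GET /health -> {"status": "ok"}."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):  # keep demo output quiet
        pass

# Bind to an ephemeral port and serve from a daemon thread.
server = ThreadingHTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Exercise the endpoint the way an integration test would.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.load(resp)
print(payload)  # {'status': 'ok'}

server.shutdown()
```

The in-process request at the end doubles as the kind of integration test the listing asks for; a React front end would consume the same JSON contract.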
Similar companies
About the company
Math bole toh MathonGo ("say Math, think MathonGo")
India's best online learning platform for Math.
About MathonGo
Founded in 2018, MathonGo is an EdTech platform built to make cracking engineering entrance exams easier and more accessible for lakhs of students across India. With high-quality video lectures, practice sheets, crash courses, and test series, we’ve helped students consistently score 99+ percentiles in JEE Main, Advanced, and other competitive exams. Our mission is simple: simplify Math learning and maximize results.
About the Founder
MathonGo is led by Anup Gupta (Anup Sir), a seasoned educator with 14+ years of teaching experience. He has personally mentored thousands of students, helping more than 1,000 secure IIT ranks. His deep expertise and student-first approach shape everything we build, ensuring that the platform is not just about content but about results that change careers.
Milestones
- Helped multiple students achieve 99+ percentiles in JEE Main.
- Launched comprehensive crash courses and test series across exams like BITSAT, WBJEE, MHTCET, and COMEDK.
- Built a thriving community of learners who trust MathonGo for quality, affordability, and results.
- Recognized as one of the leading Math-focused EdTech platforms in India.
About the company
Oddr is the legal industry’s only AI-powered invoice-to-cash platform. It centralizes, streamlines, and accelerates every step of the billing and collections lifecycle, from bill preparation and delivery to collections and reconciliation, enabling new possibilities in analytics, forecasting, and client service that eliminate revenue leakage and increase profitability.
www.oddr.com
About the company
Automate Accounts is a technology-driven company dedicated to building intelligent automation solutions that streamline business operations and boost efficiency. We leverage modern platforms and tools to help businesses transform their workflows with cutting-edge solutions.
About the company
SDS Softwares is a UK-based web development company with more than 10 years of experience in its niche. The company provides services for web development, web design, mobile app development, eLearning, digital marketing, and more. Our services are not restricted to any particular domain: we have served a large number of verticals, delivering not only high-quality services but strong values as well. Major business domains we have targeted so far include real estate, TTL, health care, logistics, and hospitality.
About the company
Applix is building an AI-native Manufacturing Operating System (mOS) designed to drive Triple Zero performance: Zero Defects, Zero Delays, Zero Waste.
The platform unifies scheduling, root cause analysis, digital work instructions, and supply chain visibility into one intelligent system that helps factories operate in real time, not in hindsight.
Headquartered in Austin with presence in Chicago and Hyderabad, Applix partners with global manufacturers to modernize shop-floor execution through applied AI.
About the Team
Applix brings together operators, engineers, and AI specialists with deep manufacturing and supply chain expertise. The team works closely with enterprise customers, deploying practical, factory-ready solutions that deliver measurable operational impact from day one.
Milestones
- Founded in 2022
- Built the industry’s first AI-native Manufacturing Operating System (mOS)
- Partnering with leading global manufacturers
- Growing industry presence with 4,000+ LinkedIn followers
About the company
Planet Green is a U.S. leader in the remanufacturing and recycling of name-brand OEM printer cartridges such as HP, Canon, Epson, Brother, Lexmark, and Xerox. We offer over 3,000 models of ink and toner at 30%–70% below retail prices.
Our savings come from sourcing new, unwanted genuine OEM surplus printer cartridges from businesses that no longer need them, while also remanufacturing our own high-quality inkjet cartridges.
All products are backed by our 100% Satisfaction Guarantee.
About the company
Outpilot is an AI-powered outreach engine that identifies decision-makers and produces human-grade outreach at scale for sponsorship and complex B2B.








