

Quantiphi
https://quantiphi.com
About
Quantiphi is an award-winning AI-first digital engineering company driven by the desire to reimagine and realize transformational opportunities at the heart of business. Since its inception in 2013, Quantiphi has solved the toughest and most complex business problems by combining deep industry experience, disciplined cloud and data-engineering practices, and cutting-edge artificial intelligence research to achieve accelerated and quantifiable business results.
Locations: Bengaluru, Mumbai, and Trivandrum
Jobs at Quantiphi
Responsible for developing, enhancing, modifying, and maintaining chatbot applications in the Global Markets environment. The role involves designing, coding, testing, debugging, and documenting conversational AI solutions, along with supporting activities aligned to the corporate systems architecture.
You will work closely with business partners to understand requirements, analyze data, and deliver optimal, market-ready conversational AI and automation solutions.
Key Responsibilities
- Design, develop, test, debug, and maintain chatbot and virtual agent applications
- Collaborate with business stakeholders to define and translate requirements into technical solutions
- Analyze large volumes of conversational data to improve chatbot accuracy and performance
- Develop automation workflows for data handling and refinement
- Train and optimize chatbots using historical chat logs and user-generated content
- Ensure solutions align with enterprise architecture and best practices
- Document solutions, workflows, and technical designs clearly
Required Skills
- Hands-on experience in developing virtual agents (chatbots/voicebots) and Natural Language Processing (NLP)
- Experience with one or more AI/NLP platforms such as:
- Dialogflow, Amazon Lex, Alexa, Rasa, LUIS, Kore.AI
- Microsoft Bot Framework, IBM Watson, Wit.ai, Salesforce Einstein, Converse.ai
- Strong programming knowledge in Python, JavaScript, or Node.js
- Experience training chatbots using historical conversations or large-scale text datasets
- Practical knowledge of:
- Formal syntax and semantics
- Corpus analysis
- Dialogue management
- Strong written communication skills
- Strong problem-solving ability and willingness to learn emerging technologies
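As a sketch of the dialogue-management and NLP work this role involves, here is a minimal rule-based intent classifier and response mapper in Python. All intent names, patterns, and responses are hypothetical; a production bot would use a trained NLU model on a platform such as Dialogflow or Rasa rather than regex rules:

```python
import re

# Hypothetical intent patterns; a real bot would use a trained NLU model.
INTENT_PATTERNS = {
    "check_balance": re.compile(r"\b(balance|statement)\b", re.I),
    "transfer_funds": re.compile(r"\b(transfer|send money|wire)\b", re.I),
    "greeting": re.compile(r"\b(hi|hello|hey)\b", re.I),
}

def classify_intent(utterance: str) -> str:
    """Return the first matching intent, or 'fallback' if none match."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            return intent
    return "fallback"

def respond(utterance: str) -> str:
    """Map the classified intent to a canned response (a stand-in for
    real dialogue management, which would also track conversation state)."""
    responses = {
        "check_balance": "Your balance is available in the accounts tab.",
        "transfer_funds": "Sure, which account should I transfer from?",
        "greeting": "Hello! How can I help you today?",
        "fallback": "Sorry, I didn't understand. Could you rephrase?",
    }
    return responses[classify_intent(utterance)]
```

The point of the sketch is the separation of intent classification from response selection, which is the basic shape that platforms like Lex or Watson Assistant formalize.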
Nice-to-Have Skills
- Understanding of conversational UI and voice-based processing (Text-to-Speech, Speech-to-Text)
- Experience building voice apps for Amazon Alexa or Google Home
- Experience with Test-Driven Development (TDD) and Agile methodologies
- Ability to design and implement end-to-end pipelines for AI-based conversational applications
- Experience in text mining, hypothesis generation, and historical data analysis
- Strong knowledge of regular expressions for data cleaning and preprocessing
- Understanding of API integrations, SSO, and token-based authentication
- Experience writing unit test cases as per project standards
- Knowledge of HTTP, REST APIs, sockets, and web services
- Ability to perform keyword and topic extraction from chat logs
- Experience training and tuning topic modeling algorithms such as LDA and NMF
- Understanding of classical Machine Learning algorithms and appropriate evaluation metrics
- Experience with NLP frameworks such as NLTK and spaCy
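The keyword-extraction and regex-cleaning bullets above can be illustrated with a small stdlib-only sketch. Stopword list, cleaning rules, and sample messages are illustrative; real pipelines would use NLTK or spaCy tokenization and a fuller stopword set:

```python
import re
from collections import Counter

# Illustrative stopword list; production code would use NLTK/spaCy stopwords.
STOPWORDS = {"the", "a", "an", "is", "to", "my", "i", "you", "it", "and", "of"}

def clean(message: str) -> str:
    """Strip URLs and non-alphabetic characters, collapse whitespace, lowercase."""
    message = re.sub(r"https?://\S+", " ", message)   # drop URLs
    message = re.sub(r"[^a-zA-Z\s]", " ", message)    # keep letters only
    return re.sub(r"\s+", " ", message).strip().lower()

def top_keywords(chat_logs, n=3):
    """Count non-stopword tokens across all messages; return the n most common."""
    counts = Counter(
        token
        for message in chat_logs
        for token in clean(message).split()
        if token not in STOPWORDS
    )
    return [word for word, _ in counts.most_common(n)]
```

For example, running `top_keywords` over a handful of support messages surfaces the dominant complaint terms, which is the raw input for topic modeling approaches like LDA or NMF mentioned above.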

We are hiring an Associate Technical Architect with strong expertise in AWS-based Data Platforms to design scalable data lakes, warehouses, and enterprise data pipelines while working with global teams.
Key Responsibilities
- Design and implement scalable data warehouse, data lake, and lakehouse architectures on AWS
- Build resilient and modular data pipelines using native AWS services
- Architect cloud-based data platforms and evaluate service trade-offs
- Optimize large-scale data processing and query performance
- Collaborate with global cross-functional teams (Engineering, QA, PMs, Stakeholders)
- Communicate technical roadmap, risks, and mitigation strategies
Must-Have Skills
- 8+ years of experience in AWS Data Engineering / Data Architecture
- Hands-on experience with AWS services:
- Amazon S3
- AWS Glue
- AWS Lambda
- Amazon EMR
- AWS Kinesis (Streams & Firehose)
- AWS Step Functions / MWAA
- Amazon Redshift (Spectrum & Serverless)
- Amazon Athena
- Amazon RDS
- AWS Lake Formation
- AWS DMS, EventBridge, SNS, SQS
- Strong programming skills in Python & PySpark
- Advanced SQL with query optimization & performance tuning
- Deep understanding of:
- MPP databases
- Partitioning & indexing strategies
- Data modeling (Dimensional, Normalized, Lakehouse)
- Experience building resilient ETL/data pipelines
- Knowledge of AWS fundamentals:
- Security
- Networking
- Disaster Recovery
- Scalability & resilience
- Experience with on-prem → AWS migrations
- AWS Certification (Solution Architect Associate / Data Engineer Associate)
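One common form of the partitioning strategy referenced above is Hive-style date partitioning of S3 object keys, which lets Athena, Redshift Spectrum, and Glue prune partitions at query time. A minimal sketch (table and file names are illustrative):

```python
from datetime import date

def partitioned_key(table: str, event_date: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so query
    engines can skip objects outside the requested date range."""
    return (
        f"{table}/year={event_date.year}"
        f"/month={event_date.month:02d}"
        f"/day={event_date.day:02d}"
        f"/{filename}"
    )
```

A query filtered on `year`, `month`, and `day` then scans only the matching prefixes instead of the whole table, which is usually the single biggest lever for both cost and latency on S3-backed tables.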
Good-to-Have Skills
- Domain experience: FSI / Retail / CPG
- Data governance & virtualization tools:
- Collibra
- Denodo
- QuickSight / Power BI / Tableau
- Exposure to:
- Terraform (IaC)
- CI/CD pipelines
- SSIS
- Apache NiFi, Hive, HDFS, Sqoop
- Data Mesh architecture
- Experience with NoSQL databases:
- DynamoDB
- MongoDB
- DocumentDB
Soft Skills
- Strong problem-solving and analytical mindset
- Excellent communication and stakeholder management skills
- Ability to translate technical concepts into business outcomes
- Experience working with distributed/global teams
As a Senior Data Engineer, you will be responsible for building and delivering a Lakehouse-based data pipeline. This is a hands-on role focused on implementing real-time and batch data ingestion, processing, and delivery workflows, while ensuring strong monitoring, observability, and data quality across the entire pipeline.
Must-Have Skills
- 3+ years of hands-on experience building large-scale data pipelines
- Strong experience with Spark Streaming, AWS Glue, and EMR for real-time and batch processing
- Proficiency in PySpark/Python, including building Kafka producers for data ingestion
- Experience working with Confluent Kafka and Spark Streaming for ingestion from on-premise sources
- Solid understanding of AWS services including:
- S3
- Redshift
- Glue
- CloudWatch
- Secrets Manager
- Experience working with Medallion Architecture and hybrid data destinations (e.g., Redshift + on-prem Oracle)
- Ability to implement monitoring dashboards and observability using tools like CloudWatch or Datadog
- Strong SQL skills for data validation and job-level metrics development
- Experience building alerting mechanisms for pipeline failures and performance issues
- Strong collaboration and communication skills
- Proven ownership mindset — driving deliverables from design to deployment
- Experience mentoring junior engineers, conducting code reviews, and guiding best practices
- AWS Certified Data Engineer – Associate (preferred)
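The Medallion Architecture mentioned above layers data as bronze (raw), silver (cleansed), and gold (business aggregates). A toy sketch with plain Python dicts, purely to show the layering; a real pipeline would use Spark DataFrames on Glue or EMR, and the field names here are invented:

```python
# Toy Medallion (bronze -> silver -> gold) layering with plain Python dicts.

def to_silver(bronze_records):
    """Silver layer: drop malformed rows and normalize types."""
    silver = []
    for rec in bronze_records:
        if rec.get("order_id") is None or rec.get("amount") is None:
            continue  # quarantine malformed records in a real pipeline
        silver.append({
            "order_id": str(rec["order_id"]),
            "region": (rec.get("region") or "unknown").lower(),
            "amount": float(rec["amount"]),
        })
    return silver

def to_gold(silver_records):
    """Gold layer: a business-level aggregate (revenue per region)."""
    revenue = {}
    for rec in silver_records:
        revenue[rec["region"]] = revenue.get(rec["region"], 0.0) + rec["amount"]
    return revenue
```

The same shape scales up directly: each layer is a separate, independently monitorable job, which is what makes job-level metrics and alerting (also listed above) tractable.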
Good-to-Have Skills
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions
- Exposure to Big Data ecosystem tools:
- Sqoop
- HDFS
- Hive
- NiFi
- Exposure to Terraform for infrastructure automation
- Familiarity with CI/CD pipelines for data workflows
We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.
Key Responsibilities
- Collaborate with business users and stakeholders to understand business processes and data requirements
- Design and implement dimensional data models, including fact and dimension tables
- Identify, design, and implement data transformation and cleansing logic
- Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
- Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
- Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
- Provide high-level design, research, and effort estimates for data integration initiatives
- Provide production support for ETL processes to ensure data availability and SLA adherence
- Analyze and resolve data pipeline and performance issues
- Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
- Translate business requirements into well-defined technical data specifications
- Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
- Define and document BI usage through use cases, prototypes, testing, and deployment
- Support and enhance data governance and data quality processes
- Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
- Train and support business users, IT analysts, and developers
- Lead and collaborate with teams spread across multiple locations
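The fact-and-dimension modeling described above can be sketched as a tiny star schema. The example below uses SQLite for portability; table and column names are illustrative, and a real warehouse would run the same pattern on BigQuery or SQL Server:

```python
import sqlite3

# A tiny star schema: one dimension table, one fact table keyed to it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, 2, 20.0), (11, 2, 1, 15.0), (12, 1, 1, 10.0)])

def revenue_by_product(conn):
    """Join fact to dimension and aggregate: the typical star-schema query shape."""
    return conn.execute("""
        SELECT p.product_name, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_product p ON p.product_key = f.product_key
        GROUP BY p.product_name
        ORDER BY p.product_name
    """).fetchall()
```

Keeping descriptive attributes in the dimension and measures in the fact table is what lets BI tools slice the same measures by any attribute without touching the ETL.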
Required Skills & Qualifications
- Bachelor’s degree in Computer Science or a related field, or equivalent work experience
- 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
- Strong expertise in data warehousing concepts, tools, and best practices
- Excellent SQL skills
- Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
- Hands-on experience with Google Cloud Platform (GCP) services, including:
- BigQuery
- Cloud SQL
- Cloud Composer (Airflow)
- Dataflow
- Dataproc
- Cloud Functions
- Google Cloud Storage (GCS)
- Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
- Strong experience integrating data using APIs, XML, JSON, and similar formats
- In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
- Solid understanding of SDLC, Agile, and Scrum methodologies
- Strong problem-solving, multitasking, and organizational skills
- Experience handling large-scale datasets and database design
- Strong verbal and written communication skills
- Experience leading teams across multiple locations
Good to Have
- Experience with SSRS and SSIS
- Exposure to AWS and/or Azure cloud platforms
- Experience working with enterprise BI and analytics tools
Why Join Us
- Opportunity to work on large-scale, enterprise data platforms
- Exposure to modern cloud-native data engineering technologies
- Collaborative environment with strong stakeholder interaction
- Career growth and leadership opportunities
Company Profile
Quantiphi is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.
Some company highlights:
- Quantiphi has seen 2.5x growth YoY since its inception in 2013.
- Winner of the "Machine Learning Partner of the Year" award from Google for two consecutive years - 2017 and 2018.
- Winner of the "Social Impact Partner of the Year" award from Google for 2019.
- Headquartered in Boston, with 700+ data science professionals across different offices.
For more details, visit our website (http://www.quantiphi.com/) or our LinkedIn page (https://www.linkedin.com/company/quantiphi/).
Job Description
Role: Associate Tech Architect / Tech Architect – ReactJS + Python + AWS
Experience Level: 7-13 Years
Work location: Mumbai & Bangalore
We are looking for an experienced full-stack developer (ReactJS and Python) who can help create dynamic software applications for our clients. In this role, you will be responsible for gathering requirements from clients, writing and testing scalable code, and developing front-end and back-end components.
Technologies worked on:
ReactJS, Python, AWS
Requirement Description:
- Full Stack developer with experience in ReactJS, Python, API Gateway, Fargate and ECS
- Well-experienced in working with tools like Git, Maven, JFrog
- Should have a solid understanding of object-oriented programming (OOP)
- Experienced in performing Unit Testing and Integration Testing, with good experience in Agile-based development
- Expertise in developing enterprise-level web applications and REST and SOAP APIs using MicroServices, with demonstrable production-scale experience
- Demonstrate strong design and programming skills using JSON, Web Services, XML, XSLT, PL/SQL in Unix and Windows environments
- Strong background working with Linux/UNIX environments and strong Shell scripting experience
- Working knowledge of SQL or NoSQL databases
- Understand Architecture Requirements and ensure effective design, development, validation, and support activities
- Understanding of core AWS services, uses, and basic AWS architecture best practices
- Proficiency in developing, deploying, and debugging cloud-based applications using AWS
- Ability to use the AWS service APIs, AWS CLI, and SDKs to write applications
- Ability to identify key features of AWS services
- Identify bottlenecks and bugs, and recommend solutions by comparing the advantages and disadvantages of custom development
- Should contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Execute strong collaboration and communication skills within distributed project teams
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency
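Since the stack above pairs Python with API Gateway, a minimal Lambda-style handler illustrates the back-end shape such a role builds. The event fields and response format follow the standard API Gateway proxy-integration contract; the route and message content are invented for illustration:

```python
import json

def handler(event, context=None):
    """Minimal API Gateway proxy-integration Lambda handler sketch:
    read a path parameter, return a JSON response. Auth (IAM, tokens)
    would be enforced at the gateway in a real deployment."""
    name = (event.get("pathParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the handler is a plain function taking a dict, it can be unit-tested locally without any AWS infrastructure, which fits the Unit Testing expectation listed above.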
Similar companies
About the company
IndiWork is a full-service web design, development, and 360-degree digital marketing service provider for various industries. We’re consultants, guides, and digital partners in the online journey, offering customized solutions and expert services tailored to each client. We bring new ideas and best practices combined with the agile methodology to accelerate growth for our clients. We invite you to join us on a journey from now to next.
Jobs
15
About the company
About Us
HighLevel is an AI-powered, all-in-one white-label sales & marketing platform that empowers agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. We are proud to support a global and growing community of over 2 million businesses, comprising agencies, consultants, and businesses of all sizes and industries. HighLevel empowers users with all the tools needed to capture, nurture, and close new leads into repeat customers. As of mid-2025, HighLevel processes over 15 billion API hits and handles more than 2.5 billion message events every day. Our platform manages over 470 terabytes of data distributed across five databases, operates with a network of over 250 microservices, and supports over 1 million domain names.
Our People
With over 1,500 team members across 15+ countries, we operate in a global, remote-first environment. We are building more than software; we are building a global community rooted in creativity, collaboration, and impact. We take pride in cultivating a culture where innovation thrives, ideas are celebrated, and people come first, no matter where they call home.
Our Impact
As of mid 2025, our platform powers over 1.5 billion messages, helps generate over 200 million leads, and facilitates over 20 million conversations for the more than 2 million businesses we serve each month. Behind those numbers are real people growing their companies, connecting with customers, and making their mark - and we get to help make that happen.
EEO Statement:
At HighLevel, we value diversity. In fact, we understand it makes our organisation stronger. We are committed to inclusive hiring/promotion practices that evaluate skill sets, abilities, and qualifications without regard to any characteristic unrelated to performing the job at the highest level. Our objective is to foster an environment where really talented employees from all walks of life can be their true and whole selves, cherished and welcomed for their differences while providing excellent service to our clients and learning from one another along the way! Reasonable accommodations may be made to enable individuals with disabilities to perform essential functions.
Jobs
7
About the company
eShipz: Simplifying Global Shipping for Businesses: At eShipz, we are revolutionizing how businesses manage their shipping processes. Our platform is designed to offer seamless multi-carrier integration, enabling businesses of all sizes to ship effortlessly across the globe. Whether you're an e-commerce brand, a manufacturer, or a logistics provider, eShipz helps streamline your supply chain with real-time tracking, automated shipping labels, cost-effective shipping rates, and comprehensive reporting.
Our goal is to empower businesses by simplifying logistics, reducing shipping costs, and improving operational efficiency. With an easy-to-use dashboard and a dedicated support team, eShipz ensures that you focus on scaling your business while we handle your shipping needs.
Jobs
16
About the company
Certa’s no-code platform makes it easy to digitize and manage the lifecycle of all your suppliers, partners, and customers. With automated onboarding, contract lifecycle management, and ESG management, Certa eliminates the procurement bottleneck and allows companies to onboard third-parties 3x faster.
Jobs
2
About the company
enParadigm is one of the world's leading experiential learning and talent intelligence companies. We leverage Generative AI & Immersive AI solutions to create hyper-personalised, immersive experiences, driving business impact and behavioural change across levels and functions.
We have been recognized among the fastest-growing tech companies in APAC by Deloitte as part of the Deloitte Tech Fast 500 APAC program. We leverage our proprietary simulations and a rigorous sustained-learning approach. We have worked with 500+ organisations around the world, such as Coca-Cola, Infosys, P&G, Societe Generale, Colgate-Palmolive, WNS, and Citibank, to help drive growth and leadership.
Jobs
2
About the company
Estuate is a global IT services company that offers innovative software solutions ranging from Product Engineering services to Subscription Billing and GRC to Digital Transformation. They help businesses thrive with their out-of-the-box tech solutions and expert consulting services. Estuate's IBM InfoSphere Optim Archive Viewer 11.7 is a powerful and intuitive solution for accessing and analyzing archived data, fully compatible with IBM’s Cloud Pack for Data technology stack.
Estuate's software solutions aid organizations in managing data properly throughout its lifetime, allowing them to make informed decisions and stay compliant with industry regulations. Their IBM InfoSphere Optim Archive family of tools handles older data in active applications and retains data in retired applications for legal, regulatory, or analytical purposes. Industries in which Estuate operates include healthcare, finance, retail, and technology. They have a worldwide presence with operations in several parts of the world including Canada, India, and the UK.
Jobs
5
About the company
Celcom Solutions is a global technology services firm specialising in the telecom and BFSI sectors. Founded in the UK in 2010, the company has expanded into India (notably Chennai and Bengaluru) and supports clients across APAC, MENA, Europe, and the UK.
Their core offerings span:
- Greenfield implementations, transformations and managed services for OSS/BSS environments.
- Digital-transformation, testing, data & analytics services that help telcos and enterprises upgrade to newer models.
- A culture built around subject-matter expertise, global delivery capability and domain experience — making them a solid employer for professionals who want meaningful telecom/IT work.
With a committed global team of service-delivery professionals and consultants, Celcom Solutions offers the opportunity to work on large-scale, complex projects in the telecom / tech space — which makes it an interesting destination if you’re recruiting talent who want scale + domain depth.
Jobs
4