Business Intelligence Consultant – Qlik
Role
· Working through customer specifications and developing solutions in line with defined requirements.
· Strategizing and ideating the solution design (creating prototypes and/or wireframes) before building the application or solution.
· Creating load scripts and QVDs to support dashboards.
· Creating data models in Qlik Sense to support dashboards.
· Leading data discovery, assessment, analysis, modeling and mapping efforts for Qlik dashboards.
· Developing visual reports, dashboards and KPI scorecards using Qlik.
· Connecting to data sources (MS SQL Server, Oracle, SAP), importing data and transforming data for Business Intelligence.
· Translating data into informative visuals and reports.
· Developing, publishing and scheduling reports as per the business requirements.
· Implementing application security layer models in Qlik
Skills Required
· Knowledge of data visualization and data analytics principles and skills, including good user experience/UI design
· Hands-on experience in Qlik Sense development
· Knowledge of writing SQL queries
· Exceptional analytical skills, problem-solving skills and excellent communication skills
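The SQL skill above underpins the dashboard work described in the role. As a minimal sketch of the kind of aggregation query a Qlik dashboard might sit on, the example below uses Python's built-in sqlite3 as a stand-in for MS SQL Server or Oracle; the table and column names are illustrative, not taken from the posting.

```python
import sqlite3

# In-memory database standing in for a real BI source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 50.0)],
)

# KPI-style query: total and average sales per region, the sort of
# aggregate a Qlik scorecard would visualize.
rows = conn.execute(
    "SELECT region, SUM(amount), AVG(amount) FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
for region, total, avg in rows:
    print(region, total, avg)
```

In practice the equivalent query would live in a Qlik load script or a source-system view rather than application code.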
Qualifications
1. Degree in Computer Science/Engineering disciplines or MCA
2. 2-4 years of hands-on Qlik experience
3. Qlik Sense certification would be preferred
MUST-HAVES:
- LLM Integration & Prompt Engineering
- Context & Knowledge Base Design
- Experience running LLM evals
NOTICE PERIOD: Immediate – 30 Days
SKILLS: LLM, AI, PROMPT ENGINEERING
NICE TO HAVES:
- Data Literacy & Modelling Awareness
- Familiarity with Databricks, AWS, and ChatGPT Environments
ROLE PROFICIENCY:
Role Scope / Deliverables:
- Serve as the link between business intelligence, data engineering, and AI application teams, ensuring the Large Language Model (LLM) interacts effectively with the modeled dataset.
- Define and curate the context and knowledge base that enables GPT to provide accurate, relevant, and compliant business insights.
- Collaborate with Data Analysts and System SMEs to identify, structure, and tag data elements that feed the LLM environment.
- Design, test, and refine prompt strategies and context frameworks that align GPT outputs with business objectives.
- Conduct evaluation and performance testing (evals) to validate LLM responses for accuracy, completeness, and relevance.
- Partner with IT and governance stakeholders to ensure secure, ethical, and controlled AI behavior within enterprise boundaries.
KEY DELIVERABLES:
- LLM Interaction Design Framework: Documentation of how GPT connects to the modeled dataset, including context injection, prompt templates, and retrieval logic.
- Knowledge Base Configuration: Curated and structured domain knowledge to enable precise and useful GPT responses (e.g., commercial definitions, data context, business rules).
- Evaluation Scripts & Test Results: Defined eval sets, scoring criteria, and output analysis to measure GPT accuracy and quality over time.
- Prompt Library & Usage Guidelines: Standardized prompts and design patterns to ensure consistent business interactions and outcomes.
- AI Performance Dashboard / Reporting: Visualizations or reports summarizing GPT response quality, usage trends, and continuous improvement metrics.
- Governance & Compliance Documentation: Inputs to data security, bias prevention, and responsible AI practices in collaboration with IT and compliance teams.
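The "Evaluation Scripts & Test Results" deliverable above can be sketched minimally as an eval set plus a scorer. The keyword-overlap scorer below is a hypothetical illustration, not a real eval framework, and the canned responses stand in for live GPT output so the sketch stays self-contained.

```python
def score_response(response: str, expected_facts: list[str]) -> float:
    """Fraction of expected facts mentioned in the model's response."""
    text = response.lower()
    hits = sum(1 for fact in expected_facts if fact.lower() in text)
    return hits / len(expected_facts)

# Tiny eval set: prompt plus the facts a good answer should mention.
# Prompts and expected facts here are made up for illustration.
eval_set = [
    {"prompt": "Summarise Q3 revenue drivers.", "expected": ["revenue", "q3"]},
    {"prompt": "Which region grew fastest?", "expected": ["region"]},
]

# In a real setup these responses would come from the GPT endpoint.
canned = [
    "Q3 revenue was driven by subscriptions.",
    "The APAC region grew fastest.",
]

scores = [score_response(r, case["expected"])
          for r, case in zip(canned, eval_set)]
print(scores)
```

Production evals typically add semantic similarity or model-graded scoring on top of simple keyword checks, plus tracking of scores over time for the reporting deliverable.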
KEY SKILLS:
Technical & Analytical Skills:
- LLM Integration & Prompt Engineering – Understanding of how GPT models interact with structured and unstructured data to generate business-relevant insights.
- Context & Knowledge Base Design – Skilled in curating, structuring, and managing contextual data to optimize GPT accuracy and reliability.
- Evaluation & Testing Methods – Experience running LLM evals, defining scoring criteria, and assessing model quality across use cases.
- Data Literacy & Modeling Awareness – Familiar with relational and analytical data models to ensure alignment between data structures and AI responses.
- Familiarity with Databricks, AWS, and ChatGPT Environments – Capable of working in cloud-based analytics and AI environments for development, testing, and deployment.
- Scripting & Query Skills (e.g., SQL, Python) – Ability to extract, transform, and validate data for model training and evaluation workflows.
Business & Collaboration Skills:
- Cross-Functional Collaboration – Works effectively with business, data, and IT teams to align GPT capabilities with business objectives.
- Analytical Thinking & Problem Solving – Evaluates LLM outputs critically, identifies improvement opportunities, and translates findings into actionable refinements.
- Commercial Context Awareness – Understands how sales and marketing intelligence data should be represented and leveraged by GPT.
- Governance & Responsible AI Mindset – Applies enterprise AI standards for data security, privacy, and ethical use.
- Communication & Documentation – Clearly articulates AI logic, context structures, and testing results for both technical and non-technical audiences.
Salesforce Developer + CPQ Billing
Exp: 8+ years
Mode: Remote
Employment: Contract
Job Description:
· Minimum 8+ years of Salesforce development experience (mandatory).
· Must have CPQ object development experience.
· Experience with major Classic-to-Lightning transformations.
· Strong Lightning Web Components (LWC) programming skills.
· Good experience with Apex triggers, batch classes and @future methods; controllers; calling REST web services from Apex; generating and parsing JSON in Apex; Visualforce pages and components; Visualforce Remoting; and effective Apex unit testing, including web service mocking.
· Must have experience in Sales Cloud (mandatory) and Service Cloud.
· Must have experience in Salesforce integration patterns, including application programming interfaces (API) and bulk data uploads.
· Experience with Release Management, Source Control, and Deployment concepts and technologies such as ANT, SFDC Metadata API, Jenkins, Git (Code Commit) and DevOps in a Salesforce environment.
· Require excellent English communication skills.
Are you a Product Manager with a deep love for pets and a passion for creating exceptional customer experiences? Do you want to be part of a hyper-growth company that truly puts pet parents in control of their mission to give the best to their pets?
Supertails is on a mission to revolutionize pet care in India, and we're looking for a Product Manager to own and scale our customer experience initiatives for the e-commerce division. We are a tech-first company backed by India’s top consumer investors, and we believe the right product, augmented with AI, is the way to solve customer needs.
If you believe technology can transform pet care outcomes and thrive in a hyper-growth environment, this is for you.
Who are we looking for?
You are a first-principles product thinker who understands user journeys and is excited to improve the post-purchase experience through technology. You have a strong pulse on customer behavior, CX, and operational excellence, and have likely worked in B2C e-commerce environments.
You'll be a great fit if you are:
- A pet parent
- Passionate about building intuitive, high-impact user-facing services.
- Someone who thrives in environments with high ownership and cross-functional collaboration.
- A collaborative leader and a passionate advocate for the customer, skilled at aligning cross-functional teams to build the best possible experience.
- A data-savvy product builder who is excited to define the key metrics and build the foundation for a data-driven customer experience.
- Able to analyze user behavior, postulate theories, run experiments, and iterate rapidly.
- You have at least 2-3 years of experience in product management, preferably within e-commerce or retail tech.
Why you'll love this role:
- Be a Pet Parent: Your empathy and firsthand experience with pets are our superpowers—this is a non-negotiable for us!
- Disrupt a Booming Market: Play a key role in a pet care industry growing at a 25% CAGR, backed by top investors.
- Impact-Driven: Improve the post-purchase experience, solve customer queries, and empower agents, making a real difference for pet families.
- Ready to combine your product prowess with your passion for pets? Come build the future of pet care with us!
Java R&D Pvt. Ltd. is a software development & IT consulting company headquartered in Virginia, USA, providing hi-tech information technology solutions and manpower staffing. Our clients range from Fortune 500 companies to state, federal, and small and medium-sized businesses.
Job Title - Technical Interview Coordinator
Location – Bhopal (WFO)
Working Days- 5 (Monday- Friday)
Shift Timings - 12:00pm to 9:00pm IST
Roles and Responsibilities:
- Schedule and coordinate all stages of the technical interview process (phone screens, coding challenges, panel interviews, final rounds, etc.).
- Serve as the main point of contact for candidates throughout the interview process.
- Collaborate with recruiters and hiring managers to understand role requirements and interviewer availability.
- Manage interview logistics, including calendar invites, video conference links, onsite coordination.
- Maintain candidate data and interview status in ATS (Applicant Tracking System).
- Troubleshoot any last-minute changes or technical issues during interviews.
- Monitor and follow up on interview feedback, ensuring timely responses and next steps.
- Support continuous improvement of interview operations through data tracking and process optimization.
About the Role
As one of the key members of the development team, you will have the unique opportunity to redefine the architecture of our suite of products.
You will get to work directly with our founding team to deliver the most valuable and joyful experience to our customers. If you are looking to make a real impact on real people's lives and accelerate your career to new heights in the meantime, then this is the perfect opportunity for you. You will help refactor certain code to bring greater flexibility and a microservice architecture. CurbWaste intends to adopt an event-driven workflow architecture. Benchmark design patterns for security and scalability will need to be implemented.
Requirements
What you will do
• Review current code and anticipate engineering bottlenecks
• Design and develop REST API interfaces
• Optimize queries
• Design a SOLR-based search solution
• Review peers' code
• Identify code libraries and design patterns
What you will need
• Experience building out RESTful APIs for front-end clients
• Basic knowledge of at least one modern front-end framework such as React, Polymer, Angular or Vue.js
• Expert-level understanding of NodeJS and frameworks such as ExpressJS, Fast, LoopBack (preferred)
• Experience with a version control tool (we use git – GitHub and BitBucket)
• Familiarity with modern DevOps tools such as Ansible, Docker, Terraform, Fabric, Kubernetes, etc.
• SOLR or ElasticSearch experience
• Advanced knowledge of NoSQL (and SQL) databases – MongoDB, PostgreSQL
• Extensive experience with any caching technology – Redis (preferred), Memcached
• Experience with AWS services like Elastic Beanstalk, S3, EC2, Lambda, API Gateway, SQS, etc.
• Prior experience in notifications delivery tools - FCM
• Understanding of patterns and techniques for building scalable back-end infrastructure, including caching, rate limiting, authentication, and authorization schemes
• Experience with programming languages such as Golang, TypeScript
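The rate-limiting pattern mentioned in the requirements above is commonly implemented as a token bucket. The sketch below is illustrative only (Python is used for brevity even though the role's stack is NodeJS), under the assumption of a simple per-process limiter; production systems usually back this with Redis for cross-instance state.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter sketch (illustrative, not production code)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A bucket allowing bursts of 2, refilling 1 token per second:
bucket = TokenBucket(rate=1.0, capacity=2)
results = [bucket.allow() for _ in range(3)]  # three back-to-back requests
print(results)
```

The first two back-to-back requests pass on the initial burst allowance; the third is rejected until the bucket refills.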
Sr. Data Engineer (Data Warehouse-Snowflake)
Experience: 5+yrs
Location: Pune (Hybrid)
As a Senior Data Engineer with Snowflake expertise, you are a curious, innovative subject matter expert who mentors young professionals. You are a key person in converting our vision and data strategy into data solutions and delivering them. With your knowledge you will help create data-driven thinking within the organization, not just within data teams, but also in the wider stakeholder community.
Skills Preferred
- Advanced written, verbal, and analytic skills, and demonstrated ability to influence and facilitate sustained change. Ability to convey information clearly and concisely to all levels of staff and management about programs, services, best practices, strategies, and organizational mission and values.
- Proven ability to focus on priorities, strategies, and vision.
- Very good understanding of Data Foundation initiatives, such as Data Modelling, Data Quality Management, Data Governance, Data Maturity Assessments and Data Strategy, in support of the key business stakeholders.
- Actively deliver the roll-out and embedding of Data Foundation initiatives in support of the key business programs advising on the technology and using leading market standard tools.
- Coordinate the change management process, incident management and problem management process.
- Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
- Drive implementation efficiency and effectiveness across the pilots and future projects to minimize cost, increase speed of implementation and maximize value delivery
Knowledge Preferred
- Extensive knowledge and hands-on experience with Snowflake and its components, such as user/group management, data store/warehouse management, external stages/tables, working with semi-structured data, Snowpipe, etc.
- Implement and manage CI/CD for migrating and deploying Snowflake code to higher environments.
- Proven experience with Snowflake access control and authentication, data security, data sharing, the VS Code extension for Snowflake, replication and failover, and SQL optimization; the analytical ability to quickly troubleshoot and debug development and production issues is key to success in this role.
- Proven technology champion in working with relational and data warehouse databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Highly Experienced in building and optimizing complex queries. Good with manipulating, processing, and extracting value from large, disconnected datasets.
- Your experience in handling big data sets and big data technologies will be an asset.
- Proven champion with in-depth knowledge of any one of the scripting languages: Python, SQL, PySpark.
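Working with semi-structured data, as listed above, typically means flattening nested JSON into relational rows. The sketch below is a rough Python analogue of what Snowflake's LATERAL FLATTEN does over a VARIANT column; the payload and field names are made up for illustration.

```python
import json

# A semi-structured record of the kind that might land in a VARIANT column.
raw = json.loads("""
{"order_id": 1, "items": [
    {"sku": "A1", "qty": 2},
    {"sku": "B7", "qty": 1}
]}
""")

# Flatten the nested array into one row per item, repeating the parent key,
# mirroring: SELECT order_id, f.value:sku, f.value:qty
#            FROM orders, LATERAL FLATTEN(input => payload:items) f
rows = [(raw["order_id"], item["sku"], item["qty"]) for item in raw["items"]]
print(rows)
```

In Snowflake itself this transformation would be expressed in SQL and usually materialized via a view or a Snowpipe-fed staging table.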
Primary responsibilities
- You will be an asset in our team bringing deep technical skills and capabilities to become a key part of projects defining the data journey in our company, keen to engage, network and innovate in collaboration with company wide teams.
- Collaborate with the data and analytics team to develop and maintain a data model and data governance infrastructure using a range of different storage technologies that enables optimal data storage and sharing using advanced methods.
- Support the development of processes and standards for data mining, data modeling and data protection.
- Design and implement continuous process improvements for automating manual processes and optimizing data delivery.
- Assess and report on the unique data needs of key stakeholders and troubleshoot any data-related technical issues through to resolution.
- Work to improve data models that support business intelligence tools, improve data accessibility and foster data-driven decision making.
- Ensure traceability of requirements from Data through testing and scope changes, to training and transition.
- Manage and lead technical design and development activities for implementation of large-scale data solutions in Snowflake to support multiple use cases (transformation, reporting and analytics, data monetization, etc.).
- Translate advanced business data, integration and analytics problems into technical approaches that yield actionable recommendations, across multiple, diverse domains; communicate results and educate others through design and build of insightful presentations.
- Exhibit strong knowledge of the Snowflake ecosystem and can clearly articulate the value proposition of cloud modernization/transformation to a wide range of stakeholders.
Relevant work experience
Bachelor's in a Science, Technology, Engineering, Mathematics or Computer Science discipline, or equivalent, with 7+ years of experience in enterprise-wide data warehousing, governance, policies, procedures, and implementation.
Aptitude for working with data, interpreting results, business intelligence and analytic best practices.
Business understanding
Good knowledge and understanding of Consumer and industrial products sector and IoT.
Good functional understanding of solutions supporting business processes.
Skill Must have
- Snowflake 5+ years
- Overall different Data warehousing techs 5+ years
- SQL 5+ years
- Data warehouse designing experience 3+ years
- Experience with cloud and on-prem hybrid models in data architecture
- Knowledge of Data Governance and strong understanding of data lineage and data quality
- Programming & Scripting: Python, PySpark
- Database technologies such as Traditional RDBMS (MS SQL Server, Oracle, MySQL, PostgreSQL)
Nice to have
- Demonstrated experience in modern enterprise data integration platforms such as Informatica
- AWS cloud services: S3, Lambda, Glue and Kinesis and API Gateway, EC2, EMR, RDS, Redshift and Kinesis
- Good understanding of Data Architecture approaches
- Experience in designing and building streaming data ingestion, analysis and processing pipelines using Kafka, Kafka Streams, Spark Streaming, Stream sets and similar cloud native technologies.
- Experience with implementation of operations concerns for a data platform such as monitoring, security, and scalability
- Experience working in DevOps, Agile, Scrum, Continuous Delivery and/or Rapid Application Development environments
- Building mock and proof-of-concepts across different capabilities/tool sets exposure
- Experience working with structured, semi-structured, and unstructured data, extracting information, and identifying linkages across disparate data sets

Company: a multinational retail corporation
Designation: Sr. Software Engineer – React Native
Work Location: Chennai/Bangalore
Experience:
- 4 - 8 years of experience in development, with a minimum of 2 years in React Native.
Description:
- Should have a minimum of 2+ years of hands-on experience in React Native development.
- Knowledge of React tools including React.js, Webpack, Enzyme, Redux, and Flux
- Experience with JavaScript, TypeScript, CSS, HTML5 and other front-end languages.
- Good understanding of Android/iOS design guidelines and SDKs.
- Exposure to building React Native components in native iOS and Android.
- Ability to solve complex technical, scalability or performance challenges.
- Familiarity with code versioning tools such as Git, SVN, GitLab.
Educational Requirements:
B.E/B.Tech (CS/IT/ECE), MSc (CS), MCA
About Paytail:
We are a young fintech company providing BNPL solutions and enabling instant digital finance and easy EMIs across millions of merchants across the country. We are on a mission to simplify affordable buying for consumers across segments. Paytail is backed by Cholamandalam Investments and a few large angel investors.
Why Paytail:
Complete Ownership: Take responsibility for your work with little or no interference, get rewarded for the successes, and be accountable for the slip-ups.
Feedback-driven Culture: "Feedback is the breakfast of champions", and we believe feedback helps people and organizations improve.
Work with top talent: Get doors open for new learning, our team is being led by the best minds in the industry.
Open Work Culture: Everyone, up to the CEO, is easily accessible. Have any new ideas you think can revolutionize the workplace? Or do you have some concerns about your work? Feel free to talk to the leadership.
Skills, Responsibilities & Requirements:
- Identify qualified prospects and navigate company structures to identify decision-makers.
- Use a combination of outreach mechanisms to nurture leads (calls, email, marketing automation tools like Outreach, LinkedIn InMail, Lusha, etc.)
- Convert leads into business through persistence.
- Learn, leverage, and help evolve our demand generation process.
- Generate appointments/meetings by means of proactive outbound prospecting.
- Work directly with sales and marketing to discover opportunities from leads.
- Previous experience working with SaaS/software product/fintech companies is a plus.
Location: Gurgaon/Gurugram
Responsibilities:
- Build a high-quality web application from scratch.
- Design, build and maintain efficient, reusable, and reliable code to ensure the applications' best possible performance, quality, and responsiveness
- Work as part of a team developing web applications and services using Agile development methods.
- Contribute to team and organizational improvements in process and infrastructure.
- Code, test and operate Laravel based services.
- Lead the entire web application development life cycle, from the concept stage to delivery and post-launch support
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Document the development process, architecture, and standard components
- Developing JSON, REST, and SOAP APIs/web services.
Skills Requirements
- Strong expertise in PHP programming on Laravel with deep understanding on Laravel features and libraries
- Restful API development expertise is a must
- Must be proficient in web development technologies like HTML5, CSS3, JavaScript, Node.js
- Knowledge of jQuery and HTML
- Knowledge of MySQL and query optimization
- Knowledge of Git
- Good written communication skills.
- You must be humble and a team player.
- Experience in handling a team and Project Management.
- Understanding of MVC design patterns
- Hands-on experience with an e-commerce website.
Good Communication skills
Strong in Adobe Experience Manager Development
Location: PAN India



