

Cymetrix Software
https://cymetrixsoft.com
About
Cymetrix is a global CRM and Data Analytics consulting company. It has expertise across industries such as manufacturing, retail, BFSI, NPS, Pharma, and Healthcare. It has successfully implemented CRM and related business process integrations for more than 50 clients.
Catalyzing Tangible Growth: Our pivotal role involves facilitating and driving actual growth for clients. We're committed to becoming a catalyst for dynamic transformation within the business landscape.
Niche focus, limitless growth: Cymetrix specializes in CRM, Data, and AI-powered technologies, offering tailored solutions and profound insights. This focused approach paves the way for exponential growth opportunities for clients.
A Digital Transformation Partner: Cymetrix aims to deliver the necessary support, expertise, and solutions that drive businesses to innovate with unwavering assurance. Our commitment fosters a culture of continuous improvement and growth, ensuring your innovation journey is successful.
The Cymetrix Software team is under the leadership of agile, entrepreneurial, and veteran technology experts who are devoted to augmenting the value of the solutions they are delivering.
Our certified team of 150+ consultants excels in Salesforce products. Our experience in designing and developing products and IP on the Salesforce platform enables us to deliver industry-specific, customized solutions with intuitive user interfaces.
Jobs at Cymetrix Software

Role Overview
We are looking for a highly skilled and intellectually curious Senior Data Scientist with 7+ years of experience in applying advanced machine learning and AI techniques to solve complex business problems. The ideal candidate will have deep expertise in Classical Machine Learning, Deep Learning, Natural Language Processing (NLP), and Generative AI (GenAI), along with strong hands-on coding skills and a proven track record of delivering impactful data science solutions. This role requires a blend of technical excellence, business acumen, and a collaborative mindset.
Key Responsibilities
- Design, develop, and deploy ML models using classical algorithms (e.g., regression, decision trees, ensemble methods) and deep learning architectures (CNNs, RNNs, Transformers).
- Build NLP solutions for tasks such as text classification, entity recognition, summarization, and conversational AI.
- Develop and fine-tune GenAI models for use cases like content generation, code synthesis, and personalization.
- Architect and implement Retrieval-Augmented Generation (RAG) systems for enhanced contextual AI applications (a minimal sketch follows this list).
- Collaborate with data engineers to build scalable data pipelines and feature stores.
- Perform advanced feature engineering and selection to improve model accuracy and robustness.
- Work with large-scale structured and unstructured datasets using distributed computing frameworks.
- Translate business problems into data science solutions and communicate findings to stakeholders.
- Present insights and recommendations through compelling storytelling and visualization.
- Mentor junior data scientists and contribute to internal knowledge sharing and innovation.
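As a concrete illustration of the RAG responsibility above, the sketch below uses TF-IDF retrieval from scikit-learn in place of a learned embedding model and a vector database, and stubs out the LLM call entirely; the documents, the call_llm helper, and the prompt format are hypothetical, not part of any specific stack named in this posting.

```python
# Minimal sketch of a RAG flow: TF-IDF retrieval stands in for a vector
# database, and call_llm is a hypothetical stub for a hosted model
# (e.g., an Azure OpenAI deployment). Illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Delta Lake provides ACID transactions on cloud object storage.",
    "Informatica MDM builds golden records through match and merge rules.",
    "Salesforce Service Cloud supports case management for banking clients.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    return [documents[i] for i in scores.argsort()[::-1][:k]]

def call_llm(prompt: str) -> str:
    # Stub: a real system would call a hosted LLM endpoint here.
    return f"[model response to a {len(prompt)}-character prompt]"

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

print(answer("How are golden records created?"))
```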
Required Qualifications
- 7+ years of experience in data science, machine learning, and AI.
- Strong academic background in Computer Science, Statistics, Mathematics, or related field (Master’s or PhD preferred).
- Proficiency in Python, SQL, and ML libraries (scikit-learn, TensorFlow, PyTorch, Hugging Face).
- Experience with NLP and GenAI tools (e.g., Azure AI Foundry, Azure AI Studio, GPT, LLaMA, LangChain).
- Hands-on experience with Retrieval-Augmented Generation (RAG) systems and vector databases.
- Familiarity with cloud platforms (Azure preferred, AWS/GCP acceptable) and MLOps tools (MLflow, Airflow, Kubeflow).
- Solid understanding of data structures, algorithms, and software engineering principles.
- Experience with Azure, Azure Copilot Studio, and Azure Cognitive Services.
- Experience with Azure AI Foundry is a strong added advantage.
Preferred Skills
- Exposure to LLM fine-tuning, prompt engineering, and GenAI safety frameworks.
- Experience in domains such as finance, healthcare, retail, or enterprise SaaS.
- Contributions to open-source projects, publications, or patents in AI/ML.
Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement abilities.
- Ability to work independently and collaboratively in cross-functional teams.
- Passion for continuous learning and innovation.
Remote opening
Minimum experience: 3.5 years
What you’ll do:
You will work as a senior software engineer in the healthcare domain, focusing on module-level integration and collaboration across other areas of projects, and helping healthcare organizations achieve their business goals through full-stack technologies, cloud services, and DevOps. You will work with architects from other specialties, such as cloud engineering, data engineering, and ML engineering, to create platforms, solutions, and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product, and AI marketplaces.
Role & Responsibilities:
We are looking for a Full Stack Developer who is motivated to combine the art of design with programming. Responsibilities will include translating UI/UX design wireframes into the actual code that produces the visual elements of the application. You will work with the UI/UX designer and bridge the gap between graphical design and technical implementation, taking an active role on both sides and defining how the application looks as well as how it works.
• Develop new user-facing features
• Build reusable code and libraries for future use
• Ensure the technical feasibility of UI/UX designs
• Optimize application for maximum speed and scalability
• Ensure that all user input is validated before it is submitted to the back-end
• Collaborate with other team members and stakeholders
• Provide stable technical solutions that are robust and scalable, as per business needs
Skills expectation:
• Must have
o Frontend:
Proficient understanding of web markup, including HTML5, CSS3
Basic understanding of server-side CSS pre-processing platforms, such as LESS and SASS
Proficient understanding of client-side scripting and JavaScript frameworks, including jQuery
Good understanding of at least one advanced JavaScript library or framework, such as Angular (15+), React, Vue.js, Backbone, or Knockout.
Good understanding of asynchronous request handling, partial page updates, and AJAX.
Proficient understanding of cross-browser compatibility issues and ways to work around them.
Experience with standard Angular testing frameworks (e.g., Jasmine and Karma).
Experience Level
10+ years of experience in data engineering, with at least 3–5 years providing architectural guidance, leading teams, and standardizing enterprise data solutions. Must have deep expertise in Databricks, GCP, and modern data architecture patterns.
Key Responsibilities
- Provide architectural guidance and define standards for data engineering implementations.
- Lead and mentor a team of data engineers, fostering best practices in design, development, and operations.
- Own and drive improvements in performance, scalability, and reliability of data pipelines and platforms.
- Standardize data architecture patterns and reusable frameworks across multiple projects.
- Collaborate with cross-functional stakeholders (Product, Analytics, Business) to align data solutions with organizational goals.
- Design data models, schemas, and dataflows for efficient storage, querying, and analytics.
- Establish and enforce strong data governance practices, ensuring security, compliance, and data quality.
- Work closely with governance teams to implement lineage, cataloging, and access control in compliance with standards.
- Design and optimize ETL pipelines using Databricks, PySpark, and SQL (a minimal PySpark sketch follows this list).
- Ensure robust CI/CD practices are implemented for data workflows, leveraging Terraform and modern DevOps practices.
- Leverage GCP services such as Cloud Functions, Cloud Run, BigQuery, Pub/Sub, and Dataflow for building scalable solutions.
- Evaluate and adopt emerging technologies, with exposure to Gen AI and advanced analytics capabilities.
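For illustration, here is a minimal PySpark sketch of the kind of Databricks ETL step described above, moving raw JSON into a partitioned Delta table; the bucket path and table names are placeholders, not real project assets.

```python
# Minimal Databricks/PySpark ETL sketch: land raw JSON, clean it, and
# write a partitioned Delta table. Paths and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("gs://example-bucket/landing/events/")  # hypothetical GCS path

cleaned = (
    raw.filter(F.col("event_id").isNotNull())             # drop malformed rows
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # normalize timestamps
    .dropDuplicates(["event_id"])                          # keep re-runs idempotent
    .withColumn("event_date", F.to_date("event_ts"))       # partition column
)

(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("analytics.events_clean")                 # governed Delta table
)
```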
Qualifications & Skills
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Extensive hands-on experience with Databricks (Autoloader, DLT, Delta Lake, CDF) and PySpark.
- Expertise in SQL and advanced query optimization.
- Proficiency in Python for data engineering and automation tasks.
- Strong expertise with GCP services: Cloud Functions, Cloud Run, BigQuery, Pub/Sub, Dataflow, GCS.
- Deep understanding of CI/CD pipelines, infrastructure-as-code (Terraform), and DevOps practices.
- Proven ability to provide architectural guidance and lead technical teams.
- Experience designing data models, schemas, and governance frameworks.
- Knowledge of Gen AI concepts and ability to evaluate practical applications.
- Excellent communication, leadership, and stakeholder management skills.
Hybrid: 3 days in office
Maximum compensation: 27 LPA
Must-Have Skills:
5–10 years of experience in Data Engineering or Master Data Management.
5+ years hands-on experience with Informatica MDM (Multi-Domain Edition).
Strong understanding of MDM concepts: golden record, hierarchy management, trust/survivorship, data governance.
Proficient in:
Informatica MDM Hub Console, Provisioning Tool, Services Integration Framework (SIF).
ActiveVOS workflows, user exits (Java), and match/merge tuning.
SQL, PL/SQL, and data modeling for MDM.
Experience integrating MDM with upstream and downstream systems (ERP, CRM, Data Lake, etc.).
Knowledge of data quality integration using Informatica Data Quality (IDQ).
Key Responsibilities:
● Configure and implement Informatica MDM Hub, including subject area models, base objects, landing tables, and relationships.
● Develop and fine-tune match & merge rules, trust scores, and survivorship logic for creating golden records (a conceptual sketch of this logic follows this list).
● Design and build ActiveVOS workflows for data stewardship, exception handling, and business process approvals.
● Collaborate with data stewards and business teams to define data standards, ownership models, and governance rules.
● Integrate data from various source systems via batch processing, REST APIs, or message queues.
● Set up and maintain data quality checks and validations (in conjunction with Informatica Data Quality (IDQ)) to ensure completeness, accuracy, and consistency.
● Build and customize Informatica MDM user exits (Java), SIF APIs, and business entity services as needed.
● Support MDM data loads, synchronization jobs, batch group configurations, and performance tuning.
● Work with cross-functional teams to ensure alignment with overall data architecture and governance standards.
● Participate in Agile ceremonies, sprint planning, and documentation of technical designs and user guides.
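Informatica's match/merge and survivorship behavior is configured in the MDM Hub Console rather than hand-coded, but the underlying logic can be illustrated in plain Python. The sketch below is a conceptual stand-in only, not Informatica code or its API; the trust scores, source systems, and fields are invented for the example.

```python
# Conceptual illustration of match/merge and trust-based survivorship,
# mimicking what the MDM Hub Console configures. Not Informatica's API;
# trust scores and fields are invented for the example.
from difflib import SequenceMatcher

TRUST = {"CRM": 0.9, "ERP": 0.7, "WEB": 0.4}  # higher trust wins survivorship

def is_match(a: dict, b: dict, threshold: float = 0.6) -> bool:
    """Fuzzy-match on name plus an exact match on postal code."""
    score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return score >= threshold and a["postal_code"] == b["postal_code"]

def survive(records: list[dict]) -> dict:
    """Build a golden record, taking each field from the most trusted source."""
    golden = {}
    fields = {f for r in records for f in r if f != "source"}
    for field in fields:
        candidates = [r for r in records if r.get(field)]
        if candidates:
            golden[field] = max(candidates, key=lambda r: TRUST[r["source"]])[field]
    return golden

a = {"source": "CRM", "name": "Acme Corp.", "postal_code": "400064", "phone": ""}
b = {"source": "ERP", "name": "ACME Corporation", "postal_code": "400064",
     "phone": "+91-22-5550100"}
if is_match(a, b):
    print(survive([a, b]))  # name from CRM (higher trust), phone from ERP
```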
Nice-to-Have Skills:
● Experience with Informatica EDC and Axon for metadata and governance integration.
● Exposure to cloud deployments of Informatica MDM (on GCP, Azure, or AWS).
● Familiarity with data stewardship concepts, data lineage, and compliance frameworks (GDPR, HIPAA, etc.).
● Basic knowledge of DevOps tools for MDM deployments (e.g., Git, Jenkins).
Must have skills:
1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP: building pipelines (Python/Java), scripting, best practices, and common challenges
3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP (a minimal sketch follows this list)
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs. NoSQL; types of NoSQL databases (at least two)
5. Data warehouse concepts: beginner to intermediate level
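As a sketch of item 3 above, the snippet below pulls messages from a Pub/Sub subscription and streams them into BigQuery using the official Python clients; the project, subscription, and table names are placeholders, and a production pipeline would more likely run this logic in Dataflow.

```python
# Sketch: stream Pub/Sub messages into BigQuery. The project,
# subscription, and table names are placeholders; production workloads
# would typically run in Dataflow instead of a hand-rolled subscriber.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT = "example-project"                     # hypothetical
SUBSCRIPTION = f"projects/{PROJECT}/subscriptions/events-sub"
TABLE = f"{PROJECT}.raw.events"

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    row = json.loads(message.data)
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
    if errors:
        message.nack()                          # let Pub/Sub redeliver
    else:
        message.ack()

streaming_pull = subscriber.subscribe(SUBSCRIPTION, callback=handle)
streaming_pull.result()                         # block and consume indefinitely
```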
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform, and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions
● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use
● Analyze and resolve problems and provide technical assistance as necessary; partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform into reporting and analytics
● Define and document the use of BI through user experience/use cases, prototypes, testing, and deployment of BI solutions
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; and continuously validate reports and dashboards and suggest improvements (a minimal data-quality sketch follows this list)
● Train business end-users, IT analysts, and developers.
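As a minimal illustration of the data-quality validation mentioned in this list, the sketch below gates a staging extract before a warehouse load; the file path and column names are hypothetical.

```python
# Minimal data-quality gate for a staging extract. The file path and
# column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("staging/orders.csv")

checks = {
    "no_null_keys": df["order_id"].notna().all(),
    "unique_keys": df["order_id"].is_unique,
    "positive_amounts": (df["amount"] > 0).all(),
    "parseable_dates": pd.to_datetime(df["order_date"], errors="coerce").notna().all(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```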
Advanced SQL and data modeling skills (designing dimensional layers, 3NF, denormalized views, and semantic layers), and expertise in GCP services.
Role & Responsibilities:
● Design and implement robust semantic layers for data systems on Google Cloud Platform (GCP)
● Develop and maintain complex data models, including dimensional models, 3NF structures, and denormalized views
● Write and optimize advanced SQL queries for data extraction, transformation, and analysis
● Utilize GCP services to create scalable and efficient data architectures
● Collaborate with cross-functional teams to translate business requirements (specified in mapping sheets or legacy DataStage jobs) into effective data models
● Implement and maintain data warehouses and data lakes on GCP
● Design and optimize ETL/ELT processes for large-scale data integration
● Ensure data quality, consistency, and integrity across all data models and semantic layers
● Develop and maintain documentation for data models, semantic layers, and data flows
● Participate in code reviews and implement best practices for data modeling and database design
● Optimize database performance and query execution on GCP
● Provide technical guidance and mentorship to junior team members
● Stay updated with the latest trends and advancements in data modeling, GCP services, and big data technologies
● Collaborate with data scientists and analysts to enable efficient data access and analysis
● Implement data governance and security measures within the semantic layer and data model
Work from office: Malad, Mumbai
Six-day work week, with the 1st and 3rd Saturdays off
AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.
Roles and Responsibilities
1. Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.
2. Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.
3. AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).
4. Integration and Microservices: Integrate third-party APIs and services. Develop and manage a microservices architecture for modular application development.
5. Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.
6. Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.
7. DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.
8. Agile Development:
Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives.
Deliver tasks within defined timelines while maintaining high quality.
Required Skills
Strong proficiency in Node.js and JavaScript/TypeScript.
Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.
Proficient with AWS services including Lambda, S3, EC2, and API Gateway.
Experience with RESTful API design and GraphQL (optional).
Knowledge of containerization using Docker is a plus.
Strong problem-solving and debugging skills.
Familiarity with tools like Git, Jenkins, and Jira.
1. Software Development Engineer - Salesforce
What we ask for
We are looking for strong engineers to build best-in-class systems for commercial and wholesale banking at the Bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing and security model, which will allow us to architect and develop robust applications.
You will work closely with business and product teams to build applications that give end users an intuitive, clean, minimalist, easy-to-navigate experience.
Develop systems by applying software development principles and clean-code practices, so that they are scalable, secure, highly resilient, and low-latency.
You should be open to working in a start-up environment and have the confidence to deal with complex issues while keeping solutions and project objectives as your guiding North Star.
Technical Skills:
● Strong hands-on frontend development using JavaScript and LWC
● Expertise in backend development using Apex, Flows, Async Apex
● Understanding of Database concepts: SOQL, SOSL and SQL
● Hands-on experience in API integration using SOAP, REST APIs, and GraphQL
● Experience with ETL tools, data migration, and data governance
● Experience with Apex design patterns, integration patterns, and the Apex testing framework
● Follow an agile, iterative execution model using CI/CD tools like Azure DevOps, GitLab, and Bitbucket
● Should have worked with at least one programming language (Java, Python, C++) and have a good understanding of data structures
Preferred qualifications
● Graduate degree in engineering
● Experience developing with India stack
● Experience in fintech or banking domain
Hybrid work mode
(Azure) EDW: Experience loading star-schema data warehouses using framework architectures, including loading type 2 dimensions (a sketch of a type 2 dimension load follows). Experience ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), knowledge of Azure cloud services, data analysis (SQL), data warehousing, and documentation (BRD, FRD, user story creation).
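To make the type 2 dimension loading concrete, here is a hedged sketch using a Delta Lake MERGE on Azure Databricks; dim_customer, stg_customer, the tracked address column, and the start_date/end_date/is_current housekeeping columns are assumed placeholder schema, not taken from this posting.

```python
# Sketch of a type 2 dimension load with Delta Lake on Databricks.
# dim_customer / stg_customer and the tracked `address` column are
# placeholders; assumes the dimension carries start_date/end_date/is_current.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
updates = spark.table("stg_customer")            # today's source extract
dim = DeltaTable.forName(spark, "dim_customer")

# Step 1: close out current rows whose tracked attributes changed.
(
    dim.alias("d")
    .merge(updates.alias("u"),
           "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "end_date": "current_date()"},
    )
    .execute()
)

# Step 2: insert fresh versions for changed or brand-new customers
# (anyone left without an open row after step 1).
open_rows = spark.table("dim_customer").filter("is_current = true")
new_versions = (
    updates.join(open_rows, "customer_id", "left_anti")
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
)
new_versions.write.format("delta").mode("append").saveAsTable("dim_customer")
```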
Key Responsibilities:
● Design, develop, and maintain scalable web applications using .NET Core, .NET Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL.
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and modernization efforts.
Required Skills & Experience:
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience in SQL Server to PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application hosting and data processing.
● Excellent problem-solving and communication skills.
Similar companies
About the company
Who we are
We are Software Craftspeople. We are proud of the way we work and the code we write. We embrace and are evangelists of eXtreme Programming practices. We strongly believe in being a DevOps organization, where developers own the entire release cycle and thus own quality. And most importantly, we never stop learning!
We work with product organizations to help them scale or modernize their legacy technology solutions. We work with startups to help them operationalize their ideas efficiently. We work with large established institutions to help them create internal applications to automate manual operations and achieve scale.
We design software, and we design the team as well as the organizational strategy required to successfully release robust and scalable products. Incubyte strives to find people who are passionate about coding, learning, and growing along with us. We work with a limited number of clients at a time on dedicated, long-term commitments, with the aim of bringing a product mindset into services. More on our website: https://www.incubyte.co/
Join our team! We're always looking for like-minded people!
About the company
Data Axle is a product company that offers various data and technology solutions, including software-as-a-service (SaaS) and data-as-a-service (DaaS). These solutions help businesses manage and leverage data for marketing, sales, and business intelligence.
Data Axle is a data-driven marketing solutions provider that helps clients with clean data, lead generation, strategy development, campaign design, and day-to-day execution needs. It solves the problem of inaccurate and incomplete data, enabling businesses to make informed decisions and drive growth. Data Axle operates in various industries, including healthcare, finance, retail, and technology.
About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.
Data Axle India is recognized as a Great Place to Work!
This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.
About the company
Welcome to Neogencode Technologies, an IT services and consulting firm that provides innovative solutions to help businesses achieve their goals. Our team of experienced professionals is committed to providing tailored services to meet the specific needs of each client. Our comprehensive range of services includes software development, web design and development, mobile app development, cloud computing, cybersecurity, digital marketing, and skilled resource acquisition. We specialize in helping our clients find the right skilled resources to meet their unique business needs. At Neogencode Technologies, we prioritize communication and collaboration with our clients, striving to understand their unique challenges and provide customized solutions that exceed their expectations. We value long-term partnerships with our clients and are committed to delivering exceptional service at every stage of the engagement. Whether you are a small business looking to improve your processes or a large enterprise seeking to stay ahead of the competition, Neogencode Technologies has the expertise and experience to help you succeed. Contact us today to learn more about how we can support your business growth and provide skilled resources to meet your business needs.
About the company
We simplify the business employment process by matching the right people with apt career opportunities.
About the company
We are building the first AI-driven brand health measurement system in the world. We conduct real conversations with real customers at scale to understand the health of a brand and provide actionable insights.
About the company
Borderless Access is an award-winning insights solutions firm that blends digital-first research products with powerful analytics to connect brands with real-world consumer experiences. With a global panel of B2C, B2B and healthcare audiences and a deep industry footprint across advertising, retail, technology, mobility, healthcare and more, we enable brands, agencies and consultancies to ask the right questions—and find the right answers.
By harnessing AI, machine learning and robust quality-control practices, we deliver scalable, agile solutions that generate actionable insights and sustainable ROI. From market sizing and forecasting to concept testing and brand tracking, our customised research ecosystem gives clients the confidence to navigate consumer behaviour, market shifts and global growth opportunities.






