

Cymetrix Software
https://cymetrixsoft.com
About
Cymetrix is a global CRM and Data Analytics consulting company with expertise across industries such as manufacturing, retail, BFSI, NPS, Pharma, and Healthcare. It has successfully implemented CRM and related business process integrations for more than 50 clients.
Catalyzing Tangible Growth: Our pivotal role involves facilitating and driving actual growth for clients. We're committed to becoming a catalyst for dynamic transformation within the business landscape.
Niche focus, limitless growth: Cymetrix specializes in CRM, Data, and AI-powered technologies, offering tailored solutions and profound insights. This focused approach paves the way for exponential growth opportunities for clients.
A Digital Transformation Partner: Cymetrix aims to deliver the necessary support, expertise, and solutions that drive businesses to innovate with unwavering assurance. Our commitment fosters a culture of continuous improvement and growth, ensuring your innovation journey is successful.
The Cymetrix Software team is led by agile, entrepreneurial, and veteran technology experts devoted to augmenting the value of the solutions they deliver.
Our certified team of 150+ consultants excels in Salesforce products. Our experience designing and developing products and IPs on the Salesforce platform enables us to deliver industry-specific, customized solutions with intuitive user interfaces.
Jobs at Cymetrix Software
Role: Software Development (Senior and Associate)
Experience Level: 4 to 9 Years
Work location: Remote
What you’ll do:
We are seeking a Mid-Level Node.js Developer to join our development team as an individual contributor. You will design, develop, and maintain scalable microservices for diverse client projects, working on enterprise applications that require high performance, reliability, and seamless deployment in containerized environments.
Key Responsibilities:
● Develop and maintain scalable Node.js microservices for diverse client projects
● Implement robust REST APIs with proper error handling and validation
● Write comprehensive unit and integration tests ensuring high code quality
● Design portable, efficient solutions deployable across different client environments
● Collaborate with cross-functional teams and client stakeholders
● Optimize application performance for high-concurrency scenarios
● Implement security best practices for enterprise applications
● Participate in code reviews and maintain coding standards
● Support deployment and troubleshooting in client environments
Must-have skills:
Core Technical Expertise:
● Node.js: 4+ years of production experience with Node.js (ES6+, Async/Await, Promises, Event Loop understanding)
● Frameworks: Strong hands-on experience with Express.js, Fastify, or NestJS
● REST API Development: Proven experience designing and implementing RESTful web services and middleware
● JavaScript/TypeScript: Proficient in modern JavaScript (ES6+) and TypeScript for type-safe development
● Testing: Experience with testing frameworks (Jest, Mocha, Chai), unit testing, integration testing, mocking
Microservices & Deployment:
● Containerization: Hands-on Docker experience for packaging and deploying Node.js applications
● Microservices Architecture: Understanding of service decomposition, inter-service communication, event-driven architecture
● Abstraction & Portability: Environment-agnostic design, configuration management (dotenv, config modules)
● Build Tools: NPM/Yarn for dependency management, understanding of package.json
Good-to-have skills:
Advanced Technical:
● Advanced Frameworks: NestJS, Koa.js, Hapi.js
● Orchestration: Kubernetes, Docker
● Cloud Platforms: Alibaba Cloud, Azure, or GCP services and deployment
● Message Brokers: Apache Kafka, RabbitMQ for asynchronous communication
● Databases: Both SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra)
● API Gateway: Express Gateway, Kong API Gateway
Development & Operations:
● CI/CD pipelines (Jenkins, GitLab CI/CD)
● Monitoring & Observability (Winston, Morgan, Prometheus, New Relic)
● GraphQL with Apollo Server or similar
● Security best practices (Helmet.js, authentication, authorization)
Client-Facing Experience:
● Experience working in service-based organizations
● Adaptability to different domain requirements
● Understanding of various industry standards and compliance requirements
Why Join Quantiphi?
● Be part of an award-winning Google Cloud partner recognized for innovation and impact.
● Work on cutting-edge GCP-based data engineering and AI projects.
● Collaborate with a global team of data scientists, engineers, and AI experts.
● Access continuous learning, certifications, and leadership development opportunities.
Remote Opportunity
Contact Center Solution Design and Implementation
IVR Systems Development (Google CCAI, Genesys Cloud, Amazon Connect, Five9, Cisco, Avaya, or other CCaaS platforms)
API, Web Services, Messaging, CMS, Ticketing Systems, and Backend Integrations
Automatic Call Distribution (ACD), Computer Telephony Integration (CTI), Call Routing, Queues, RTI, Queue Management, Skill-Based Routing, and Escalation Paths
Live Agent Operations (Disposition Codes, AHT, Warm/Cold Transfers)
Call Flow, Agent Script, and Customer Workflow Design
SIP and Telephony Integration
Communication and Interpersonal Skills
Team Collaboration and Independent Work Abilities
We need a Drupal freelancer to maintain our company website.
Initially there may be 50-100 hours of work, followed by some ongoing work each month.
Rate: 1000 - 1300 per hour
Responsibilities:
Frontend and Backend Development
Creating custom themes and modules
Troubleshooting Drupal website issues.
Experience:
Strong understanding of Drupal
Experience with PHP, HTML, CSS, JavaScript & MySQL
Knowledge of DXPR Builder
Job Title: BASE24 EPS Tester / Developer
Location: Bangalore
Work Type: Permanent – Full Time
Role Overview:
We are seeking an experienced technical engineer to support the migration from the existing transaction switch to BASE24 EPS. The role involves development, configuration, integration, and testing of BASE24 EPS modules, ensuring a smooth transition, high-quality delivery, and collaboration within cross-functional teams.
Key Responsibilities:
Develop, test, and deploy solutions including CSMs on BASE24 EPS
Configure BASE24 EPS modules, terminals, ATMs, host systems, and interfaces
Prepare migration scripts and interchange configurations (VISA, MasterCard, AMEX, JCB)
Manage scheme mandates and updates
Configure security keys and integrate EPS with HSMs
Generate MIS and required business reports
Skills & Experience:
5+ years in transaction switch systems
Strong hands-on experience in BASE24 EPS, scripting, SDK
Knowledge of ISO 8583, XML, scheme formats, and certifications (see the sketch after this list)
Strong C++/Java programming and debugging skills
Experience with Linux, VMware; cloud knowledge (Azure/AWS) preferred
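To make the ISO 8583 requirement above concrete, here is a minimal, hedged Python sketch (Python is used purely for illustration) that reads the MTI and primary bitmap of a hypothetical ASCII-encoded message. Real BASE24 EPS interchange formats vary by scheme and encoding, so treat this as a conceptual sketch rather than production code.

```python
def parse_header(message: str) -> tuple[str, set[int]]:
    """Return the MTI and the data elements flagged in the primary bitmap."""
    mti = message[:4]            # e.g. "0200" = financial transaction request
    bitmap_hex = message[4:20]   # assumed: 16 hex characters = 64 bits
    bits = bin(int(bitmap_hex, 16))[2:].zfill(64)
    # Bit i (1-indexed, MSB first) set => data element i is present;
    # bit 1 would indicate that a secondary bitmap follows.
    return mti, {i + 1 for i, bit in enumerate(bits) if bit == "1"}

if __name__ == "__main__":
    # Hypothetical header: MTI 0200 followed by a primary bitmap.
    mti, fields = parse_header("0200" + "7234054128C28805")
    print(mti, sorted(fields))   # e.g. field 2 = PAN, 3 = processing code, 4 = amount
```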
Education:
Bachelor’s or Master’s in Engineering/Science or equivalent
Relevant certifications preferred
Values:
Ethical, bold, professional
Respectful, customer-focused
Strong communication and time management skills
Location: Andheri, Mumbai
Work Mode: Hybrid
Working Days: Monday to Friday (5 Days Working)
About the Role
We are looking for a Junior Technical Recruiter / Fresher who is passionate about building a career in IT recruitment. If you are eager to learn, have strong communication skills, and want to grow in a fast-paced hiring environment, this role is for you.
Key Responsibilities
- Source and screen candidates for various IT roles using job portals, social media, and networking.
- Understand job requirements and match relevant profiles.
- Conduct initial HR screening and coordinate interviews.
- Maintain candidate database and ensure timely follow-ups.
- Support the senior recruitment team in daily hiring activities.
What We’re Looking For
- Freshers or candidates with 0–1 year experience in recruitment.
- Strong interest in IT hiring and willingness to learn new technologies.
- Good communication and interpersonal skills.
- Basic knowledge of hiring tools (Naukri, LinkedIn, etc.) is a plus.
- Immediate joiners preferred.
Why Join Us?
- Structured training and mentorship.
- Fast growth opportunities in IT recruitment.
- Friendly work culture with hybrid flexibility.
- Opportunity to work on niche and emerging tech roles.

Role Overview
We are looking for a highly skilled and intellectually curious Senior Data Scientist with 7+ years of experience in applying advanced machine learning and AI techniques to solve complex business problems. The ideal candidate will have deep expertise in Classical Machine Learning, Deep Learning, Natural Language Processing (NLP), and Generative AI (GenAI), along with strong hands-on coding skills and a proven track record of delivering impactful data science solutions. This role requires a blend of technical excellence, business acumen, and collaborative mindset.
Key Responsibilities
- Design, develop, and deploy ML models using classical algorithms (e.g., regression, decision trees, ensemble methods) and deep learning architectures (CNNs, RNNs, Transformers); a minimal classical-ML sketch follows this list.
- Build NLP solutions for tasks such as text classification, entity recognition, summarization, and conversational AI.
- Develop and fine-tune GenAI models for use cases like content generation, code synthesis, and personalization.
- Architect and implement Retrieval-Augmented Generation (RAG) systems for enhanced contextual AI applications.
- Collaborate with data engineers to build scalable data pipelines and feature stores.
- Perform advanced feature engineering and selection to improve model accuracy and robustness.
- Work with large-scale structured and unstructured datasets using distributed computing frameworks.
- Translate business problems into data science solutions and communicate findings to stakeholders.
- Present insights and recommendations through compelling storytelling and visualization.
- Mentor junior data scientists and contribute to internal knowledge sharing and innovation.
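As a small illustration of the classical-ML portion of these responsibilities, the sketch below trains an ensemble model on scikit-learn's bundled breast-cancer dataset so it runs as-is. The model choice, dataset, and metric are illustrative assumptions, not project specifics.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Load a bundled dataset purely so the example is self-contained.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# An ensemble method, per the "classical algorithms" bullet above.
model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"Hold-out ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```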
Required Qualifications
- 7+ years of experience in data science, machine learning, and AI.
- Strong academic background in Computer Science, Statistics, Mathematics, or related field (Master’s or PhD preferred).
- Proficiency in Python, SQL, and ML libraries (scikit-learn, TensorFlow, PyTorch, Hugging Face).
- Experience with NLP and GenAI tools (e.g., Azure AI Foundry, Azure AI Studio, GPT, LLaMA, LangChain).
- Hands-on experience with Retrieval-Augmented Generation (RAG) systems and vector databases (a minimal retrieve-then-prompt sketch follows this list).
- Familiarity with cloud platforms (Azure preferred, AWS/GCP acceptable) and MLOps tools (MLflow, Airflow, Kubeflow).
- Solid understanding of data structures, algorithms, and software engineering principles.
- Experience with Azure, Azure Copilot Studio, and Azure Cognitive Services.
- Experience with Azure AI Foundry would be a strong added advantage.
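The RAG qualification above boils down to a retrieve-then-prompt flow. The sketch below keeps everything in plain Python: the bag-of-words "embedding" and in-memory index are stand-ins (in practice an embedding model and a vector database would be used, and the final LLM call is omitted), but the retrieval and prompt-assembly logic is the part the sketch is meant to show.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder embedding: a bag-of-words vector. Swap in a real embedding model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available 24/7 through the customer portal.",
]
index = [(doc, embed(doc)) for doc in documents]   # stand-in for a vector database

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed(query)
    return [doc for doc, vec in sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)[:k]]

query = "What is the refund policy for returns?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
# The assembled prompt would now go to an LLM (e.g. an Azure OpenAI deployment);
# that call is omitted to keep the sketch self-contained.
print(prompt)
```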
Preferred Skills
- Exposure to LLM fine-tuning, prompt engineering, and GenAI safety frameworks.
- Experience in domains such as finance, healthcare, retail, or enterprise SaaS.
- Contributions to open-source projects, publications, or patents in AI/ML.
Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and stakeholder engagement abilities.
- Ability to work independently and collaboratively in cross-functional teams.
- Passion for continuous learning and innovation.
Remote opening
Minimum experience: 3.5 years
What you’ll do:
You will work as a senior software engineer in the healthcare domain, focusing on module-level integration and collaboration across other areas of projects and helping healthcare organizations achieve their business goals through full-stack technologies, cloud services, and DevOps. You will work with architects from other specialties, such as cloud engineering, data engineering, and ML engineering, to create platforms, solutions, and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product, and AI marketplaces.
Role & Responsibilities:
We are looking for a Full Stack Developer who is motivated to combine the art of design with programming. Responsibilities include translating UI/UX design wireframes into code that produces the visual elements of the application. You will work with the UI/UX designer to bridge the gap between graphical design and technical implementation, taking an active role on both sides and defining how the application looks as well as how it works.
• Develop new user-facing features
• Build reusable code and libraries for future use
• Ensure the technical feasibility of UI/UX designs
• Optimize application for maximum speed and scalability
• Assure that all user input is validated before submitting to back-end
• Collaborate with other team members and stakeholders
• Provide stable technical solutions that are robust and scalable as per business needs
Skills expectation:
• Must have
o Frontend:
Proficient understanding of web markup, including HTML5, CSS3
Basic understanding of server-side CSS pre-processing platforms, such as LESS and SASS
Proficient understanding of client-side scripting and JavaScript frameworks, including jQuery
Good understanding of at least one of the advanced JavaScript libraries and frameworks such as AngularJS, KnockoutJS, BackboneJS, ReactJS etc.
Familiarity with one or more modern front-end frameworks such as Angular 15+, React, VueJS, Backbone.
Good understanding of asynchronous request handling, partial page updates, and AJAX.
Proficient understanding of cross-browser compatibility issues and ways to work around them.
Experience with generic Angular testing frameworks
Experience Level
10+ years of experience in data engineering, with at least 3–5 years providing architectural guidance, leading teams, and standardizing enterprise data solutions. Must have deep expertise in Databricks, GCP, and modern data architecture patterns.
Key Responsibilities
- Provide architectural guidance and define standards for data engineering implementations.
- Lead and mentor a team of data engineers, fostering best practices in design, development, and operations.
- Own and drive improvements in performance, scalability, and reliability of data pipelines and platforms.
- Standardize data architecture patterns and reusable frameworks across multiple projects.
- Collaborate with cross-functional stakeholders (Product, Analytics, Business) to align data solutions with organizational goals.
- Design data models, schemas, and dataflows for efficient storage, querying, and analytics.
- Establish and enforce strong data governance practices, ensuring security, compliance, and data quality.
- Work closely with governance teams to implement lineage, cataloging, and access control in compliance with standards.
- Design and optimize ETL pipelines using Databricks, PySpark, and SQL (see the sketch after this list).
- Ensure robust CI/CD practices are implemented for data workflows, leveraging Terraform and modern DevOps practices.
- Leverage GCP services such as Cloud Functions, Cloud Run, BigQuery, Pub/Sub, and Dataflow for building scalable solutions.
- Evaluate and adopt emerging technologies, with exposure to Gen AI and advanced analytics capabilities.
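To ground the Databricks/PySpark responsibility above, here is a hedged sketch of a simple extract-transform-load step into a Delta table. Paths, column names, and the source schema are illustrative assumptions; on Databricks the SparkSession and Delta Lake support are already available, while running elsewhere would additionally require the delta-spark package.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: JSON landing files (placeholder path).
raw = spark.read.json("/mnt/landing/orders/")

# Transform: basic cleansing plus a derived partition column.
orders = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total") > 0)
       .withColumn("order_date", F.to_date("order_timestamp"))
)

# Load: append to a Delta table partitioned by date (placeholder path).
(orders.write
       .format("delta")
       .mode("append")
       .partitionBy("order_date")
       .save("/mnt/curated/orders"))
```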
Qualifications & Skills
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Extensive hands-on experience with Databricks (Autoloader, DLT, Delta Lake, CDF) and PySpark.
- Expertise in SQL and advanced query optimization.
- Proficiency in Python for data engineering and automation tasks.
- Strong expertise with GCP services: Cloud Functions, Cloud Run, BigQuery, Pub/Sub, Dataflow, GCS.
- Deep understanding of CI/CD pipelines, infrastructure-as-code (Terraform), and DevOps practices.
- Proven ability to provide architectural guidance and lead technical teams.
- Experience designing data models, schemas, and governance frameworks.
- Knowledge of Gen AI concepts and ability to evaluate practical applications.
- Excellent communication, leadership, and stakeholder management skills.
Hybrid: 3 days in office
Maximum 27 LPA
Must-Have Skills:
5–10 years of experience in Data Engineering or Master Data Management.
5+ years hands-on experience with Informatica MDM (Multi-Domain Edition).
Strong understanding of MDM concepts: golden record, hierarchy management, trust/survivorship, data governance.
Proficient in:
Informatica MDM Hub Console, Provisioning Tool, Services Integration Framework (SIF).
ActiveVOS workflows, user exits (Java), and match/merge tuning.
SQL, PL/SQL, and data modeling for MDM.
Experience integrating MDM with upstream and downstream systems (ERP, CRM, Data Lake, etc.).
Knowledge of data quality integration using Informatica Data Quality (IDQ).
Key Responsibilities:
● Configure and implement Informatica MDM Hub, including subject area models, base objects, landing tables, and relationships.
● Develop and fine-tune match & merge rules, trust scores, and survivorship logic for creating golden records.
● Design and build ActiveVOS workflows for data stewardship, exception handling, and business process approvals.
● Collaborate with data stewards and business teams to define data standards, ownership models, and governance rules.
● Integrate data from various source systems via batch processing, REST APIs, or message queues.
● Set up and maintain data quality checks and validations (in conjunction with Informatica Data Quality (IDQ)) to ensure completeness, accuracy, and consistency.
● Build and customize Informatica MDM User Exits (Java), SIF APIs, and business entity services as needed.
● Support MDM data loads, synchronization jobs, batch group configurations, and performance tuning.
● Work with cross-functional teams to ensure alignment with overall data architecture and governance standards.
● Participate in Agile ceremonies, sprint planning, and documentation of technical designs and user guides.
Nice-to-Have Skills:
● Experience with Informatica EDC and Axon for metadata and governance integration.
● Exposure to cloud deployments of Informatica MDM (on GCP, Azure, or AWS).
● Familiarity with data stewardship concepts, data lineage, and compliance frameworks (GDPR, HIPAA, etc.).
● Basic knowledge of DevOps tools for MDM deployments (e.g., Git, Jenkins).
Must-have skills:
1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP: building pipelines (Python/Java), scripting, best practices, and common challenges (see the sketch after this list)
3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs. NoSQL trade-offs; types of NoSQL databases (at least 2)
5. Data warehouse concepts: beginner to intermediate level
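As a sketch of skill 2 above, the Apache Beam pipeline below reads CSV lines from GCS and writes rows to BigQuery in batch. The bucket, table, and schema are placeholder assumptions; running it on Dataflow would additionally need --runner=DataflowRunner plus project, region, and temp_location options.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def to_row(line: str) -> dict:
    # Assumed CSV layout: user_id,event,event_ts
    user_id, event, event_ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": event_ts}

with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(to_row)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event:STRING,event_ts:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```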
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables.
● Identify and implement data transformation/cleansing requirements.
● Develop highly scalable, reliable, and high-performance data processing pipelines to extract, transform, and load data from various systems to the Enterprise Data Warehouse.
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform into reporting and analytics.
● Define and document the use of BI through user experience/use cases and prototypes, and test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
Similar companies
About the company
OpsTree Global is a digital transformation and platform engineering partner that helps organizations build scalable, secure, and high-impact technology foundations. With expertise across cloud modernization, Data & AI, Observability & SRE, DevSecOps, security, quality engineering, and end-to-end software delivery, OpsTree enables faster, outcome-driven digital transformation.
As an AWS Advanced Tier Services Partner and App Modernization specialist, OpsTree blends cloud-native practices with AI-driven innovation to deliver resilient, high-performing platforms. Its in-house DevSecOps platform, BuildPiper, helps enterprises standardize and accelerate software delivery at scale.
Trusted by 250+ organizations—from startups to Fortune 100 enterprises—OpsTree is known for making software delivery lean, nimble, and highly productive. Driven by a culture of continuous learning, strong ethics, and thought leadership, OpsTree fosters a transparent and growth-oriented environment that empowers teams to build the next generation of cloud-native solutions.
About the company
At Deqode, our purpose is to help businesses solve complex problems using new-age technologies. We provide enterprise blockchain solutions to businesses.
About the company
Founded in 2011, CallHub is an easy-to-use, award-winning voice broadcasting, phone banking, and SMS broadcast software that seamlessly integrates with users' existing systems. It is used across 200 countries and in world-renowned campaigns run by Uber, Greenpeace, Harvard University, Princeton University, Change.org, Save the Children, United Workers Union, and many others.
CallHub, headquartered in Washington DC, is a leading digital organizing platform that helps political campaigns, nonprofits, advocacy groups, and businesses seamlessly interact with their audiences worldwide.
The product
Their call center software enables unlimited agents to connect with people over a call and have personalized one-on-one conversations. The range of automated dialers (Power, Preview, Predictive, and Fast Click dialers) gives clients complete control over their dialing speed based on their needs while maintaining compliance. With features like answering machine detection, voicemail drop, phone number verification, and many more, the software ensures users spend maximum time having quality conversations. CallHub's mass texting solution lets users connect with people over SMS through Bulk SMS, MMS, SMS Opt-in, and peer-to-peer texting.
About the company
A 20-year-old trusted EdTech company, Jupsoft Technologies Pvt Ltd has 1000+ clients across India and offers an AWS-hosted application (LMS + ERP + Digital Content) for schools, colleges, and coaching centers that automates all academic and non-academic operations while ensuring transparency across all departments. Best EdTech solution for better communication and ROI.
About the company
Welcome to Neogencode Technologies, an IT services and consulting firm that provides innovative solutions to help businesses achieve their goals. Our team of experienced professionals is committed to providing tailored services to meet the specific needs of each client. Our comprehensive range of services includes software development, web design and development, mobile app development, cloud computing, cybersecurity, digital marketing, and skilled resource acquisition. We specialize in helping our clients find the right skilled resources to meet their unique business needs. At Neogencode Technologies, we prioritize communication and collaboration with our clients, striving to understand their unique challenges and provide customized solutions that exceed their expectations. We value long-term partnerships with our clients and are committed to delivering exceptional service at every stage of the engagement. Whether you are a small business looking to improve your processes or a large enterprise seeking to stay ahead of the competition, Neogencode Technologies has the expertise and experience to help you succeed. Contact us today to learn more about how we can support your business growth and provide skilled resources to meet your business needs.







