
Cymetrix Software

https://cymetrixsoft.com/
Founded: 2016
Type: Services
Size: 100-1000
Stage: Profitable

About

Cymetrix is a global CRM and Data Analytics consulting company. It has expertise across industries such as manufacturing, retail, BFSI, NPS, Pharma, and Healthcare, and has successfully implemented CRM and related business-process integrations for more than 50 clients.


Catalyzing Tangible Growth: Our pivotal role involves facilitating and driving actual growth for clients. We're committed to becoming a catalyst for dynamic transformation within the business landscape.


Niche focus, limitless growth: Cymetrix specializes in CRM, Data, and AI-powered technologies, offering tailored solutions and profound insights. This focused approach paves the way for exponential growth opportunities for clients.


A Digital Transformation Partner: Cymetrix aims to deliver the necessary support, expertise, and solutions that drive businesses to innovate with unwavering assurance. Our commitment fosters a culture of continuous improvement and growth, ensuring your innovation journey is successful.


The Cymetrix Software team is led by agile, entrepreneurial, veteran technology experts who are devoted to augmenting the value of the solutions they deliver.


Our certified team of 150+ consultants excels in Salesforce products. Our experience in designing and developing products and IPs on the Salesforce platform enables us to deliver industry-specific, customized solutions with intuitive user interfaces.




Jobs at Cymetrix Software

Posted by Netra Shettigar
Remote only
4 - 7 yrs
₹10L - ₹21L / yr
AngularJS (1.x)
Angular (2+)
JavaScript
HTML/CSS

Remote opening

Minimum 3.5 years of experience

What you’ll do:

You will work as a senior software engineer in the healthcare domain, focusing on module-level integration and collaboration across other areas of projects, and helping healthcare organizations achieve their business goals through full-stack technologies, cloud services, and DevOps. You will work with architects from other specialties, such as cloud engineering, data engineering, and ML engineering, to create platforms, solutions, and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product, and AI marketplaces.


Role & Responsibilities:

We are looking for a Full Stack Developer who is motivated to combine the art of design with programming. Responsibilities include translating UI/UX design wireframes into code that produces the application's visual elements. You will work with the UI/UX designer and bridge the gap between graphical design and technical implementation, taking an active role on both sides and defining how the application looks as well as how it works.

• Develop new user-facing features

• Build reusable code and libraries for future use

• Ensure the technical feasibility of UI/UX designs

• Optimize application for maximum speed and scalability

• Assure that all user input is validated before submitting to back-end

• Collaborate with other team members and stakeholders

• Provide stable technical solutions that are robust and scalable as per business needs


Skills expectation:

• Must have

Frontend:

• Proficient understanding of web markup, including HTML5 and CSS3

• Basic understanding of server-side CSS pre-processing platforms, such as LESS and SASS

• Proficient understanding of client-side scripting and JavaScript frameworks, including jQuery

• Good understanding of at least one advanced JavaScript library or framework, such as AngularJS, KnockoutJS, BackboneJS, or ReactJS

• Familiarity with one or more modern front-end frameworks such as Angular 15+, React, VueJS, or Backbone

• Good understanding of asynchronous request handling, partial page updates, and AJAX

• Proficient understanding of cross-browser compatibility issues and ways to work around them

• Experience with generic Angular testing frameworks

Posted by Netra Shettigar
Remote only
9 - 15 yrs
₹15L - ₹36L / yr
Google Cloud Platform (GCP)
Databricks
Architecture
BigQuery
Google Cloud Storage

Experience Level

10+ years of experience in data engineering, with at least 3–5 years providing architectural guidance, leading teams, and standardizing enterprise data solutions. Must have deep expertise in Databricks, GCP, and modern data architecture patterns.


Key Responsibilities

- Provide architectural guidance and define standards for data engineering implementations.

- Lead and mentor a team of data engineers, fostering best practices in design, development, and operations.

- Own and drive improvements in performance, scalability, and reliability of data pipelines and platforms.

- Standardize data architecture patterns and reusable frameworks across multiple projects.

- Collaborate with cross-functional stakeholders (Product, Analytics, Business) to align data solutions with organizational goals.

- Design data models, schemas, and dataflows for efficient storage, querying, and analytics.

- Establish and enforce strong data governance practices, ensuring security, compliance, and data quality.

- Work closely with governance teams to implement lineage, cataloging, and access control in compliance with standards.

- Design and optimize ETL pipelines using Databricks, PySpark, and SQL.

- Ensure robust CI/CD practices are implemented for data workflows, leveraging Terraform and modern DevOps practices.

- Leverage GCP services such as Cloud Functions, Cloud Run, BigQuery, Pub/Sub, and Dataflow for building scalable solutions.

- Evaluate and adopt emerging technologies, with exposure to Gen AI and advanced analytics capabilities.
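The pipeline design and standardization themes above can be illustrated with a minimal, framework-agnostic sketch. This is a pure-Python stand-in with hypothetical record shapes; a real implementation at this level would use Databricks/PySpark rather than generators:

```python
# Minimal extract-transform-load pipeline expressed as composable steps,
# illustrating the reusable-framework idea. All names are hypothetical.

def extract(rows):
    """Pretend source read: yield raw records."""
    yield from rows

def transform(records):
    """Standardize and filter records (a simple cleansing step)."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # drop incomplete rows (a basic data quality rule)
        yield {"id": rec["id"], "amount": round(float(rec["amount"]), 2)}

def load(records, target):
    """Pretend warehouse write: append to an in-memory target."""
    target.extend(records)
    return target

raw = [{"id": 1, "amount": "10.456"}, {"id": 2, "amount": None}, {"id": 3, "amount": "7"}]
warehouse = load(transform(extract(raw)), [])
print(warehouse)
```

Keeping each stage a small, testable function is what makes a pipeline pattern reusable across projects, which is the standardization goal this role describes.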


Qualifications & Skills

- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.

- Extensive hands-on experience with Databricks (Autoloader, DLT, Delta Lake, CDF) and PySpark.

- Expertise in SQL and advanced query optimization.

- Proficiency in Python for data engineering and automation tasks.

- Strong expertise with GCP services: Cloud Functions, Cloud Run, BigQuery, Pub/Sub, Dataflow, GCS.

- Deep understanding of CI/CD pipelines, infrastructure-as-code (Terraform), and DevOps practices.

- Proven ability to provide architectural guidance and lead technical teams.

- Experience designing data models, schemas, and governance frameworks.

- Knowledge of Gen AI concepts and ability to evaluate practical applications.

- Excellent communication, leadership, and stakeholder management skills.



Posted by Netra Shettigar
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹27L / yr
MDM
Informatica MDM
Services Integration Framework (SIF)
Data Lake
Informatica Data Quality

Hybrid: 3 days in office

Max ₹27 LPA


Must-Have Skills:

5–10 years of experience in Data Engineering or Master Data Management.

5+ years hands-on experience with Informatica MDM (Multi-Domain Edition).

Strong understanding of MDM concepts: golden record, hierarchy management, trust/survivorship, data governance.

Proficient in:

Informatica MDM Hub Console, Provisioning Tool, Services Integration Framework (SIF).

ActiveVOS workflows, user exits (Java), and match/merge tuning.

SQL, PL/SQL, and data modeling for MDM.

Experience integrating MDM with upstream and downstream systems (ERP, CRM, Data Lake, etc.).

Knowledge of data quality integration using Informatica Data Quality (IDQ).


Key Responsibilities:

● Configure and implement Informatica MDM Hub, including subject area models, base objects, landing tables, and relationships.

● Develop and fine-tune match & merge rules, trust scores, and survivorship logic for creating golden records.

● Design and build ActiveVOS workflows for data stewardship, exception handling, and business process approvals.

● Collaborate with data stewards and business teams to define data standards, ownership models, and governance rules.

● Integrate data from various source systems via batch processing, REST APIs, or message queues.

● Set up and maintain data quality checks and validations (in conjunction with Informatica Data Quality (IDQ)) to ensure completeness, accuracy, and consistency.

● Build and customize Informatica MDM User Exits (Java), SIF APIs, and business entity services as needed.

● Support MDM data loads, synchronization jobs, batch group configurations, and performance tuning.

● Work with cross-functional teams to ensure alignment with overall data architecture and governance standards.

● Participate in Agile ceremonies, sprint planning, and documentation of technical designs and user guides.
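As context for the match/merge and survivorship responsibilities above, here is a minimal sketch of trust-based survivorship in plain Python. The source systems, trust scores, and field names are hypothetical, and this is only the idea behind golden-record construction, not Informatica MDM's actual implementation:

```python
# Minimal illustration of trust-based survivorship for a golden record.
# Source systems, trust scores, and field names are all hypothetical.

# Trust score per source system (higher wins per attribute).
TRUST = {"CRM": 90, "ERP": 80, "LEGACY": 40}

def build_golden_record(matched_records):
    """Pick, for each attribute, the non-empty value from the most trusted source."""
    golden = {}
    for record in matched_records:
        trust = TRUST.get(record["source"], 0)
        for field, value in record.items():
            if field == "source" or value in (None, ""):
                continue  # survivorship ignores empty values
            best = golden.get(field)
            if best is None or trust > best[1]:
                golden[field] = (value, trust)
    # Drop the bookkeeping trust scores before returning.
    return {field: value for field, (value, _) in golden.items()}

records = [
    {"source": "LEGACY", "name": "ACME Corp", "phone": "000-0000", "city": "Mumbai"},
    {"source": "CRM", "name": "Acme Corporation", "phone": "555-1234", "city": ""},
    {"source": "ERP", "name": "Acme Corp.", "phone": "", "city": "Pune"},
]

print(build_golden_record(records))
```

Note how the golden record can combine attributes from different sources: the most trusted system wins per field, not per record, which is why tuning trust and match rules matters.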



Nice-to-Have Skills:

● Experience with Informatica EDC and Axon for metadata and governance integration.

● Exposure to cloud deployments of Informatica MDM (on GCP, Azure, or AWS).


● Familiarity with data stewardship concepts, data lineage, and compliance frameworks (GDPR, HIPAA, etc.).

● Basic knowledge of DevOps tools for MDM deployments (e.g., Git, Jenkins).

Posted by Netra Shettigar
Remote only
3 - 7 yrs
₹8L - ₹20L / yr
Google Cloud Platform (GCP)
ETL
Python
Big Data
SQL

Must have skills:

1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP Cloud: build pipelines (Python/Java) + scripting, best practices, challenges

3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP

4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs NoSQL; types of NoSQL DB (at least 2 databases)

5. Data warehouse concepts: beginner to intermediate level


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Design and implement Dimensional and Fact tables.

● Identify and implement data transformation/cleansing requirements.

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform, and load data from various systems into the Enterprise Data Warehouse.

● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.

● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform into reporting & analytics.

● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.

● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.
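The dimensional and fact table design mentioned above can be sketched in miniature. This toy example (hypothetical column names, in-memory structures standing in for warehouse tables) shows the core move: assigning surrogate keys in a dimension and referencing them from a fact table:

```python
# Tiny sketch of loading a star schema: a customer dimension with surrogate
# keys, and a sales fact table that references it. Names are hypothetical.

def load_star_schema(transactions):
    dim_customer = {}   # natural key -> surrogate key
    fact_sales = []
    for txn in transactions:
        natural_key = txn["customer_id"]
        if natural_key not in dim_customer:
            dim_customer[natural_key] = len(dim_customer) + 1  # next surrogate key
        fact_sales.append({
            "customer_sk": dim_customer[natural_key],  # fact rows carry the surrogate key
            "amount": txn["amount"],
        })
    return dim_customer, fact_sales

txns = [
    {"customer_id": "C001", "amount": 120.0},
    {"customer_id": "C002", "amount": 75.5},
    {"customer_id": "C001", "amount": 30.0},
]
dim, fact = load_star_schema(txns)
print(dim)   # {'C001': 1, 'C002': 2}
print(fact)
```

In a real warehouse load the dimension lookup would be a join against the existing dimension table rather than an in-memory dict, but the key-assignment logic is the same.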

Posted by Netra Shettigar
Remote only
4 - 8 yrs
₹12L - ₹20L / yr
Data modeling
Dimensional modeling
Google Cloud Platform (GCP)

Advanced SQL and data modeling skills: designing a Dimensional Layer, 3NF, denormalized views, and a semantic layer; expertise in GCP services.



Role & Responsibilities:

● Design and implement robust semantic layers for data systems on Google Cloud Platform (GCP)

● Develop and maintain complex data models, including dimensional models, 3NF structures, and denormalized views

● Write and optimize advanced SQL queries for data extraction, transformation, and analysis

● Utilize GCP services to create scalable and efficient data architectures

● Collaborate with cross-functional teams to translate business requirements (specified in mapping sheets or legacy DataStage jobs) into effective data models

● Implement and maintain data warehouses and data lakes on GCP

● Design and optimize ETL/ELT processes for large-scale data integration

● Ensure data quality, consistency, and integrity across all data models and semantic layers

● Develop and maintain documentation for data models, semantic layers, and data flows

● Participate in code reviews and implement best practices for data modeling and database design

● Optimize database performance and query execution on GCP

● Provide technical guidance and mentorship to junior team members

● Stay updated with the latest trends and advancements in data modeling, GCP services, and big data technologies

● Collaborate with data scientists and analysts to enable efficient data access and analysis

● Implement data governance and security measures within the semantic layer and data model
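As a toy illustration of the denormalized views this role designs, here is a pure-Python sketch of flattening normalized tables into one wide reporting row. Table and column names are hypothetical; a real semantic layer would express this as a SQL view in BigQuery:

```python
# Sketch of producing a denormalized (flattened) view from normalized tables,
# the kind of wide row a semantic layer exposes to BI tools. Names are hypothetical.

customers = {1: {"name": "Acme", "region": "West"}, 2: {"name": "Globex", "region": "East"}}
orders = [
    {"order_id": 10, "customer_id": 1, "amount": 250.0},
    {"order_id": 11, "customer_id": 2, "amount": 90.0},
]

def denormalized_orders(orders, customers):
    """Join each order with its customer attributes into one wide row."""
    return [
        {**order, **customers[order["customer_id"]]}  # merge order and customer columns
        for order in orders
    ]

for row in denormalized_orders(orders, customers):
    print(row)
```

The trade-off the role weighs is exactly this one: the 3NF tables stay the system of record, while the denormalized view duplicates attributes to make querying simple and fast.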

Posted by Netra Shettigar
Mumbai
2 - 12 yrs
₹1L - ₹18L / yr
Amazon Web Services (AWS)
AWS RDS
Amazon S3
Amazon EC2
AWS Lambda

Mumbai (Malad), work from office

6 days working; 1st & 3rd Saturday off


AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.


Roles and Responsibilities

1. Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.

2. Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.

3. AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).

4. Integration and Microservices: Integrate third-party APIs and services. Develop and manage microservices architecture for modular application development.

5. Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.

6. Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.

7. DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.




8. Agile Development:

Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives.

Deliver tasks within defined timelines while maintaining high quality.


Required Skills

Strong proficiency in Node.js and JavaScript/TypeScript.

Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.

Proficient with AWS services including Lambda, S3, EC2, and API Gateway.

Experience with RESTful API design and GraphQL (optional).

Knowledge of containerization using Docker is a plus.

Strong problem-solving and debugging skills.

Familiarity with tools like Git, Jenkins, and Jira.


Posted by Netra Shettigar

Bengaluru (Bangalore)
3 - 8 yrs
₹9L - ₹15L / yr
Salesforce development
Oracle Application Express (APEX)
Salesforce Lightning
SQL
ETL

1. Software Development Engineer - Salesforce

What we ask for

We are looking for strong engineers to build best-in-class systems for commercial & wholesale banking at Bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing & security model, allowing us to architect and develop robust applications.

You will work closely with business and product teams to build applications that give end users an intuitive, clean, minimalist, easy-to-navigate experience.

Develop systems that are scalable, secure, highly resilient, and low-latency by applying software development principles and clean-code practices.

Should be open to working in a start-up environment and have the confidence to deal with complex issues, keeping focus on solutions and project objectives as your guiding North Star.


Technical Skills:

● Strong hands-on frontend development using JavaScript and LWC

● Expertise in backend development using Apex, Flows, Async Apex

● Understanding of database concepts: SOQL, SOSL, and SQL

● Hands-on experience in API integration using SOAP, REST API, GraphQL

● Experience with ETL tools, data migration, and data governance

● Experience with Apex Design Patterns, Integration Patterns, and the Apex testing framework

● Follow an agile, iterative execution model using CI/CD tools like Azure DevOps, GitLab, Bitbucket

● Should have worked with at least one programming language (Java, Python, C++) and have a good understanding of data structures


Preferred qualifications

● Graduate degree in engineering

● Experience developing with India stack

● Experience in fintech or banking domain

Posted by Netra Shettigar
Noida, Bengaluru (Bangalore), Pune
6 - 9 yrs
₹10L - ₹18L / yr
Windows Azure
SQL Azure
SQL
Data Warehouse (DWH)
Data Analytics

Hybrid work mode


(Azure) EDW: experience loading Star schema data warehouses using framework architectures, including loading Type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).
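For context on the Type 2 dimension loading mentioned above, here is a minimal sketch of SCD Type 2 logic in plain Python: when a tracked attribute changes, the current row is expired and a new current row is inserted. Columns and values are hypothetical; a real load would implement this in Databricks/ADF, typically as a MERGE:

```python
from datetime import date

# Minimal Type 2 slowly changing dimension logic: expire the current row on
# change and insert a new current version. Columns are hypothetical.

def apply_scd2(dimension, incoming, today):
    """dimension rows carry customer_id, city, valid_from, valid_to, is_current."""
    current = {row["customer_id"]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        row = current.get(rec["customer_id"])
        if row is None:
            # brand-new member: insert as the first current version
            dimension.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif row["city"] != rec["city"]:
            row["valid_to"] = today       # expire the old version
            row["is_current"] = False
            dimension.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged records are left as-is
    return dimension

dim = [{"customer_id": "C1", "city": "Mumbai", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": "C1", "city": "Pune"}], date(2024, 6, 1))
for row in dim:
    print(row)
```

The point of Type 2 is that both versions survive, so history-aware queries can join facts to the version that was current at the time of the transaction.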

Posted by Netra Shettigar

Mumbai
4 - 8 yrs
₹8L - ₹16L / yr
C#
.NET
.NET Compact Framework
SQL
Microsoft Windows Azure

Key Responsibilities:

● Design, develop, and maintain scalable web applications using .NET Core, .NET Framework, C#, and related technologies.

● Participate in all phases of the SDLC, including requirements gathering, architecture design, coding, testing, deployment, and support.

● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and modern front-end technologies such as Angular, React, and JavaScript.

● Conduct thorough code reviews, write unit tests, and ensure adherence to coding standards and best practices.

● Lead or support .NET Framework to .NET Core migration initiatives, ensuring minimal disruption and optimal performance.

● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitLab CI/CD.

● Containerize applications using Docker and deploy/manage them on orchestration platforms like Kubernetes or GKE.

● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL.

● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and ongoing maintenance.

● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud Run, and Dataflow to build and maintain cloud-native solutions.

● Handle schema conversion and data transformation tasks as part of migration and modernization efforts.


Required Skills & Experience:

● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.

● Proven experience in application modernization and cloud-native development.

● Strong knowledge of containerization (Docker) and orchestration tools like Kubernetes/GKE.

● Expertise in implementing and managing CI/CD pipelines.

● Solid understanding of relational databases and experience in SQL Server to PostgreSQL migrations.

● Familiarity with cloud infrastructure, especially GCP services relevant to application hosting and data processing.

● Excellent problem-solving, communication,

Posted by Netra Shettigar

Bengaluru (Bangalore), Chennai
5 - 8 yrs
₹10L - ₹28L / yr
Data modeling
OLAP
OLTP
BigQuery
Google Cloud Platform (GCP)

Bangalore / Chennai

  • Hands-on data modelling for OLTP and OLAP systems
  • In-depth knowledge of Conceptual, Logical and Physical data modelling
  • Strong understanding of Indexing, partitioning, data sharding, with practical experience of having done the same
  • Strong understanding of variables impacting database performance for near-real-time reporting and application interaction.
  • Should have working experience on at least one data modelling tool, preferably DBSchema, Erwin
  • Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
  • People with functional knowledge of the mutual fund industry will be a plus


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Ability to design and implement Dimensional and Fact tables

● Identify and implement data transformation/cleansing requirements

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse

● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions

● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information, data from ERP, CRM, HRIS applications to model, extract and transform into reporting & analytics.

● Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.

● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.


Required Skills:

● Bachelor’s degree in Computer Science or similar field or equivalent work experience.

● 5+ years of experience on Data Warehousing, Data Engineering or Data Integration projects.

● Expert with data warehousing concepts, strategies, and tools.

● Strong SQL background.

● Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.

● Strong experience in GCP & Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Function and GCS

● Good to have knowledge on SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).

● Knowledge of AWS and Azure Cloud is a plus.

● Experience in Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.

● Experience in integration using APIs, XML, JSON, etc.


