

Cymetrix Software
https://cymetrixsoft.com/About
Cymetrix is a global CRM and Data Analytics consulting company with expertise across industries such as manufacturing, retail, BFSI, NPS, Pharma, and Healthcare. It has successfully implemented CRM and related business process integrations for more than 50 clients.
Catalyzing Tangible Growth: Our pivotal role involves facilitating and driving actual growth for clients. We're committed to becoming a catalyst for dynamic transformation within the business landscape.
Niche focus, limitless growth: Cymetrix specializes in CRM, Data, and AI-powered technologies, offering tailored solutions and profound insights. This focused approach paves the way for exponential growth opportunities for clients.
A Digital Transformation Partner: Cymetrix aims to deliver the necessary support, expertise, and solutions that drive businesses to innovate with unwavering assurance. Our commitment fosters a culture of continuous improvement and growth, ensuring your innovation journey is successful.
The Cymetrix Software team is led by agile, entrepreneurial, veteran technology experts devoted to maximizing the value of the solutions they deliver.
Our certified team of 150+ consultants excels in Salesforce products. Our experience designing and developing products and IP on the Salesforce platform enables us to build industry-specific, customized solutions with intuitive user interfaces.
Jobs at Cymetrix Software
Location: Malad, Mumbai (work from office)
6-day work week; 1st and 3rd Saturdays off
AWS Expertise: Minimum 2 years of experience working with AWS services like RDS, S3, EC2, and Lambda.
Roles and Responsibilities
1. Backend Development: Develop scalable and high-performance APIs and backend systems using Node.js. Write clean, modular, and reusable code following best practices. Debug, test, and optimize backend services for performance and scalability.
2. Database Management: Design and maintain relational databases using MySQL, PostgreSQL, or AWS RDS. Optimize database queries and ensure data integrity. Implement data backup and recovery plans.
3. AWS Cloud Services: Deploy, manage, and monitor applications using AWS infrastructure. Work with AWS services including RDS, S3, EC2, Lambda, API Gateway, and CloudWatch. Implement security best practices for AWS environments (IAM policies, encryption, etc.).
4. Integration and Microservices: Integrate third-party APIs and services. Develop and manage a microservices architecture for modular application development.
5. Version Control and Collaboration: Use Git for code versioning and maintain repositories. Collaborate with front-end developers and project managers for end-to-end project delivery.
6. Troubleshooting and Debugging: Analyze and resolve technical issues and bugs. Provide maintenance and support for existing backend systems.
7. DevOps and CI/CD: Set up and maintain CI/CD pipelines. Automate deployment processes and ensure zero-downtime releases.
8. Agile Development:
Participate in Agile/Scrum ceremonies such as daily stand-ups, sprint planning, and retrospectives.
Deliver tasks within defined timelines while maintaining high quality.
Required Skills
Strong proficiency in Node.js and JavaScript/TypeScript.
Expertise in working with relational databases like MySQL/PostgreSQL and AWS RDS.
Proficient with AWS services including Lambda, S3, EC2, and API Gateway (see the sketch after this list).
Experience with RESTful API design and GraphQL (optional).
Knowledge of containerization using Docker is a plus.
Strong problem-solving and debugging skills.
Familiarity with tools like Git, Jenkins, and Jira.
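The role is Node.js-centric, but the AWS calls above look the same across SDKs. As a hedged, minimal sketch (shown in Python with boto3 purely for illustration; the bucket and function names are hypothetical), generating a presigned S3 URL and invoking a Lambda function looks like this:

import json

import boto3  # AWS SDK for Python; the role itself would use the AWS SDK for JavaScript

# Hypothetical resource names, for illustration only
BUCKET = "example-reports-bucket"
FUNCTION = "example-report-generator"

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Generate a time-limited download link for an object in S3
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": "2024/summary.csv"},
    ExpiresIn=3600,  # link valid for one hour
)

# Invoke a Lambda function synchronously and read its JSON response
response = lambda_client.invoke(
    FunctionName=FUNCTION,
    InvocationType="RequestResponse",
    Payload=json.dumps({"reportId": 42}),
)
result = json.loads(response["Payload"].read())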
1. Software Development Engineer - Salesforce
What we ask for
We are looking for strong engineers to build best-in-class systems for commercial & wholesale banking at the bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing and security model, allowing us to architect and develop robust applications.
You will work closely with business and product teams to build applications that give end users an intuitive, clean, minimalist, easy-to-navigate experience.
Develop systems that are scalable, secure, highly resilient, and low-latency by applying sound software development principles and clean-code practices.
You should be open to working in a start-up environment and confident dealing with complex issues, keeping solutions and project objectives as your guiding North Star.
Technical Skills:
● Strong hands-on frontend development using JavaScript and LWC
● Expertise in backend development using Apex, Flows, and Async Apex
● Understanding of database concepts: SOQL, SOSL, and SQL
● Hands-on experience in API integration using SOAP, REST APIs, and GraphQL (see the sketch after this list)
● Experience with ETL tools, data migration, and data governance
● Experience with Apex design patterns, integration patterns, and the Apex testing framework
● Follow an agile, iterative execution model using CI/CD tools like Azure DevOps, GitLab, and Bitbucket
● Should have worked with at least one programming language (Java, Python, or C++) and have a good understanding of data structures
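As a hedged illustration of the REST integration point above, this minimal Python sketch runs a SOQL query through Salesforce's standard REST API; the instance URL and access token are placeholders that would normally come from an OAuth 2.0 flow.

import requests

# Placeholders: in practice these come from an OAuth 2.0 flow
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "<access-token>"

# Run a SOQL query via the standard Salesforce REST API query endpoint
resp = requests.get(
    f"{INSTANCE_URL}/services/data/v59.0/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
)
resp.raise_for_status()

for record in resp.json()["records"]:
    print(record["Id"], record["Name"])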
Preferred qualifications
● Graduate degree in engineering
● Experience developing with India Stack
● Experience in fintech or banking domain

Hybrid work mode
(Azure) EDW: Experience loading star-schema data warehouses using framework architectures, including loading Type 2 slowly changing dimensions (a sketch follows the key skills below). Experience ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).
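To make the Type 2 dimension requirement concrete, here is a minimal sketch of one common approach on Azure Databricks: a Delta Lake MERGE that closes out changed rows, followed by an append of the new versions. Table and column names are hypothetical, and a production load would add hash-based change detection and handling for late-arriving data.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical tables: edw.dim_customer (SCD2 target) and stg.customer (today's extract)
dim = DeltaTable.forName(spark, "edw.dim_customer")
stg = spark.table("stg.customer")

# Step 1: close out current rows whose tracked attributes changed
(dim.alias("d")
    .merge(stg.alias("s"), "d.customer_id = s.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> s.address",  # change detection (often a hash of all tracked columns)
        set={"is_current": "false", "end_date": "current_date()"},
    )
    .execute())

# Step 2: append new current versions for brand-new keys and the keys just closed above
new_rows = (
    stg.alias("s")
    .join(
        spark.table("edw.dim_customer").filter("is_current = true").alias("d"),
        "customer_id",
        "left_anti",  # keep only keys that have no open row in the dimension
    )
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
)
new_rows.write.format("delta").mode("append").saveAsTable("edw.dim_customer")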


Key Responsibilities:
● Design, develop, and maintain scalable web applications using .NET Core, .NET
Framework, C#, and related technologies.
● Participate in all phases of the SDLC, including requirements gathering, architecture
design, coding, testing, deployment, and support.
● Build and integrate RESTful APIs, and work with SQL Server, Entity Framework, and
modern front-end technologies such as Angular, React, and JavaScript.
● Conduct thorough code reviews, write unit tests, and ensure adherence to coding
standards and best practices.
● Lead or support .NET Framework to .NET Core migration initiatives, ensuring
minimal disruption and optimal performance.
● Implement and manage CI/CD pipelines using tools like Azure DevOps, Jenkins, or
GitLab CI/CD.
● Containerize applications using Docker and deploy/manage them on orchestration
platforms like Kubernetes or GKE.
● Lead and execute database migration projects, particularly transitioning from SQL Server to PostgreSQL (a data-movement sketch follows this list).
● Manage and optimize Cloud SQL for PostgreSQL, including configuration, tuning, and
ongoing maintenance.
● Leverage Google Cloud Platform (GCP) services such as GKE, Cloud SQL, Cloud
Run, and Dataflow to build and maintain cloud-native solutions.
● Handle schema conversion and data transformation tasks as part of migration and
modernization efforts.
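Migration work like the SQL Server to PostgreSQL transition above is often supported by small data-movement scripts alongside the .NET code. The sketch below (Python, with hypothetical connection strings and table names) shows the basic batched copy pattern.

import pyodbc                         # SQL Server driver
import psycopg2                       # PostgreSQL driver
from psycopg2.extras import execute_values

# Hypothetical connection details for source and target
src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=legacy-db;DATABASE=app;UID=etl;PWD=***"
)
dst = psycopg2.connect(host="cloudsql-proxy", dbname="app", user="etl", password="***")

BATCH = 5_000
src_cur = src.cursor()
src_cur.execute("SELECT id, name, created_at FROM dbo.customers")

with dst, dst.cursor() as dst_cur:   # commits on success, rolls back on error
    while True:
        rows = src_cur.fetchmany(BATCH)
        if not rows:
            break
        # execute_values expands the batch into a single multi-row INSERT
        execute_values(
            dst_cur,
            "INSERT INTO customers (id, name, created_at) VALUES %s",
            [tuple(r) for r in rows],
        )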
Required Skills & Experience:
● 5+ years of hands-on experience with C#, .NET Core, and .NET Framework.
● Proven experience in application modernization and cloud-native development.
● Strong knowledge of containerization (Docker) and orchestration tools like
Kubernetes/GKE.
● Expertise in implementing and managing CI/CD pipelines.
● Solid understanding of relational databases and experience in SQL Server to
PostgreSQL migrations.
● Familiarity with cloud infrastructure, especially GCP services relevant to application
hosting and data processing.
● Excellent problem-solving and communication skills.

Bangalore / Chennai
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of Conceptual, Logical and Physical data modelling
- Strong understanding of indexing, partitioning, and data sharding, with practical experience implementing them
- Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction
- Working experience with at least one data modelling tool, preferably DBSchema or Erwin
- Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery
- Functional knowledge of the mutual fund industry is a plus
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimension and fact tables (see the schema sketch after this list)
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional data from ERP, CRM, and HRIS applications to model, extract, and transform it for reporting and analytics.
● Define and document BI usage through user experiences/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; and continuously validate reports and dashboards, suggesting improvements.
● Train business end-users, IT analysts, and developers.
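As an illustration of the dimension and fact table responsibility above, this minimal sketch creates a small star schema in BigQuery from Python; the dataset, columns, and the partitioning and clustering choices are hypothetical.

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical star schema: one dimension plus a fact table partitioned by date
ddl = """
CREATE TABLE IF NOT EXISTS edw.dim_customer (
  customer_key INT64 NOT NULL,   -- surrogate key
  customer_id  STRING NOT NULL,  -- natural/business key
  segment      STRING,
  start_date   DATE,             -- SCD2 validity window
  end_date     DATE,
  is_current   BOOL
);

CREATE TABLE IF NOT EXISTS edw.fact_orders (
  order_date   DATE NOT NULL,
  customer_key INT64 NOT NULL,   -- FK to dim_customer
  order_amount NUMERIC,
  quantity     INT64
)
PARTITION BY order_date          -- prunes scans for date-bounded queries
CLUSTER BY customer_key          -- co-locates rows on the common join/filter key
"""

for statement in ddl.split(";"):
    if statement.strip():
        client.query(statement).result()  # run each DDL statement and wait

Partitioning the fact table by date and clustering on the dimension key are common BigQuery choices for near-real-time reporting workloads, since they limit the bytes scanned by date-bounded queries.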
Required Skills:
● Bachelor’s degree in Computer Science or similar field or equivalent work experience.
● 5+ years of experience in Data Warehousing, Data Engineering, or Data Integration projects.
● Expertise in data warehousing concepts, strategies, and tools.
● Strong SQL background.
● Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.
● Strong experience in GCP: Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions, and GCS.
● Good to have: knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
● Knowledge of AWS and Azure Cloud is a plus.
● Experience with Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
● Experience in integration using APIs, XML, JSON, etc.

1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, BQ optimization, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP Cloud: building pipelines (Python/Java) plus scripting, best practices, challenges
3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP (see the sketch after this list)
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud, SQL vs. NoSQL, types of NoSQL databases (at least 2)
5. Data warehouse concepts: beginner to intermediate level
6. Data modeling, GCP databases, DBSchema (or similar)
7. Hands-on data modelling for OLTP and OLAP systems
8. In-depth knowledge of conceptual, logical, and physical data modelling
9. Strong understanding of indexing, partitioning, and data sharding, with practical experience implementing them
10. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction
11. Working experience with at least one data modelling tool, preferably DBSchema or Erwin
12. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery
13. Functional knowledge of the mutual fund industry is a plus
Should be willing to work from Chennai; office presence is mandatory.
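For item 3, a minimal Apache Beam (Python) sketch of a streaming end-to-end pipeline on GCP: read JSON events from Pub/Sub, parse them, and stream them into BigQuery. The subscription, table, and schema are hypothetical; on GCP the pipeline would typically run on Dataflow with --runner=DataflowRunner.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resources
SUBSCRIPTION = "projects/example-project/subscriptions/orders-sub"
TABLE = "example-project:edw.orders_raw"

options = PipelineOptions(streaming=True)  # add runner/project flags when deploying to Dataflow

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            TABLE,
            schema="order_id:STRING,amount:NUMERIC,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )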
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimension and fact tables
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional data from ERP, CRM, and HRIS applications to model, extract, and transform it for reporting and analytics.
● Define and document BI usage through user experiences/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; and continuously validate reports and dashboards, suggesting improvements.
● Train business end-users, IT analysts, and developers.



Full Stack Developer
Location: Hyderabad
Experience: 7+ Years
Type: BCS - Business Consulting Services
RESPONSIBILITIES:
• Strong programming skills in Node.js (must), React.js, Android, and Kotlin (must)
• Hands-on experience in UI development with a good sense of UX.
• Hands-on experience in database design and management.
• Hands-on experience creating and maintaining backend frameworks for mobile applications.
• Hands-on development experience on cloud-based platforms like GCP/Azure/AWS.
• Ability to manage and provide technical guidance to the team.
• Strong experience in designing APIs using RAML, Swagger, etc.
• Service definition development.
• API standards, security, and policies definition and management.
REQUIRED EXPERIENCE:
* Bachelor’s and/or master's degree in computer science or equivalent work experience
* Excellent analytical, problem solving, and communication skills.
* 7+ years of software engineering experience in a multi-national company
* 6+ years of development experience in Kotlin, Node.js, and React.js
* 3+ years of experience creating solutions in a native public cloud (GCP, AWS, or Azure)
* Experience with Git or similar version control system, continuous integration
* Proficiency in automated unit test development practices and design methodologies
* Fluent English

Proficient in Looker Actions, Looker dashboarding, Looker data entry, LookML, SQL queries, BigQuery, Looker Studio, and GCP.
Remote Working
Shift options: 2:00 pm to 12:00 am IST, or 10:30 am to 7:30 pm IST
Working days: Sunday to Thursday
Responsibilities:
● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.
● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.
● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.
● Write complex SQL queries when necessary to extract and manipulate data from the underlying databases, and optimize those queries for performance.
● Connect Looker to various data sources, including databases, data warehouses, and external APIs.
● Identify and address bottlenecks that affect report and dashboard loading times; optimize Looker performance by tuning queries, caching strategies, and indexing options.
● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.
● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems (see the sketch after this list).
● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.
● Provide training and support to business users, helping them navigate and use Looker effectively.
● Diagnose and resolve technical issues related to Looker, data models, and reports.
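As a sketch of the API-scripting responsibility above: Looker's official looker_sdk Python package can automate tasks such as running a saved Look and pulling its results. The Look ID below is hypothetical, and credentials are read from a looker.ini file or LOOKERSDK_* environment variables.

import looker_sdk

# Credentials come from looker.ini or LOOKERSDK_* environment variables
sdk = looker_sdk.init40()  # Looker API 4.0 client

LOOK_ID = "42"  # hypothetical saved Look

# Run the Look and fetch its results as CSV for downstream use
csv_data = sdk.run_look(look_id=LOOK_ID, result_format="csv")
print(csv_data[:500])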
Skills Required:
● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.
● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)
● Knowledge of data modeling best practices
● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.
● Previous experience in FinOps engagements is a plus.
● Proficiency in ETL processes for data transformation and preparation.
● Ability to create effective data visualizations and reports using Looker’s dashboard tools.
● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.
● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).

Similar companies
About the company
We are a fast-growing virtual and hybrid events and engagement platform. Gevme has already powered hundreds of thousands of events around the world for clients like Facebook, Netflix, Starbucks, Forbes, MasterCard, Citibank, Google, and the Singapore Government.
We are a SaaS product company with a strong engineering and family culture; we are always looking for new ways to enhance the event experience and empower efficient event management. We're on a mission to groom the next generation of event technology thought leaders as we grow.
Join us if you want to become part of a vibrant and fast-moving product company that's on a mission to connect people around the world through events.
About the company
We’ve achieved 300% growth in the cybersecurity field. We’ve partnered with the most impactful organisations - whether it is the largest cybersecurity company or the fastest-growing unicorn. We're building an amazing place to work and we invite you to join us.
Engineering Centric Culture
At Metron, we strive to maintain and cultivate a developer-centric culture. Here’s what this means in action:
Meaningful and ever-evolving work
We’ve built integrations and custom solutions for 200+ security platforms and keep adding more to the list. You will be constantly learning about new platforms, acquiring new skills, and being exposed to cutting-edge security technology and other tools.
Work directly with clients and end-users
We encourage our developers to have a client-facing approach. At Metron, you will be able to demonstrate and discuss your code with the people who will be using it on a regular basis.
A balanced workspace
Everyone has a personal life outside of the office. We do not expect you to work weekends or overtime (and actively encourage you not to do so). We also aim to provide you with all the required perks and benefits to keep you in good health and spirits.
A level hierarchy
There are only 2 roles in the organisation - Development and QA. Every employee will either be writing code or testing for quality. We don’t believe in helicopter managers who hover above you and do not understand your work.
Hybrid work mode
Like to work at the office? We can accommodate you. Prefer to work from home? No problem! Need a combination of both? Whatever model works for you, works for us.
About the company
Geotrackers is a technology company offering end-to-end solutions that help organizations manage their field resources more effectively, be they vehicles, assets, or personnel. We provide GPS-based vehicle tracking solutions and mobile solutions for field force management. Our solutions have gained popularity owing to the multiple benefits they offer, starting with increased productivity of field resources, reduced costs of field operations, better customer service, and better safety and security for man and material. All our solutions are cloud based and offered on the SaaS model; they are therefore easy to deploy and economical to use. Our primary targets are organizations with a sizeable fleet of vehicles or sales and service personnel. Our solutions are used by over 1,000 organizations across industry sectors like logistics, transportation, food processing, mining, and cash logistics, including TATA Steel, ICICI Bank, DTDC, IOCL, and many more. We are headquartered in Delhi, with offices in Pune, Mumbai, Chennai, Kolkata, Indore, Bangalore, Hyderabad, Ahmedabad, Raipur, Kanpur, and Chandigarh.
About the company
Shoppin' - if Pinterest and Google had an AI-powered baby 🍓
Here’s why:
shoppin' isn't just another AI-powered tech platform.
we're crafting a groundbreaking fashion search and shopping platform, one that outshines Google's search capabilities, offers more intuitive recommendations than Pinterest, and engages users more deeply than Instagram's feeds.
Our vision is bold: we're not just building a foundational AI for fashion; we're aiming for the stars with a full-fledged fashion AGI.
Achieving such heights demands a remarkable team. Over the past month, our office has buzzed with energy and ideas, pushing us closer to crafting a user experience that feels as natural as it is innovative.
🚀 We have raised $1M in pre-seed funding and will be raising another round very soon. Our aim is to build the best founding team out there, and we are scaling up our hiring efforts rapidly.
🍓Apply now to become a part of our creative force dedicated to shaping the future of fashion.
About the company
Ven Analytics – derived from the Sanskrit word वेण्, meaning “to know”. We are a media-focused analytics company, and we build applications and dashboards beyond imagination. Our services and products are laser-focused on solving analytics problems for broadcasters, agencies, and advertisers.
Agile Startup
Our agility helps us percolate through the hierarchy and rapidly deploy solutions for the bottommost contributor
Best of Both
A thorough understanding of business and technology sets us apart from our competitors
Young and Vibrant
As a young organization, we are eager to take up new challenges and make sure we deliver them custom-built, as requested
VISION
Our vision is to become the destination AdTech company for the entire M&E sector
MISSION
We are on a perpetual mission to solve big problems and deliver exponential value to our clients
About the company
At DAITA, we’re on a mission to revolutionize the fashion supply chain with intelligent AI solutions. Our focus is simple: automate the repetitive, so fashion brands can focus on what they do best—creativity, strategy, and growth.
We’ve done the groundwork—research across 8 countries and 3 continents, collaborating with everyone from cotton farmers to global retailers. This hands-on experience allows us to build AI tools that solve real-world problems with precision and empathy. And what truly sets us apart? Simplicity. Our solutions work out of the box, with zero learning curve and setup in under 24 hours.
Currently in our MVP stage and backed by pre-seed funding, we’re building something ambitious. Led by founders Jérôme Schmidt (ex-Snapchat AR) and Luis Voss (ex-Enpal), DAITA isn’t just another tech startup—we’re reimagining the world’s second-largest industry, one AI solution at a time.