DataMetica
Founded: 2013
Type: Services
Size: 100-1000
Stage: Profitable

About

As a global leader in Data Warehouse Migration, Data Modernization, and Data Analytics, we empower businesses through automation and help them attain excellence. We believe in empowering companies to master their businesses and achieve their full potential, and we nurture clients with our innovative frameworks. Our embedded values help us strengthen the bond with our clients, ensuring growth for all. Datametica is a preferred partner of leading cloud vendors. We offer solutions for migrating current Enterprise Data Warehouses to the Cloud, determining which option is best suited to your needs. We are giving Data Wings.

Company video


Photos


Connect with the team

Sumangali Desai
Shivani Mahale
Nitish Saxena
Nikita Aher
Pooja Gaikwad
Sayali Kachi
Syed Raza

Company social profiles

Blog | LinkedIn | Twitter | Facebook

Jobs at DataMetica

Posted by Sayali Kachi

Pune, Hyderabad
2 - 6 yrs
₹3L - ₹15L / yr
Google Cloud Platform (GCP)
SQL
BQ

Datametica is looking for talented BigQuery engineers.

 

Total Experience - 2+ yrs.

Notice Period – 0 - 30 days

Work Location – Pune, Hyderabad

 

Job Description:

  • Sound understanding of Google Cloud Platform; should have worked on BigQuery, Workflows, or Composer
  • Experience in migration to GCP and integration projects in large-scale environments; ETL technical design, development, and support
  • Good SQL skills and Unix scripting; programming experience with Python, Java, or Spark would be desirable (a small illustrative sketch follows this list)
  • Experience in SOA and services-based data solutions would be advantageous
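Purely for illustration (not part of this posting's requirements), the sketch below shows the kind of BigQuery work described above: running a standard SQL query from Python via the google-cloud-bigquery client. The project ID, dataset, and table (my-gcp-project, sales.orders) are hypothetical placeholders.

```python
# Illustrative only: run a BigQuery query from Python.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery  # pip install google-cloud-bigquery


def top_customers(project_id: str = "my-gcp-project") -> None:
    client = bigquery.Client(project=project_id)

    sql = """
        SELECT customer_id, SUM(amount) AS total_spend
        FROM `my-gcp-project.sales.orders`
        GROUP BY customer_id
        ORDER BY total_spend DESC
        LIMIT 10
    """
    query_job = client.query(sql)      # submits the query job
    for row in query_job.result():     # waits for the job to finish
        print(row["customer_id"], row["total_spend"])


if __name__ == "__main__":
    top_customers()
```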

 

About the Company: 

www.datametica.com

Datametica is among the world's leading Cloud and Big Data analytics companies.

Datametica was founded in 2013 and has grown at an accelerated pace in the eight years since. We provide a broad and capable set of services that encompass a vision of success, driven by innovation and value addition, helping organizations make strategic decisions that influence business growth.

Datametica is the global leader in migrating Legacy Data Warehouses to the Cloud. Datametica moves Data Warehouses to the Cloud faster, at a lower cost, and with fewer errors, even running in parallel with full data validation for months.

Datametica's specialized team of Data Scientists has implemented award-winning analytical models for use cases involving both unstructured and structured data.

Datametica has earned the highest level of partnership with Google, AWS, and Microsoft, which enables Datametica to deliver successful projects for clients across industry verticals at a global level, with teams deployed in the USA, EU, and APAC.

 

Recognition:

We are gratified to be recognized as a Top 10 Big Data Global Company by CIO Story.

 

If this excites you, please apply.

Posted by Sayali Kachi

Pune, Hyderabad
2 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
RESTful APIs
Maven

Experience: 2-6 Years

 

Datametica is looking for talented Java engineers who will receive training and the opportunity to work on Cloud and Big Data Analytics.


Job Responsibilities:

  • 2-6 years of hands-on coding experience, often in a pair-programming environment, providing solutions to real problems in the Big Data world
  • Working in highly collaborative teams and building quality code
  • Working in lots of different domains and client environments while developing a deep understanding of the business domain
  • Engineer highly scalable, highly available, reliable, secure, and fault-tolerant systems with minimal guidance
  • Create platforms, reusable libraries, and utilities wherever applicable
  • Continuously refactor applications to ensure high-quality design
  • Choose the right technology stack for the product systems/subsystems
  • Write high-quality code that is modular, functional, and testable; establish the best coding practices
  • Formally mentor junior engineers on design, coding, and troubleshooting
  • Plan projects using agile methodologies and ensure timely delivery
  • Communicate, collaborate and work effectively in a global environment

 

Required Skills:

  • Proficient in Core Java technology stack
  • Design and implement low-latency RESTful services; define API contracts between services; expertise in API design and development; experience dealing with large datasets
  • Should have worked on Spring Boot
  • Practice coding standards (clean code, design patterns, etc.)
  • Good understanding of branching, build, deployment, continuous integration methodologies
  • Ability to make decisions independently
  • Strong experience with Java collections.

 

Posted by Shivani Mahale

Pune
15 - 20 yrs
₹25L - ₹50L / yr
Engineering Management
Engineering Manager
Engineering Director
Engineering Head
VP of Engineering
+9 more

As a Director of Engineering, your role and responsibilities will include the following.

  • Define the product roadmap and delivery planning.
  • Provide technical leadership in the design, delivery, and support of product software and platforms.
  • Participate in driving technical architectural decisions for the product.
  • Prioritize features and deliverable artifacts.
  • Augment the product development staff.
  • Mentor managers to implement best practices to motivate and organize their teams.
  • Prepare schedules, report status, and make hiring decisions.
  • Evaluate and improve software development best practices.
  • Provide DevOps and other processes to assure consistency, quality and timeliness.
  • Participate in interviewing as well as final hiring decisions.
  • Guide and provide input to all strategic and technical planning for the products.
  • Monitor and provide input for evaluating and prioritizing change requests.
  • Create and monitor the set of policies that establish standard development languages, tools, and methodology; documentation practices; and examination procedures for developed systems to ensure alignment with the overall architecture.
  • Participate in project scope, schedule, and cost reviews.
  • Understand and socialize product capabilities and limitations.
  • Identify and implement ways to improve and promote quality, and demonstrate accuracy and thoroughness.
  • Establish working relationships with external technology vendors.
  • Integrate customer requirements through the engineering effort when championing next-generation products.
  • Quickly gain an understanding of the company's technology and markets, and establish yourself as a credible leader.
  • Handle release scheduling.
  • Keep abreast of new technologies, with demonstrated knowledge of and experience in various technologies.
  • Manage third-party consulting partners/vendors implementing products.
  • Prepare and submit weekly project status reports; prepare monthly reports outlining team assignments and/or changes, project status changes, and forecast project timelines.
  • Provide leadership to individuals and teams through coaching, feedback, development goals, and performance management.
  • Prioritize employee career development to grow the internal pipeline of leadership talent.
  • Prioritize, assign, and manage department activities and projects in accordance with the department's goals and objectives. Adjust hours of work, priorities, and staff assignments to ensure efficient operation, based on workload.

 

Qualification & Experience  

  • Master's or bachelor's degree in Computer Science, Business Information Systems or a related field, or equivalent work experience, required.
  • Relevant certifications are also preferred, among other indications of someone who values continuing education.
  • 15+ years' experience "living" with various operating systems, development tools and development methodologies including Java, data structures, Scala, Python and NodeJS.
  • 8+ years of individual contributor software development experience.
  • 6+ years of management experience in a fast-growing product software environment, with a proven ability to lead and engage development, QA and implementation teams working on multiple projects.
  • Idea generation and creativity in this position are a must, as is the ability to work with deadlines and to manage and complete projects on time and within budget.
  • Proven ability to establish and drive processes and procedures with quantifiable metrics to measure the success and effectiveness of the development organization.
  • Proven history of delivering on deadlines/releases without compromising quality.
  • Mastery of engineering concepts and core technologies: development models, programming languages, databases, testing, and documentation.
  • Development experience with compilers, web services, database engines and related technologies.
  • Experience with Agile software development and SCRUM methodologies.
  • Proven track record of delivering high-quality software products.
  • A solid engineering foundation indicated by a demonstrated understanding of product design, life cycle, software development practices, and support services; understanding of standard engineering processes and software development methodologies.
  • Experience coordinating the work and competences of software staff within functional project groups.
  • Ability to work cross-functionally and as a team with other executive committee members.
  • Strong verbal and written communication skills.
  • Ability to communicate effectively with different business units about technology and processes using lay terms and descriptions.

Experience Preferred:

  • Experience building horizontally scalable solutions leveraging containers, microservices and Big Data technologies, among other related technologies.
  • Experience working with graphical user experience and user interface design.
  • Experience working with object-oriented software development, web services, web development or other similar technical products.
  • Experience with database engines, languages, and compilers.
  • Experience with user acceptance testing, regression testing and integration testing.
  • Experience working on open-source software projects for Apache and other open-source software organizations.
  • Demonstrable experience training and leading teams as a great people leader.
Posted by Shivani Mahale

Pune
2 - 7 yrs
₹3L - ₹15L / yr
Shell Scripting
SQL
PL/SQL
Linux/Unix
Hiring Alert!!

Opportunity for Unix Developer!!

We at Datametica are looking for talented Unix engineers who will be trained and given the opportunity to work on Google Cloud Platform, DWH and Big Data.

Experience - 2 to 7 years
Job location - Pune

Mandatory Skills:
Strong experience in Unix with Shell Scripting development.

What opportunities do we offer?
- Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps Tools and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- You will get the chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
- You will play an active role in setting up the Modern Data Platform based on Cloud and Big Data
- You will be part of teams with rich experience in various aspects of distributed systems and computing.
Posted by Sayali Kachi

Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience in designing and implementing business applications using the Oracle Relational Database Management System
  • Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL

Required Candidate Profile :

  • Excellent communication, interpersonal and analytical skills, and a strong ability to drive teams
  • Analyzes data requirements and the data dictionary for moderate to complex projects
  • Leads data-model-related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translates business requirements into technical specifications with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle – Data Warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

Posted by Shivani Mahale

Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirement -
  • Must have 4 to 7 years of experience in ETL Design and Development using Informatica Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, develop, document and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.
Opportunities-
  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Will get the chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern Data Platform based on Cloud and Big Data
  • Will be part of teams with rich experience in various aspects of distributed systems and computing.
Posted by Sayali Kachi

Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data Warehouse projects
  • Good to have: experience working with Python and shell scripting (a small illustrative sketch follows this list)
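As a rough, hypothetical illustration of the ETL and Python items above (not a Datametica artifact), the sketch below extracts rows from a delimited file, applies a simple transformation, and loads them into a SQL table; the file name, table name, and SQLite target stand in for a real warehouse.

```python
# Hypothetical ETL sketch: CSV -> transform -> SQL table.
# File name, table name, and the SQLite target are illustrative stand-ins.
import csv
import sqlite3


def load_orders(csv_path: str = "orders.csv", db_path: str = "warehouse.db") -> None:
    # Extract: read raw rows from a delimited file.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: drop rows without an amount and normalise amounts to float.
    cleaned = [
        (r["order_id"], r["customer_id"], float(r["amount"]))
        for r in rows
        if r.get("amount")
    ]

    # Load: write the cleaned rows into a target table.
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer_id TEXT, amount REAL)"
    )
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
    con.commit()
    con.close()


if __name__ == "__main__":
    load_orders()
```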

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Will get the chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Posted by Sumangali Desai

Pune
6 - 10 yrs
₹10L - ₹20L / yr
Cloud Computing
IT infrastructure
VPN
Firewall
IT security
+8 more

Experience: 6+ years

Location: Pune

 

Lead/support implementation of core cloud components and document critical design and configuration details to support enterprise cloud initiatives. The Cloud Infrastructure Lead will be primarily responsible for using technical skills to coordinate enhancements and deployment efforts and to provide insight and recommendations for implementing client solutions. Leads will work closely with the customer, the Cloud Architect, other Cloud teams and other functions.


Job Requirements:

  • Experience in Cloud Foundation setup from the hierarchy of Organization to individual services
  • Experience in Cloud Virtual Network, VPN Gateways, Tunneling, Cloud Load Balancing, Cloud Interconnect, Cloud DNS
  • Experience working with scalable networking technologies such as Load Balancers/Firewalls and web standards (REST APIs, web security mechanisms)
  • Experience working with Identity and access management (MFA, SSO, AD Connect, App Registrations, Service Principals)
  • Familiarity with standard IT security practices such as encryption, certificates and key management.
  • Experience in deploying and maintaining applications on Kubernetes
  • Must have worked on one or more configuration tools such as Terraform, Ansible, or PowerShell DSC
  • Experience with Cloud Storage services including MS SQL DB, Tables, Files, etc.
  • Experience with Cloud Monitoring and alerting mechanisms
  • Well versed with governance and Cloud best practices for security and cost optimization
  • Experience in one or more of the following: shell scripting, PowerShell, Python or Ruby (a small illustrative sketch follows this list)
  • Experience with Unix/Linux operating systems internals and administration (e.g., filesystems, system calls) or networking (e.g., TCP/IP, routing, network topologies and hardware, SDN)
  • Should have strong knowledge of Cloud billing and understand costing of different cloud services 
  • Prior professional experience in IT Strategy, IT Business Management, Cloud & Infrastructure, or Systems Engineering
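As one small, hypothetical example of the scripting and governance items above (Python being one of the listed options), the sketch below uses the google-cloud-storage client to flag Cloud Storage buckets missing an "owner" label; the project ID and the label convention are assumptions for illustration only.

```python
# Hypothetical governance check: flag Cloud Storage buckets without an "owner" label.
# The project ID and label convention are assumptions for illustration.
from google.cloud import storage  # pip install google-cloud-storage


def buckets_missing_owner(project_id: str = "my-gcp-project") -> list:
    client = storage.Client(project=project_id)
    missing = []
    for bucket in client.list_buckets():
        labels = bucket.labels or {}
        if "owner" not in labels:
            missing.append(bucket.name)
    return missing


if __name__ == "__main__":
    for name in buckets_missing_owner():
        print(f"bucket without an owner label: {name}")
```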

 

Preferred 

  • Compute: Infrastructure, Platform Sizing, Consolidation, Tiered and Virtualized Storage, Automated Provisioning, Rationalization, Infrastructure Cost Reduction, Thin Provisioning
  • Experience with Operating systems and Software
  • Sound background with Networking and Security 
  • Experience with Open Source: Sizing and Performance Analyses, Selection and Implementation, Platform Design and Selection
  • Experience with Infrastructure-Based Processes: Monitoring, Capacity Planning, Facilities Management, Performance Tuning, Asset Management, Disaster Recovery, Data Center support

 

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Posted by Sumangali Desai

Pune
3 - 8 yrs
₹5L - ₹20L / yr
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
+1 more

Datametica is hiring for a DataStage Developer

  • Must have 3 to 8 years of experience in ETL Design and Development using IBM DataStage components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, develop, document and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

Posted by Nikita Aher

Pune, Hyderabad
7 - 12 yrs
₹12L - ₹33L / yr
Big Data
Hadoop
Spark
Apache Spark
Apache Hive
+3 more

Job description

Role: Lead Architect (Spark, Scala, Big Data/Hadoop, Java)

Primary Location : India-Pune, Hyderabad

Experience : 7 - 12 Years

Management Level: 7

Joining Time: Immediate Joiners are preferred


  • Attend requirements-gathering workshops, estimation discussions, design meetings and status review meetings
  • Experience in solution design and solution architecture for the data engineering model, to build and implement Big Data projects on-premises and in the cloud
  • Align architecture with business requirements and stabilize the developed solution
  • Ability to build prototypes to demonstrate the technical feasibility of your vision
  • Professional experience facilitating and leading solution design, architecture and delivery planning activities for data-intensive and high-throughput platforms and applications
  • Able to benchmark systems, analyse system bottlenecks and propose solutions to eliminate them
  • Able to help programmers and project managers in the design, planning and governance of implementation projects of any kind
  • Develop, construct, test and maintain architectures, and run Sprints for development and rollout of functionalities
  • Data analysis and code development experience, ideally in Big Data Spark, Hive, Hadoop, Java, Python and PySpark (a small illustrative sketch follows this list)
  • Execute projects of various types, i.e. design, development, implementation and migration of functional analytics models/business logic across architecture approaches
  • Work closely with Business Analysts to understand the core business problems and deliver efficient IT solutions for the product
  • Deploy sophisticated analytics programs using any cloud application
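Purely as a small, hypothetical illustration of the PySpark development mentioned above (the input path, output path, and column names are invented for the example), a daily aggregation job might look like:

```python
# Illustrative PySpark aggregation job; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run(input_path: str = "hdfs:///data/events/",
        output_path: str = "hdfs:///data/daily_event_counts/") -> None:
    spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

    # Read raw events and count them per day and event type.
    events = spark.read.parquet(input_path)
    daily = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("events"))
    )

    # Write the result partitioned by date.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(output_path)
    spark.stop()


if __name__ == "__main__":
    run()
```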


Perks and Benefits we Provide!


  • Working with Highly Technical and Passionate, mission-driven people
  • Subsidized Meals & Snacks
  • Flexible Schedule
  • Approachable leadership
  • Access to various learning tools and programs
  • Pet Friendly
  • Certification Reimbursement Policy

Check out more about us on our website below!

www.datametica.com

Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.

Similar companies


Famepilot Internet Private Limited

http://www.famepilot.com
Founded
2018
Type
Product
Size
0-20
Stage
Bootstrapped

About the company

Famepilot is an AI/ML-powered cloud platform for businesses and brands to monitor and manage their customer feedback across all online (search, listings, social, review sites, and online surveys) and offline (on-premise surveys, kiosks, tablets and paper forms) channels. Businesses of any size, from 1 location to 10,000+ locations, can use the platform to uplift their customer experience, engagement, and reputation. We gather real-time ratings & reviews and convert them into actionable insights for businesses using an AI-based recommendation engine. We believe listening to customers is the most fundamental activity for success in any business. Famepilot closes the sales-cycle loop by asking for customer feedback post-transaction and helping businesses respond on time. Famepilot converts reviews into revenue. Famepilot is a strategic and thoughtful start-up in the Online Reputation space with decades of combined industry experience.

Jobs

3


Certa

https://www.getcerta.com
Founded
2018
Type
Products & Services
Size
100-1000
Stage
Raised funding

About the company

Certa’s no-code platform makes it easy to digitize and manage the lifecycle of all your suppliers, partners, and customers. With automated onboarding, contract lifecycle management, and ESG management, Certa eliminates the procurement bottleneck and allows companies to onboard third-parties 3x faster.

Jobs

3


Datazoic Machines Pvt Ltd

https://www.datazoic.com
Founded
2017
Type
Product
Size
100-500
Stage
Bootstrapped

About the company

Home | Datazoic

Jobs

1


Rapidsoft Technologies

http://www.rapidsofttechnologies.com
Founded
2006
Type
Products & Services
Size
50-200
Stage
Profitable

About the company

Rapidsoft Technologies is a global software solution firm that offers total software solutions, services, and products for mobile and web platforms. They discover, design, and deliver simple to complex mobile and web solutions for diverse business verticals from across the world. Rapidsoft Technologies empowers enterprises to derive productivity, perform to their limits, and produce great revenue.

Jobs

8


Data Axle

https://data-axle.com/
Founded
1972
Type
Services
Size
1000-5000
Stage
Profitable

About the company

Data Axle is a product company that offers various data and technology solutions, including software-as-a-service (SaaS) and data-as-a-service (DaaS). These solutions help businesses manage and leverage data for marketing, sales, and business intelligence.


They are a data-driven marketing solutions provider that helps clients with clean data, lead generation, strategy development, campaign design, and day-to-day execution needs. Data Axle solves the problem of inaccurate and incomplete data, enabling businesses to make informed decisions and drive growth. Data Axle operates in various industries, including healthcare, finance, retail, and technology.


About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.


Data Axle India is recognized as a Great Place to Work!

This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

Jobs

7


Inncircles

https://www.inncircles.com/
Founded
2019
Type
Product
Size
100-500
Stage
Profitable

About the company

Inncircles Arena – a modern construction management platform – is a one-stop solution for construction management that is highly configurable, easily adaptable and data-centric, while leveraging advanced artificial intelligence and cloud technologies to ramp up a construction company's digital transformation initiatives. It empowers construction teams to go beyond the tactical to the strategic. Through the platform, residential, industrial and infrastructure construction companies gain real-time insights into their projects, a unified communication and collaboration space, and powerful configuration capabilities to integrate with existing or legacy tools and technologies. Detailed products and solution packages allow for planning, scheduling, assigning, tracking, and analyzing construction work packages from the pre-planning stage to handover.

Solving construction business challenges through technology:
  • Work Management: Elevate efficiency and transform project management by scheduling works and tracking progress while gaining real-time insights.
  • Quality Management: Ensure projects meet the required quality standards by performing regular inspections and issue tracking.
  • Safety Management: Record and monitor safety incidents in real time, schedule inspections, and store pertinent safety documentation.
  • Advanced Work Package (AWP): Build packages of works, schedule work packages and monitor progress.
  • Data and Analytics: Gain profound insights into projects through visually stunning dashboards that bring your data to life.
  • Mobile Application: Offer mobile accessibility to enable field teams to access information, submit updates, and communicate in real time, improving overall efficiency.
  • Integrations: Foster connectivity in the construction ecosystem with Arena's potent integration capabilities.
  • Pre-Construction Planning: Drive project planning, budgeting, estimating, bidding, and collaboration during the pre-construction phase.
  • Procurement: Create procurement packages

Jobs

9


Risosu Consulting LLP

https://www.risosu.com
Founded
2023
Type
Services
Size
0-20
Stage
Bootstrapped

About the company

Jobs

3


OpenIAM

http://www.openiam.com
Founded
2008
Type
Product
Size
20-100
Stage
Bootstrapped

About the company

OpenIAM is a pioneering Identity and Access Management (IAM) solutions provider that has been transforming enterprise security since 2008. Based in New York, this self-funded and profitable company has established itself as an innovator in the IAM space, being the first to introduce a converged architecture stack and fully containerized suite for cloud environments. With a global presence and partnerships with major systems integrators like Thales and Indra, OpenIAM serves mid to large enterprises across various sectors including financial services, healthcare, education, and manufacturing.

Jobs

0


Pattern Agentix

https://www.patternagentix.com
Founded
2024
Type
Product
Size
0-20
Stage
Raised funding

About the company

Jobs

3

Founded
2025
Type
Product
Size
0-20
Stage
Bootstrapped

About the company

Jobs

1

Want to work at DataMetica?
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs