DataMetica
Founded: 2013
Type: Services
Size: 100-1000
Stage: Profitable

About

As a global leader in Data Warehouse Migration, Data Modernization, and Data Analytics, we empower businesses through automation and help them attain excellence. We believe in empowering companies to master their businesses and achieve their full potential, and we nurture clients with our innovative frameworks. Our embedded values help us strengthen the bond with our clients, ensuring growth for all. Datametica is a preferred partner of leading cloud vendors. We offer solutions for migrating current Enterprise Data Warehouses to the Cloud, determining which option is best suited to your needs. We are giving Data Wings.



Connect with the team

Sumangali Desai
Shivani Mahale
Nitish Saxena
Nikita Aher
Pooja Gaikwad
Sayali Kachi
Syed Raza

Company social profiles

Blog | LinkedIn | Twitter | Facebook

Jobs at DataMetica

Posted by Sayali Kachi

Pune, Hyderabad
2 - 6 yrs
₹3L - ₹15L / yr
Google Cloud Platform (GCP)
SQL
BQ

Datametica is looking for talented BigQuery engineers

 

Total Experience - 2+ yrs.

Notice Period – 0 - 30 days

Work Location – Pune, Hyderabad

 

Job Description:

  • Sound understanding of Google Cloud Platform; should have worked on BigQuery, Workflow, or Composer
  • Experience in GCP migration and integration projects in large-scale environments, including ETL technical design, development, and support
  • Good SQL and Unix scripting skills; programming experience with Python, Java, or Spark would be desirable.
  • Experience in SOA and services-based data solutions would be advantageous

 

About the Company: 

www.datametica.com

Datametica is among the world's leading Cloud and Big Data analytics companies.

Datametica was founded in 2013 and has grown at an accelerated pace within a short span of 8 years. We provide a broad and capable set of services, driven by innovation and value addition, that helps organizations make strategic decisions influencing business growth.

Datametica is the global leader in migrating Legacy Data Warehouses to the Cloud. Datametica moves Data Warehouses to the Cloud faster, at a lower cost, and with fewer errors, even running in parallel with full data validation for months.

Datametica's specialized team of Data Scientists has implemented award-winning analytical models for use cases involving both unstructured and structured data.

Datametica has earned the highest level of partnership with Google, AWS, and Microsoft, which enables Datametica to deliver successful projects for clients across industry verticals at a global level, with teams deployed in the USA, EU, and APAC.

 

Recognition:

We are gratified to be recognized as a Top 10 Big Data Global Company by CIO Story.

 

If this excites you, please apply.

Posted by Sayali Kachi

Pune, Hyderabad
2 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
RESTful APIs
Maven

Experience: 2-6 Years

 

Datametica is looking for talented Java engineers who would get training & the opportunity to work on Cloud and Big Data Analytics.


Job Responsibilities:

  • 2-6 years of hands-on coding experience, usually in a pair-programming environment, providing solutions to real problems in the Big Data world
  • Working in highly collaborative teams and building quality code
  • Working in many different domains and client environments while understanding the business domain deeply
  • Engineer highly scalable, highly available, reliable, secure, and fault-tolerant systems with minimal guidance
  • Create platforms, reusable libraries, and utilities wherever applicable
  • Continuously refactor applications to ensure high-quality design
  • Choose the right technology stack for the product systems/subsystems
  • Write high-quality code that is modular, functional, and testable; establish the best coding practices
  • Formally mentor junior engineers on design, coding, and troubleshooting
  • Plan projects using agile methodologies and ensure timely delivery
  • Communicate, collaborate and work effectively in a global environment

 

Required Skills:

  • Proficient in Core Java technology stack
  • Design and implement low latency RESTful services; Define API contracts between services; Expertise in API design and development, experience in dealing with a large dataset
  • Should have worked on Spring Boot
  • Practice coding standards (clean code, design patterns, etc.)
  • Good understanding of branching, build, deployment, continuous integration methodologies
  • Ability to make decisions independently
  • Strong experience with Java collections.

 

Posted by Shivani Mahale

Pune
15 - 20 yrs
₹25L - ₹50L / yr
Engineering Management
Engineering Manager
Engineering Director
Engineering Head
VP of Engineering
+9 more

As a Director of Engineering, your role and responsibilities will include the following:

  • Define the product roadmap and delivery planning.
  • Provide technical leadership in the design, delivery, and support of product software and platforms.
  • Participate in driving technical architectural decisions for the product.
  • Prioritize features and deliverable artifacts.
  • Augment the product development staff.
  • Mentor managers to implement best practices to motivate and organize their teams.
  • Prepare schedules, report status, and make hiring decisions.
  • Evaluate and improve software development best practices.
  • Establish DevOps and other processes to assure consistency, quality, and timeliness.
  • Participate in interviewing as well as final hiring decisions.
  • Guide and provide input to all strategic and technical planning for all products.
  • Monitor and provide input to evaluate and prioritize change requests.
  • Create and monitor the set of policies that establish standard development languages, tools, and methodology; documentation practices; and examination procedures for developed systems to ensure alignment with the overall architecture.
  • Participate in project scope, schedule, and cost reviews.
  • Understand and socialize product capabilities and limitations.
  • Identify and implement ways to improve and promote quality, and demonstrate accuracy and thoroughness.
  • Establish working relationships with external technology vendors.
  • Integrate customer requirements through the engineering effort when championing next-generation products.
  • Quickly gain an understanding of the company's technology and markets, and establish yourself as a credible leader.
  • Manage release scheduling.
  • Keep abreast of new technologies, with demonstrated knowledge and experience in various technologies.
  • Manage third-party consulting partners/vendors implementing products.
  • Prepare and submit weekly project status reports; prepare monthly reports outlining team assignments and/or changes and project status changes, and forecast project timelines.
  • Provide leadership to individuals or teams through coaching, feedback, development goals, and performance management.
  • Prioritize employee career development to grow the internal pipeline of leadership talent.
  • Prioritize, assign, and manage department activities and projects in accordance with the department's goals and objectives. Adjust hours of work, priorities, and staff assignments to ensure efficient operation, based on workload.

 

Qualification & Experience  

  • Master’s or bachelor’s degree in Computer Science, Business Information Systems, or a related field, or equivalent work experience required.
  • Relevant certifications are also preferred, among other indications of someone who values continuing education.
  • 15+ years’ experience "living" with various operating systems, development tools, and development methodologies, including Java, data structures, Scala, Python, and NodeJS.
  • 8+ years of individual-contributor software development experience.
  • 6+ years of management experience in a fast-growing product software environment, with proven ability to lead and engage development, QA, and implementation teams working on multiple projects.
  • Idea generation and creativity in this position are a must, as is the ability to work with deadlines and to manage and complete projects on time and within budget.
  • Proven ability to establish and drive processes and procedures with quantifiable metrics to measure the success and effectiveness of the development organization.
  • Proven history of delivering on deadlines/releases without compromising quality.
  • Mastery of engineering concepts and core technologies: development models, programming languages, databases, testing, and documentation.
  • Development experience with compilers, web services, database engines, and related technologies.
  • Experience with Agile software development and Scrum methodologies.
  • Proven track record of delivering high-quality software products.
  • A solid engineering foundation, indicated by a demonstrated understanding of product design, life cycle, software development practices, and support services, as well as standard engineering processes and software development methodologies.
  • Experience coordinating the work and competencies of software staff within functional project groups.
  • Ability to work cross-functionally and as a team with other executive committee members.
  • Strong verbal and written communication skills.
  • Communicate effectively with different business units about technology and processes using lay terms and descriptions.

Preferred experience:

  • Experience building horizontally scalable solutions leveraging containers, microservices, and Big Data technologies, among other related technologies.
  • Experience working with graphical user experience and user interface design.
  • Experience working with object-oriented software development, web services, web development, or other similar technical products.
  • Experience with database engines, languages, and compilers.
  • Experience with user acceptance testing, regression testing, and integration testing.
  • Experience working on open-source software projects for Apache and other open-source software organizations.
  • Demonstrable experience training and leading teams as a strong people leader.
Posted by Shivani Mahale

Pune
2 - 7 yrs
₹3L - ₹15L / yr
Shell Scripting
SQL
PL/SQL
Linux/Unix
Hiring Alert!!

Opportunity for Unix Developer!!

We at Datametica are looking for talented Unix engineers who will be trained and get the opportunity to work on Google Cloud Platform, DWH, and Big Data.

Experience - 2 to 7 years
Job location - Pune

Mandatory Skills:
Strong experience in Unix with Shell Scripting development.

What opportunities do we offer?
  • Selected candidates will be provided training opportunities in one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • You will get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
  • You will play an active role in setting up the modern data platform based on Cloud and Big Data
  • You will be part of teams with rich experience in various aspects of distributed systems and computing.
Posted by Sayali Kachi

Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL, and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience designing and implementing business applications using the Oracle Relational Database Management System
  • Experience developing complex database objects like stored procedures, functions, packages, and triggers using SQL and PL/SQL

Required Candidate Profile :

  • Excellent communication, interpersonal, analytical skills and strong ability to drive teams
  • Analyze data requirements and the data dictionary for moderate to complex projects
  • Lead data model related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

Posted by Shivani Mahale

Pune
4 - 7 yrs
₹5L - ₹15L / yr
ETL
Informatica PowerCenter
Teradata
Data Warehouse (DWH)
IBM InfoSphere DataStage
Requirement -
  • Must have 4 to 7 years of experience in ETL Design and Development using Informatica Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.
Opportunities-
  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps tools, and Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume, and Kafka
  • Would get the chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing.
Posted by Sayali Kachi

Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data warehouse projects
  • Good to have experience working with Python, shell scripting

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get chance to be part of the enterprise-grade implementation of Cloud and Big Data systems
  • Will play an active role in setting up the Modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Posted by Sumangali Desai

Pune
6 - 10 yrs
₹10L - ₹20L / yr
Cloud Computing
IT infrastructure
VPN
Firewall
IT security
+8 more

Experience: 6+ years

Location: Pune

 

Lead/support implementation of core cloud components and document critical design and configuration details to support enterprise cloud initiatives. The Cloud Infrastructure Lead will be primarily responsible for utilizing technical skills to coordinate enhancements and deployment efforts, and to provide insight and recommendations for implementing client solutions. Leads will work closely with the customer, the Cloud Architect, other Cloud teams, and other functions.


Job Requirements:

  • Experience in Cloud Foundation setup from the hierarchy of Organization to individual services
  • Experience in Cloud Virtual Network, VPN Gateways, Tunneling, Cloud Load Balancing, Cloud Interconnect, Cloud DNS
  • Experience working with scalable networking technologies such as Load Balancers/Firewalls and web standards (REST APIs, web security mechanisms)
  • Experience working with Identity and access management (MFA, SSO, AD Connect, App Registrations, Service Principals)
  • Familiarity with standard IT security practices such as encryption, certificates and key management.
  • Experience in Deploying and maintaining Applications on Kubernetes
  • Must have worked on one or more configuration tools such as Terraform, Ansible, or PowerShell DSC
  • Experience with Cloud Storage services, including MS SQL DB, Tables, Files, etc.
  • Experience with Cloud Monitoring and alerting mechanisms
  • Well versed with Governance and Cloud best practices for Security and Cost optimization 
  • Experience in one or more of the following: Shell scripting, PowerShell, Python or Ruby. 
  • Experience with Unix/Linux operating systems internals and administration (e.g., filesystems, system calls) or networking (e.g., TCP/IP, routing, network topologies and hardware, SDN)
  • Should have strong knowledge of Cloud billing and understand costing of different cloud services 
  • Prior professional experience in IT Strategy, IT Business Management, Cloud & Infrastructure, or Systems Engineering

 

Preferred 

  • Compute: Infrastructure, Platform Sizing, Consolidation, Tiered and Virtualized Storage, Automated Provisioning, Rationalization, Infrastructure Cost Reduction, Thin Provisioning
  • Experience with Operating systems and Software
  • Sound background with Networking and Security 
  • Experience with Open Source: Sizing and Performance Analyses, Selection and Implementation, Platform Design and Selection
  • Experience with Infrastructure-Based Processes: Monitoring, Capacity Planning, Facilities Management, Performance Tuning, Asset Management, Disaster Recovery, Data Center support

 

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

Posted by Sumangali Desai

Pune
3 - 8 yrs
₹5L - ₹20L / yr
ETL
Data Warehouse (DWH)
IBM InfoSphere DataStage
DataStage
SQL
+1 more

Datametica is hiring a DataStage Developer

  • Must have 3 to 8 years of experience in ETL Design and Development using IBM Datastage Components.
  • Should have extensive knowledge in Unix shell scripting.
  • Understanding of DW principles (Fact, Dimension tables, Dimensional Modelling and Data warehousing concepts).
  • Research, develop, document, and modify ETL processes as per data architecture and modeling requirements.
  • Ensure appropriate documentation for all new development and modifications of the ETL processes and jobs.
  • Should be good in writing complex SQL queries.

About Us!

A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over these years are the key factors in achieving our success.

 

Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

 

Posted by Nikita Aher

Pune, Hyderabad
7 - 12 yrs
₹12L - ₹33L / yr
Big Data
Hadoop
Spark
Apache Spark
Apache Hive
+3 more

Job description

Role : Lead Architecture (Spark, Scala, Big Data/Hadoop, Java)

Primary Location : India-Pune, Hyderabad

Experience : 7 - 12 Years

Management Level: 7

Joining Time: Immediate Joiners are preferred


  • Attend requirements gathering workshops, estimation discussions, design meetings, and status review meetings
  • Experience in Solution Design and Solution Architecture for the data engineering model to build and implement Big Data projects on-premises and on the cloud
  • Align architecture with business requirements and stabilize the developed solution
  • Ability to build prototypes to demonstrate the technical feasibility of your vision
  • Professional experience facilitating and leading solution design, architecture, and delivery planning activities for data-intensive and high-throughput platforms and applications
  • Able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
  • Able to help programmers and project managers in the design, planning, and governance of implementing projects of any kind
  • Develop, construct, test, and maintain architectures, and run Sprints for development and rollout of functionalities
  • Data analysis and code development experience, ideally in Big Data technologies such as Spark, Hive, Hadoop, Java, Python, and PySpark
  • Execute projects of various types, i.e. design, development, implementation, and migration of functional analytics models/business logic across architecture approaches
  • Work closely with Business Analysts to understand the core business problems and deliver efficient IT solutions for the product
  • Deploy sophisticated analytics programs/code using any cloud application


Perks and Benefits we Provide!


  • Working with Highly Technical and Passionate, mission-driven people
  • Subsidized Meals & Snacks
  • Flexible Schedule
  • Approachable leadership
  • Access to various learning tools and programs
  • Pet Friendly
  • Certification Reimbursement Policy
Check out more about us on our website below!

www.datametica.com


Similar companies


Fractal Analytics

https://fractalanalytics.com
Founded: 2000
Type: Products & Services
Size: 100-1000
Stage: Profitable

About the company

Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, and it brings AI, engineering, and design to help the world's most admired Fortune 500® companies.


Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to assist CEOs and senior executives in making better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation Enterprise Revenue Growth Management, and Senseforth.ai to automate customer interactions at scale to grow top-line and bottom-line; Analytics Vidhya is the largest Analytics and Data Science community, offering industry-focused training programs.


Fractal has more than 3600 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia. Fractal has consistently been rated one of India's best companies to work for by The Great Place to Work® Institute, featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research, named a leader in the Analytics & AI Services Specialists Peak Matrix 2021 by Everest Group, and recognized as an "Honorable Vendor" in the 2022 Magic Quadrant™ for data & analytics by Gartner. For more information, visit fractal.ai

Jobs: 4


JoVE

http://www.jove.com
Founded: 2006
Type: Product
Size: 100-500
Stage: Profitable

About the company

JoVE is the leading producer and publisher of video resources with the mission to increase the productivity of research and education in science, medicine, and engineering. Established in 2006 as the world's first peer-reviewed scientific video journal, JoVE has produced over 19,000 videos demonstrating experiments filmed in laboratories at top research institutions and delivered online to millions of scientists, educators, and students worldwide. Today, JoVE subscribers include more than 1,700 universities, colleges, biotech companies, and pharmaceutical companies. Headquartered in Cambridge, Massachusetts, JoVE maintains offices in the United States, Europe, India, and Australia. Please visit www.jove.com to learn more.

Jobs: 3


Vola Finance

https://www.volafinance.com/
Founded: 2017
Type: Product
Size: 20-100
Stage: Raised funding

About the company

We're a fintech platform that got its start in 2017 by providing our users with instant cash advances. Seven-plus years on, we've completed our Series A funding, and we are on track to do over $15 million in revenue this year. We're looking to grow the team and scale the Company further in every vertical.


In the last seven years, Vola Finance has broadened its offering to more than just your regular cash advance app. We aim to provide a comprehensive solution for managing personal finances, especially for those who might struggle with traditional banking services.

Jobs: 3


Prismberry Technologies Pvt Ltd

https://www.prismberry.com/
Founded: 2018
Type: Services
Size: 100-1000
Stage: Raised funding

About the company

Prismberry is a leading provider of software and automation services to diverse industries. We specialize in Software Development and IT Services, with expertise in bespoke automation and cloud-based software solutions for diverse industries. We are dedicated to delivering innovative solutions that transform businesses.


Prismberry is a proud partner of Google Cloud Platform. Our highly skilled engineers have expertise in cloud automation, application migration to the cloud, application development, data analytics, and enterprise software development, which helps companies stay focused on core problem solving and thus enhances the speed of project execution.


We understand the value data can bring and have strong expertise in designing solutions to capture, store, and evaluate data as required by customers.


Specialties

Software design and development services, Cloud software design and development, Data Analytics, Data Science, Application migration to cloud, Cloud infrastructure software, Artificial Intelligence, System design, Deep Learning infrastructure software, Enterprise server and storage software, Metrics and logging, Prometheus, DevOps, and CloudOps

Jobs: 2


Universal Transit

http://www.universaltransit.com
Founded: 2020
Type: Services
Size: 20-100
Stage: Raised funding

About the company

Reliability: It is our duty to keep up with the expectations and trust of the clients. We leave no detail unattended in ensuring that you have the best and the most convenient experience here at Universal Transit.

Professionalism: All of our services exhibit professionalism. We never adopt procedures and mechanisms that fall short of modern industry standards.

Teamwork: Dealing with the massive details and management tasks of car transportation is highly demanding work. This is where we achieve maximum efficiency through our collaborative work and teamwork spirit.

Jobs: 1


Nvelop

http://www.nvelop.ai
Founded: 2024
Type: Product
Size: 0-20
Stage: Raised funding

About the company

Nvelop is a pioneer in AI-powered, automated IT sourcing, delivering more efficient procurement, faster time-to-market, and enhanced compliance. Nvelop develops an AI-native SaaS platform for sourcing, supporting end-to-end IT sourcing processes including solution exploration, requirements gathering, RFP generation, proposal evaluation, and contracting. Nvelop was founded in 2024. The company is based in Helsinki, Finland. Learn more at www.nvelop.ai.

Jobs: 1


Sim Gems Group

http://www.simgems.com
Founded: 1993
Type: Product
Size: 20-100
Stage: Profitable

About the company

Sim Gems Group is a leading diamond manufacturer, miner, wholesaler, and distributor of natural diamonds. Established in 1993, the company is dedicated to providing customers with cut and polished stones of the highest quality and unmatched brilliance. They focus on ethical sourcing, following the Kimberley Process Certification Scheme, and prioritize sustainable practices in their operations.

Jobs: 0


Indee

https://indee.tv/
Founded: 2014
Type: Product
Size: 20-100
Stage: Raised funding

About the company

Indee, founded in 2014 and headquartered in West Hollywood, California, is a cutting-edge entertainment technology company that solves content security, management, and distribution challenges for the world's largest entertainment companies. With clients including Netflix, Disney, Paramount Pictures, Lionsgate, Showtime, and Comedy Central, Indee provides a one-stop solution for production houses, OTT platforms, television networks, and guilds. The company's proprietary security layer and unmatched analytics have led to impressive growth, with a 10X increase in revenue and a 400X growth in the number of screenings over the last three years. Indee specializes in SaaS, content distribution, media security, 4K streaming, global distribution screenings, and forensic watermarking, making it a crucial player in the entertainment industry's digital ecosystem.

Jobs: 2


DataCaliper

https://www.datacaliper.com
Founded: 2008
Type: Services
Size: 20-100
Stage: Raised funding

About the company

Jobs: 0

Founded: 1950
Type: Products & Services
Size: 500+
Stage: Profitable

About the company

Jobs: 3

Want to work at DataMetica?
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs