Data Axle
Founded: 1972
Type: Services
Size: 1000-5000
Stage: Profitable

About

Data Axle is a data-driven marketing solutions provider that helps clients with clean data, lead generation, strategy development, campaign design, and day-to-day execution needs. It solves the problem of inaccurate and incomplete data, enabling businesses to make informed decisions and drive growth. Data Axle operates in various industries, including healthcare, finance, retail, and technology.


About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.


Data Axle India is recognized as a Great Place to Work!

This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.


Tech stack

Python
Django
Flask
React.js
AngularJS (1.x)
GraphQL
Amazon Web Services (AWS)

Candid answers by the company

What does the company do?

Data Axle helps clients create a foundation of clean data, find new leads, develop winning strategies, design beautiful campaigns, and handle all day-to-day execution needs.

Company social profiles

LinkedIn

Jobs at Data Axle

Data Axle
Posted by Eman Khan
Pune
3 - 5 yrs
Best in industry
Python
Django
Flask
React.js
AngularJS (1.x)

About Data Axle

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global center of excellence in Pune. This center delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.

 

Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive. 

 

General Summary

We are looking for a Full Stack Developer who will take on the roles and responsibilities outlined below.

 

Roles & Responsibilities

  • Implement application components and systems according to department standards and guidelines.
  • Work with product and designers to translate requirements into accurate representations for the web.
  • Analyze, design, code, debug, and test business applications.
  • Conduct code reviews in accordance with team processes/standards.
  • Perform other miscellaneous duties as assigned by Management.


Qualifications

  • 3+ years of Software Engineering experience required.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Experience developing web applications using Django; strong knowledge of ReactJS and related libraries such as Redux; proficient in HTML, CSS, and JavaScript.
  • Experience in working with SQL databases such as MySQL.
  • Strong problem-solving skills and attention to detail.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Data Axle
Posted by Eman Khan
Pune
6 - 9 yrs
Best in industry
Machine Learning (ML)
Python
SQL
PySpark
XGBoost

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.


Data Axle Pune is pleased to have achieved certification as a Great Place to Work!


Roles & Responsibilities:

We are looking for a Senior Data Scientist to join the Data Science Client Services team and continue our success in identifying high-quality target audiences that generate profitable marketing returns for our clients. We are looking for experienced data science, machine learning, and MLOps practitioners to design, build, and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Senior Data Scientist who will be responsible for:

  1. Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  2. Design or enhance ML workflows for data ingestion, model design, model inference and scoring
  3. Oversight on team project execution and delivery
  4. Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
  5. Visualize and publish model performance results and insights to internal and external audiences


Qualifications:

  1. Master’s degree in a relevant quantitative or applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  2. Minimum of 5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  3. Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
  4. Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
  5. Proficiency in Python and SQL required; PySpark/Spark experience a plus
  6. Ability to conduct productive peer reviews and maintain proper code structure in GitHub
  7. Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
  8. Working knowledge of modern CI/CD methods


This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Data Axle
Posted by Eman Khan
Pune
7 - 10 yrs
Best in industry
Google Cloud Platform (GCP)
ETL
Python
Java
Scala
+4 more

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle has set up a strategic global center of excellence in Pune. This center delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases. Data Axle is headquartered in Dallas, TX, USA.


Roles and Responsibilities:

  • Design, implement, and manage scalable analytical data infrastructure, enabling efficient access to large datasets and high-performance computing on Google Cloud Platform (GCP).
  • Develop and optimize data pipelines using GCP-native services like BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Data Fusion, and Cloud Storage.
  • Work with diverse data sources to extract, transform, and load data into enterprise-grade data lakes and warehouses, ensuring high availability and reliability.
  • Implement and maintain real-time data streaming solutions using Pub/Sub, Dataflow, and Kafka.
  • Research and integrate the latest big data and visualization technologies to enhance analytics capabilities and improve efficiency.
  • Collaborate with cross-functional teams to implement machine learning models and AI-driven analytics solutions using Vertex AI and BigQuery ML.
  • Continuously improve existing data architectures to support scalability, performance optimization, and cost efficiency.
  • Enhance data security and governance by implementing industry best practices for access control, encryption, and compliance.
  • Automate and optimize data workflows to simplify reporting, dashboarding, and self-service analytics using Looker and Data Studio.


Basic Qualifications

  • 7+ years of experience in data engineering, software development, business intelligence, or data science, with expertise in large-scale data processing and analytics.
  • Strong proficiency in SQL and experience with BigQuery for data warehousing.
  • Hands-on experience in designing and developing ETL/ELT pipelines using GCP services (Cloud Composer, Dataflow, Dataproc, Data Fusion, or Apache Airflow).
  • Expertise in distributed computing and big data processing frameworks, such as Apache Spark, Hadoop, or Flink, particularly within Dataproc and Dataflow environments.
  • Experience with business intelligence and data visualization tools, such as Looker, Tableau, or Power BI.
  • Knowledge of data governance, security best practices, and compliance requirements in cloud environments.


Preferred Qualifications:

  • Degree/Diploma in Computer Science, Engineering, Mathematics, or a related technical field.
  • Experience working with GCP big data technologies, including BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud SQL.
  • Hands-on experience with real-time data processing frameworks, including Kafka and Apache Beam.
  • Proficiency in Python, Java, or Scala for data engineering and pipeline development.
  • Familiarity with DevOps best practices, CI/CD pipelines, Terraform, and infrastructure-as-code for managing GCP resources.
  • Experience integrating AI/ML models into data workflows, leveraging BigQuery ML, Vertex AI, or TensorFlow.
  • Understanding of Agile methodologies, software development life cycle (SDLC), and cloud cost optimization strategies.

Data Axle
Posted by Eman Khan
Pune
9 - 12 yrs
Best in industry
Python
PySpark
Machine Learning (ML)
SQL
Data Science
+1 more

Roles & Responsibilities:  

We are looking for a Lead Data Scientist to join the Data Science Client Services team and continue our success in identifying high-quality target audiences that generate profitable marketing returns for our clients. We are looking for experienced data science, machine learning, and MLOps practitioners to design, build, and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Lead Data Scientist who will be responsible for:

  • Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture  
  • Design or enhance ML workflows for data ingestion, model design, model inference, and scoring
  • Oversight of team project execution and delivery
  • Establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies  
  • Visualize and publish model performance results and insights to internal and external audiences  


Qualifications:  

  • Master’s degree in a relevant quantitative or applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  • Minimum of 9 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  • Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)  
  • Proficiency in Python and SQL required; PySpark/Spark experience a plus  
  • Ability to conduct productive peer reviews and maintain proper code structure in GitHub
  • Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)  
  • Working knowledge of modern CI/CD methods  


This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level. 

Data Axle
Posted by Eman Khan
Remote, Pune
8 - 13 yrs
Best in industry
Data architecture
Systems design
Spark
Apache Kafka
Flink
+5 more

About Data Axle:

 

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.

Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

 

General Summary:

 

As a Digital Data Management Architect, you will design, implement, and optimize advanced data management systems that support processing billions of digital transactions, ensuring high availability and accuracy. You will leverage your expertise in developing identity graphs, real-time data processing, and API integration to drive insights and enhance user experiences across digital platforms. Your role is crucial in building scalable and secure data architectures that support real-time analytics, identity resolution, and seamless data flows across multiple systems and applications.

 

Roles and Responsibilities:

 

  1. Data Architecture & System Design:
  • Design and implement scalable data architectures capable of processing billions of digital transactions in real-time, ensuring low latency and high availability.
  • Architect data models, workflows, and storage solutions to enable seamless real-time data processing, including stream processing and event-driven architectures.
  2. Identity Graph Development:
  • Lead the development and maintenance of a comprehensive identity graph to unify disparate data sources, enabling accurate identity resolution across channels.
  • Develop algorithms and data matching techniques to enhance identity linking, while maintaining data accuracy and privacy.
  3. Real-Time Data Processing & Analytics:
  • Implement real-time data ingestion, processing, and analytics pipelines to support immediate data availability and actionable insights.
  • Work closely with engineering teams to integrate and optimize real-time data processing frameworks such as Apache Kafka, Apache Flink, or Spark Streaming.
  4. API Development & Integration:
  • Design and develop real-time APIs that facilitate data access and integration across internal and external platforms, focusing on security, scalability, and performance.
  • Collaborate with product and engineering teams to define API specifications, data contracts, and SLAs to meet business and user requirements.
  5. Data Governance & Security:
  • Establish data governance practices to maintain data quality, privacy, and compliance with regulatory standards across all digital transactions and identity graph data.
  • Ensure security protocols and access controls are embedded in all data workflows and API integrations to protect sensitive information.
  6. Collaboration & Stakeholder Engagement:
  • Partner with data engineering, analytics, and product teams to align data architecture with business requirements and strategic goals.
  • Provide technical guidance and mentorship to junior architects and data engineers, promoting best practices and continuous learning.

 

 

Qualifications:

 

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • 10+ years of experience in data architecture, digital data management, or a related field, with a proven track record in managing billion+ transactions.
  • Deep experience with identity resolution techniques and building identity graphs.
  • Strong proficiency in real-time data processing technologies (e.g., Kafka, Flink, Spark) and API development (RESTful and/or GraphQL).
  • In-depth knowledge of database systems (SQL, NoSQL), data warehousing solutions, and cloud-based platforms (AWS, Azure, or GCP).
  • Familiarity with data privacy regulations (e.g., GDPR, CCPA) and data governance best practices.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Data Axle
Posted by Eman Khan
Pune
6 - 9 yrs
Best in industry
Python
Django
Flask
React.js
GraphQL
+2 more

General Summary:

The Senior Software Engineer will be responsible for designing, developing, testing, and maintaining full-stack solutions. This role involves hands-on coding (80% of time), performing peer code reviews, handling pull requests and engaging in architectural discussions with stakeholders. You'll contribute to the development of large-scale, data-driven SaaS solutions using best practices like TDD, DRY, KISS, YAGNI, and SOLID principles. The ideal candidate is an experienced full-stack developer who thrives in a fast-paced, Agile environment.


Essential Job Functions:

  • Design, develop, and maintain scalable applications using Python and Django.
  • Build responsive and dynamic user interfaces using React and TypeScript.
  • Implement and integrate GraphQL APIs for efficient data querying and real-time updates.
  • Apply design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure maintainable and scalable code.
  • Develop and manage RESTful APIs for seamless integration with third-party services.
  • Design, optimize, and maintain SQL databases like PostgreSQL, MySQL, and MSSQL.
  • Use version control systems (primarily Git) and follow collaborative workflows.
  • Work within Agile methodologies such as Scrum or Kanban, participating in daily stand-ups, sprint planning, and retrospectives.
  • Write and maintain unit tests, integration tests, and end-to-end tests, following Test-Driven Development (TDD).
  • Collaborate with cross-functional teams, including Product Managers, DevOps, and UI/UX Designers, to deliver high-quality products.


Essential functions are the basic job duties that an employee must be able to perform, with or without reasonable accommodation. The function is considered essential if the reason the position exists is to perform that function.


Supportive Job Functions:

  • Remain knowledgeable of new emerging technologies and their impact on internal systems.
  • Available to work on call when needed.
  • Perform other miscellaneous duties as assigned by management.


These tasks do not meet the Americans with Disabilities Act definition of essential job functions and usually equal 5% or less of time spent. However, these tasks still constitute important performance aspects of the job.


Skills

  • Strong proficiency in Python and Django, with a solid understanding of Object-Oriented Programming (OOP) principles.
  • Expertise in JavaScript, TypeScript, and React, along with hands-on experience in GraphQL for efficient data querying.
  • Well-versed in applying design patterns such as Factory, Singleton, Observer, Strategy, and Repository to ensure scalable and maintainable code architecture.
  • Proficiency in building and integrating REST APIs, as well as experience working with SQL databases like PostgreSQL, MySQL, and MSSQL.
  • Familiarity with version control systems (especially Git) and experience working within Agile methodologies like Scrum or Kanban.
  • Strong grasp of Test-Driven Development (TDD) principles.
  • Good to have: experience with Next.js for server-side rendering and static site generation, and knowledge of cloud infrastructure such as AWS or GCP.
  • Familiarity with NoSQL databases, CI/CD pipelines using tools like GitHub Actions or Jenkins, and containerization technologies like Docker and Kubernetes is highly desirable.
  • Experience with microservices architecture and event-driven systems (using tools like Kafka or RabbitMQ) is a plus, along with knowledge of caching technologies such as Redis or Memcached.
  • Understanding of OAuth 2.0, JWT, and SSO authentication mechanisms, and adherence to API security best practices following OWASP guidelines, is beneficial.
  • Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation, and familiarity with performance monitoring tools such as New Relic or Datadog, will be considered an advantage.


Abilities:

  • Ability to organize, prioritize, and handle multiple assignments on a daily basis.
  • Strong and effective interpersonal and communication skills.
  • Ability to interact professionally with a diverse group of clients and staff.
  • Must be able to work flexible hours on-site and remote.
  • Must be able to coordinate with other staff and provide technological leadership.
  • Ability to work in a complex, dynamic team environment with minimal supervision.
  • Must possess good organizational skills.


Education, Experience, and Certification:

  • Associate or bachelor’s degree preferred (Computer Science, Engineering, etc.), but equivalent work experience in a technology-related area may substitute.
  • 2+ years of relevant experience required.
  • Experience using version control daily in a developer environment.
  • Experience with Python, JavaScript, and React is required.
  • Experience using rapid development frameworks like Django or Flask.
  • Experience using front end build tools.


Scope of Job:

  1. No direct reports.
  2. No supervisory responsibility.
  3. Consistent work week with minimal travel.
  4. Errors may be serious, costly, and difficult to discover.
  5. Contact with others inside and outside the company is regular and frequent.
  6. Some access to confidential data.

Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.

Similar companies

HighLevel Inc.

https://gohighlevel.com
Founded: 2018
Type: Product
Size: 100-500
Stage: Profitable

About the company

We are the fastest-growing all-in-one platform for SMBs and digital marketing agencies. We offer services related to CRM, Email, 2-way SMS, phone systems, Facebook, Instagram, WhatsApp, Email marketing, Social media posting, Websites, Funnel Builder, WordPress hosting & more!


We have a very strong and independent team. We value tinkerers and people with an entrepreneurial spirit. We want people to come to work and explore their curiosity every day. Our growth offers a unique opportunity for the right individual to scale and build world class products.


About HighLevel:  

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have 1000+ employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We work at scale; our infrastructure handles 3 billion+ API hits and 2 billion+ message events monthly, and over 25M views of customer pages daily. We also handle over 80 terabytes of data across 5 databases.


About the Team:

Currently we have millions of sales funnels, websites, attributions, forms, and survey tools for lead generation. Our B2B customers use these tools to bring leads into the HighLevel CRM system. We are working to continuously improve the functionality of these tools to solve our customers’ business needs. In this role, you will be expected to be autonomous, guide other developers who might need technical help, and collaborate with other technical teams, product, support, and customer success.


Some of the perks we offer:

  • 100% remote
  • Uncapped leave policy
  • WFH setup
  • Champion big problems

Jobs

15

amIT global solutions

https://www.amitglobal.com/
Founded: 2011
Type: Services
Size:
Stage: Profitable

About the company

amIT Global Solutions (AGS) is an Information Technology solutions and services company that provides a wide range of services, including Professional Services, Managed Services, and Business Process Outsourcing. We have a highly qualified team of vibrant experts across a wide range of technologies and solutions in all verticals to help our customers.

Jobs

1

QAgile Services

https://qagile.co.in/
Founded: 2021
Type: Products & Services
Size: 0-10
Stage: Profitable

About the company


At QAgile Services, we are dedicated to empowering businesses through cutting-edge IT solutions and unparalleled staffing services. Our mission is to bridge the gap between technology and talent, ensuring that your organization has the resources it needs to thrive in a rapidly evolving digital landscape.



Jobs

10

Moshi Moshi

http://www.moshimoshi.in
Founded: 2014
Type: Services
Size: 100-1000
Stage: Profitable

About the company

Founded in 2014 by two passionate individuals during their second year at Christ College, Bangalore, Moshi Moshi is a young, creative, and committed communication company that encourages clients to always "Expect the EXTRA." 


Our diverse team of 160+ people includes art directors, cinematographers, content and copy writers, marketers, developers, coders, and our beloved puppy, Momo. We offer a wide range of services, including strategy, brand design, communications, packaging, film and TVCs, PR, and more. At Moshi Moshi, we believe in creating experiences rather than just running a company.


We are amongst the fastest-growing agencies in the country with a very strong value system.

Below are five of the nine principles we believe in strongly.

  1. Communicate Clearly: Prioritize clear and open dialogue.
  2. Doing things morally right: Uphold integrity in all endeavors.
  3. Dream it, do it: Always embrace optimism and a can-do attitude.
  4. Add logic to your life: Ensure that rationality guides our actions.
  5. Be that fool: Fearlessly challenge the impossible.


Come find yourself at Moshi Moshi.

Jobs

29

NeoGenCode Technologies Pvt Ltd

http://www.neogencode.com
Founded: 2023
Type: Services
Size: 0-10
Stage: Raised funding

About the company

Welcome to Neogencode Technologies, an IT services and consulting firm that provides innovative solutions to help businesses achieve their goals. Our team of experienced professionals is committed to providing tailored services to meet the specific needs of each client.

Our comprehensive range of services includes software development, web design and development, mobile app development, cloud computing, cybersecurity, digital marketing, and skilled resource acquisition. We specialize in helping our clients find the right skilled resources to meet their unique business needs.

At Neogencode Technologies, we prioritize communication and collaboration with our clients, striving to understand their unique challenges and provide customized solutions that exceed their expectations. We value long-term partnerships with our clients and are committed to delivering exceptional service at every stage of the engagement.

Whether you are a small business looking to improve your processes or a large enterprise seeking to stay ahead of the competition, Neogencode Technologies has the expertise and experience to help you succeed. Contact us today to learn more about how we can support your business growth and provide skilled resources to meet your business needs.

Jobs

104

Moative

https://www.moative.com/
Founded: 2024
Type: Products & Services
Size: 0-20
Stage: Bootstrapped

About the company

Are you ready to be at the forefront of the AI revolution? Moative is your gateway to reshaping industries through cutting-edge Applied AI Services and innovative Venture Labs.


Moative is an AI company that focuses on automating tasks, compressing workflows, predicting demand, pricing intelligently, and delighting customers. We design AI roadmaps, build co-pilots, and create predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries.


🚀 What We Do

At Moative, we're not just using AI – we're redefining its potential. Our mission is to empower businesses in energy, utilities, packaging, commerce, and other primary industries with AI solutions that drive unprecedented productivity and growth.


🔬 Our Expertise:

  • Design tailored AI roadmaps
  • Build intelligent co-pilots for specialists
  • Develop predictive AI solutions
  • Launch AI micro-products through Moative Labs


💡 Why Moative?

  1. Innovation at Core: We're constantly pushing the boundaries of AI technology.
  2. Industry Impact: Our solutions directly influence the cost of goods sold, helping clients surpass industry profit margins.
  3. Customized Approach: We fine-tune fundamental AI models to create unique, intelligent systems for each client.
  4. Continuous Learning: Our systems evolve and improve, ensuring long-term value.


🧑‍🦰Founding Team

Shrikanth and Ashwin, IIT-Madras alumni, have been riding technology waves since the dot-com era. Our latest venture, PipeCandy (data and predictions on 12 million eCommerce sellers), was acquired in 2021. We have built analytical models for industrial giants, advised enterprise AI platforms on competitive positioning, and built a 70-member AI team for our portfolio companies since 2023.

Jobs

3

Gruve

https://www.gruve.ai/
Founded: 2024
Type: Services
Size: 100-1000
Stage: Bootstrapped

About the company

Gruve was founded on the premise that new technologies in Machine Learning, Data Sciences, Artificial Intelligence, and Software Development are transforming Enterprise Services. Our goal is to harness these advancements to deliver services with superior efficiency and tangible outcomes.


Our Team

Our team is built with a strong background in Software and Services, united by a shared sense of Purpose: to achieve the best outcomes for our clients. We value all our stakeholders, recognizing that People are our most important assets. We adopt a Process framework that ensures the delivery of high-quality results every time.


What Sets Us Apart

Our differentiation is straightforward: we genuinely care, we innovate, we disrupt, and we work hard.


Our Core Values:

Customer Success: Putting customers first.

Positive Feedback Loop: Embracing continuous improvement.

Pursuit & Persevere: Staying resilient and ambitious.

Integrity and Ethics: Acting with honesty and ethics.

Team & Trust: Collaborating with trust and respect.

Giving Back: Committing to community and responsibility.


Gruve is Norwegian for "To Mine or Mining Activity"

Jobs

15


Enan Tech Private Limited

https://wowpe.in/homepage
Founded: 2023
Type: Products & Services
Size: 20-100
Stage: Bootstrapped

About the company

Jobs

4

Aatish Management Consultants OPC Pvt Ltd

https://www.aatishmanagement.in/
Founded: 2012
Type: Services
Size: 0-20
Stage: Bootstrapped

About the company

Jobs

16

Want to work at Data Axle?
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs