11+ Data marts Jobs in India
JOB DETAILS:
* Job Title: Associate III - Data Engineering
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description
Job Title:
Data Services Engineer – AWS & Snowflake
Job Summary:
As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.
You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.
Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.
Responsibilities:
• Design and implement scalable, secure data pipelines on AWS and Snowflake (star/snowflake schema).
• Optimize query performance using clustering keys, materialized views, and caching.
• Develop and maintain Snowflake data warehouses and data marts.
• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks).
• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica).
• Utilize Snowpark and Python/Java for complex transformations.
• Implement RBAC, data masking, and row-level security.
• Optimize data storage and retrieval for performance and cost-efficiency.
• Collaborate with stakeholders to gather data requirements and deliver solutions.
• Ensure data quality, governance, and compliance with industry standards.
• Monitor, troubleshoot, and resolve data pipeline and performance issues.
• Document data architecture, processes, and best practices.
• Support data migration and integration from various sources.
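At the heart of the Streams/Tasks-style ELT workflow mentioned above is incremental change capture followed by a merge into the target table. A minimal, illustrative pure-Python sketch of that merge logic (not actual Snowflake code; row shapes and action names are assumptions for illustration):

```python
# Illustrative sketch of the incremental merge step behind a
# Streams/Tasks-style ELT workflow. Pure-Python stand-in, not Snowflake SQL.

def merge_changes(target: dict, changes: list) -> dict:
    """Apply a micro-batch of captured changes (like a Snowflake Stream)
    to a target table. Each change carries an action: INSERT, UPDATE, or DELETE."""
    for change in changes:
        row_id = change["id"]
        if change["action"] == "DELETE":
            target.pop(row_id, None)      # remove deleted rows
        else:
            target[row_id] = change["data"]  # INSERT and UPDATE both upsert
    return target

# Example: one micro-batch applied to a small target table.
table = {1: {"name": "alice"}, 2: {"name": "bob"}}
batch = [
    {"id": 2, "action": "UPDATE", "data": {"name": "bobby"}},
    {"id": 3, "action": "INSERT", "data": {"name": "carol"}},
    {"id": 1, "action": "DELETE", "data": None},
]
table = merge_changes(table, batch)
```

In Snowflake itself this pattern is expressed as a Stream feeding a `MERGE` statement scheduled by a Task; the sketch only shows the control flow.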
Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 3 to 4 years of hands-on experience in data engineering or data services.
• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).
• Strong expertise in Snowflake architecture, development, and optimization.
• Proficiency in SQL and Python for data manipulation and scripting.
• Solid understanding of ETL/ELT processes and data modeling.
• Experience with data integration tools and orchestration frameworks.
• Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
• AWS Glue, AWS Lambda, Amazon Redshift
• Snowflake Data Warehouse
• SQL & Python
Skills: AWS Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse
Must-Haves
AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)
Skills: AWS, AWS Lambda, Snowflake, Data Engineering, Snowpipe, data integration tools, orchestration frameworks
Relevant experience: 4-6 years
Python is mandatory.
******
Notice period - 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F Interview 7th Feb
Strong AI & Full-Stack Tech Lead
Mandatory (Experience 1): Must have 5+ years of experience in full-stack development, including Python for backend development and React/JavaScript for frontend, along with API/microservice integration.
Mandatory (Experience 2): Must have 2+ years of experience in leading technical teams, coordinating engineers, and acting as a system integrator across distributed teams.
Mandatory (Experience 3): Must have 1+ year of hands-on experience in AI projects, including LLMs, Transformers, LangChain, or OpenAI/Azure AI frameworks.
Mandatory (Tech Skills 1): Must have experience in designing and implementing AI workflows, including RAG pipelines, vector databases, and prompt orchestration.
Mandatory (Tech Skills 2): Must ensure backend and AI system scalability, reliability, observability, and security best practices.
Mandatory (Company): Must have experience working in B2B SaaS companies delivering AI, automation, or enterprise productivity solutions
Tech Skills (Familiarity): Should be familiar with integrating AI systems with enterprise platforms (SharePoint, Teams, Databricks, Azure) and enterprise SaaS environments
Mandatory (Note): Both founders are based in Australia; the design (2) and developer (4) teams are in India. Indian shift timings apply.
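The RAG pipeline requirement above reduces to three steps: embed documents, retrieve the nearest by similarity, and assemble a prompt from the retrieved context. A minimal sketch using a toy bag-of-words embedding and cosine similarity (illustrative only; a real pipeline would use a trained embedding model and a vector database):

```python
# Minimal RAG retrieval sketch. A toy word-count "embedding" and cosine
# similarity stand in for a real embedding model and vector database.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: word-frequency vector over whitespace tokens.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "invoices are processed nightly by the billing service",
    "the search index is rebuilt every hour",
    "employees submit leave requests through the HR portal",
]
top = retrieve("how are invoices processed", docs)
prompt = f"Answer using this context: {top[0]}"
```

Prompt orchestration frameworks such as LangChain wrap exactly this retrieve-then-prompt loop, swapping in learned embeddings and a persistent vector store.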
3-4 years of WordPress experience
Good with frontend UI design
Good with AI tools
Skills:
WordPress, PHP-backend, css, mySql, Javascript, Responsive UI, SEO
Plus points:
- Experience with betting/prediction-type websites.
- SEO-related knowledge.
- Good knowledge of JavaScript.
💡 The Role
We’re hiring a Revenue Operations Specialist who will act as the backbone of our go-to-market engine.
This is a builder role — not a maintenance role. You’ll design systems, automate workflows, and implement AI-driven processes across Sales, BD, and Marketing.
🤖 What You’ll Do
• Build AI agents and automation workflows across the revenue stack
• Use tools like Zapier, Make, n8n, Clay, and LLM-based workflows
• Own CRM (HubSpot/Salesforce) including structure, hygiene, and reporting
• Build dashboards to track pipeline health and performance
• Manage integrations across CRM, enrichment, outreach, and analytics tools
• Design lead lifecycle processes from acquisition to conversion
• Ensure seamless coordination between Sales, BD, and Marketing
🧠 Who You Are
• 3–5 years in Revenue Ops / Sales Ops / GTM Ops
• Strong CRM experience (HubSpot or Salesforce)
• Hands-on or strong interest in AI workflows and automation
• Experience with tools like Zapier, Make, n8n, Clay, Apollo, Outreach
• Strong analytical and systems-thinking mindset
• Startup-ready: self-driven, fast, and comfortable with ambiguity
• Clear communicator who can document and simplify processes
✨ Nice to Have
• Experience with AI/LLM tools or APIs
• Background in B2B SaaS or startup environments
• Experience with lead scoring or enrichment tools
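Lead scoring, mentioned above, is often just a weighted rule set over lead attributes that the CRM evaluates on each update. A hypothetical minimal sketch (field names, thresholds, and weights are all invented for illustration; real scoring lives in the CRM or an enrichment tool):

```python
# Hypothetical lead-scoring sketch. Every field name and weight below is
# an invented example, not a real HubSpot/Salesforce scoring model.

def score_lead(lead: dict) -> int:
    score = 0
    if lead.get("company_size", 0) >= 200:
        score += 30                                 # larger accounts score higher
    if lead.get("title", "").lower() in {"vp", "director", "head"}:
        score += 25                                 # seniority signal
    if lead.get("visited_pricing_page"):
        score += 20                                 # strong intent signal
    score += min(lead.get("email_opens", 0) * 5, 25)  # capped engagement signal
    return score

lead = {"company_size": 500, "title": "VP",
        "visited_pricing_page": True, "email_opens": 10}
s = score_lead(lead)
```

In practice a RevOps specialist would encode rules like these as CRM workflow properties or an n8n/Zapier automation rather than standalone code; the sketch just makes the rule structure concrete.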
About Us:
MyOperator is a Business AI Operator and a category leader that unifies WhatsApp, Calls, and AI-powered chat & voice bots into one intelligent business communication platform.
Unlike fragmented communication tools, MyOperator combines automation, intelligence, and workflow integration to help businesses run WhatsApp campaigns, manage calls, deploy AI chatbots, and track performance, all from a single, no-code platform.
Trusted by 12,000+ brands including Amazon, Domino’s, Apollo, and Razorpay, MyOperator enables faster responses, higher resolution rates, and scalable customer engagement, without fragmented tools or increased headcount.
Role Overview:
We’re seeking a passionate Python Developer with strong experience in backend development and cloud infrastructure. This role involves building scalable microservices, integrating AI tools like LangChain/LLMs, and optimizing backend performance for high-growth B2B products.
Key Responsibilities:
- Develop robust backend services using Python, Django, and FastAPI
- Design and maintain a scalable microservices architecture
- Integrate LangChain/LLMs into AI-powered features
- Write clean, tested, and maintainable code with pytest
- Manage and optimize databases (MySQL/Postgres)
- Deploy and monitor services on AWS
- Collaborate across teams to define APIs, data flows, and system architecture
Must-Have Skills:
- Python and Django
- MySQL or Postgres
- Microservices architecture
- AWS (EC2, RDS, Lambda, etc.)
- Unit testing using pytest
- LangChain or Large Language Models (LLM)
- Strong grasp of Data Structures & Algorithms
- AI coding assistant tools (e.g., ChatGPT and Gemini)
Good to Have:
- MongoDB or ElasticSearch
- Go or PHP
- FastAPI
- React, Bootstrap (basic frontend support)
- ETL pipelines, Jenkins, Terraform
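The "unit testing using pytest" requirement above amounts to writing plain assert-based tests against small units of backend logic; pytest discovers functions named `test_*` and runs bare `assert` statements with no boilerplate. A minimal sketch (the `paginate` helper and test names are illustrative, not from any particular codebase):

```python
# Minimal pytest-style unit test sketch. pytest collects test_* functions
# and runs plain assert statements; no assertEqual boilerplate is needed.

def paginate(items: list, page: int, per_page: int) -> list:
    """Return one page of items (1-indexed pages), as a list endpoint might."""
    start = (page - 1) * per_page
    return items[start:start + per_page]

def test_paginate_returns_requested_page():
    assert paginate([1, 2, 3, 4, 5], page=2, per_page=2) == [3, 4]

def test_paginate_last_page_may_be_short():
    assert paginate([1, 2, 3, 4, 5], page=3, per_page=2) == [5]

# With pytest installed, run:  pytest this_file.py
# The calls below just demonstrate the tests pass when executed directly.
test_paginate_returns_requested_page()
test_paginate_last_page_may_be_short()
```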
Why Join Us?
- 100% Remote role with a collaborative team
- Work on AI-first, high-scale SaaS products
- Drive real impact in a fast-growing tech company
- Ownership and growth from day one
You will be involved throughout the product lifecycle, from idea generation, design, and prototyping to execution and shipping.
You'll collaborate closely with technical and non-technical counterparts to understand our customers' problems and build products that solve them.
Desired Skills :
● Total 1+ years of coding experience
● Experience with data structures and databases (SQL or NoSQL)
● Strong coder with proficiency in at least one programming language, such as Java, GoLang or NodeJS
● Ability to learn and work independently and make decisions with minimal supervision.
#HiringAlert
We are looking for a "Software Engineer" for a reputed client (permanent role).
• Experience: 2 - 4 yrs
Skills :
Primary Skills
• 2+ years of experience in Web API, C#.NET, OOPS, and Entity Framework
• Minimum 1+ years of experience in web application development: HTML, CSS, JavaScript/jQuery, Entity Framework, and LINQ queries
• Must have good exposure to query writing and DB management, including writing stored procedures and user-defined functions
Secondary Skills
• Experience in analysing existing code and debugging.
• Should have a solid understanding of SDLC processes (design, construction, testing, deployment).
• Proven experience of delivering on time and with quality.
• Should have good unit-testing skills to review their own work, identify defects, and fix them before releasing the code.
Desired Skills
• Should have experience in developing ERP applications or database-intensive data-entry applications.
• Experience in a similar role for a period of 2 years or more.
• Hands-on experience with configuration management and version maintenance.
• Prior experience of working in the shipping domain.
• Should be able to understand functional and technical specifications and develop the application as per the specifications provided.
• Full-stack developers with experience building products from the ground up with Angular and C# APIs will be a plus.
Location : Bangalore
Big Data Engineer/Data Engineer
What we are solving
Welcome to today’s business data world where:
• Unification of all customer data into one platform is a challenge
• Extraction is expensive
• Business users do not have the time/skill to write queries
• High dependency on tech team for written queries
These facts may look scary but there are solutions with real-time self-serve analytics:
• Fully automated data integration from any kind of data source into a universal schema
• Analytics database that streamlines data indexing, query and analysis into a single platform.
• Start generating value from Day 1 through deep dives, root cause analysis and micro segmentation
At Propellor.ai, this is what we do.
• We help our clients reduce effort and increase effectiveness quickly
• By clearly defining the scope of projects
• Using dependable, scalable, future-proof technology solutions like Big Data solutions and cloud platforms
• Engaging Data Scientists and Data Engineers to provide end-to-end solutions, leading to the industrialisation of data science model development and deployment
What we have achieved so far
Since we started in 2016,
• We have worked across 9 countries with 25+ global brands and 75+ projects
• We have 50+ clients, 100+ Data Sources and 20TB+ data processed daily
Work culture at Propellor.ai
We are a small, remote team that believes in
• Working with a few, but only the highest-quality, team members who want to become the very best in their fields.
• With each member's belief and faith in what we are solving, we collectively see the big picture.
• No hierarchy: anyone can reach the decision maker without hesitation, so our actions have fruitful and aligned outcomes.
• Each member is the CEO of their domain. So the criterion behind every choice we make is that our employees and clients succeed together!
To read more about us click here:
https://bit.ly/3idXzs0
About the role
We are building an exceptional team of Data Engineers: passionate developers who want to push the boundaries to solve complex business problems using the latest tech stack. As a Big Data Engineer, you will work with various technology and business teams to deliver our Data Engineering offerings to clients across the globe.
Role Description
• The role would involve big data pre-processing & reporting workflows including collecting, parsing, managing, analysing, and visualizing large sets of data to turn information into business insights
• Develop the software and systems needed for end-to-end execution on large projects
• Work across all phases of SDLC, and use Software Engineering principles to build scalable solutions
• Build the knowledge base required to deliver increasingly complex technology projects
• The role would also involve testing various machine learning models on Big Data and deploying learned models for ongoing scoring and prediction.
Education & Experience
• B.Tech. or equivalent degree in CS/CE/IT/ECE/EEE
• 3+ years of experience designing technological solutions to complex data problems, and developing and testing modular, reusable, efficient, and scalable code to implement those solutions
Must have (hands-on) experience
• Python and SQL expertise
• Distributed computing frameworks (Hadoop Ecosystem & Spark components)
• Must be proficient in a cloud computing platform (AWS/Azure/GCP); experience with GCP (BigQuery/Bigtable, Pub/Sub, Dataflow, App Engine), AWS, or Azure is preferred
• Linux environment, SQL, and shell scripting
Desirable
• Statistical or machine learning DSL like R
• Distributed and low latency (streaming) application architecture
• Row-store distributed DBMSs such as Cassandra, CouchDB, MongoDB, etc.
• Familiarity with API design
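The distributed computing requirement above (Hadoop Ecosystem and Spark) centres on the map, shuffle, reduce pattern. A single-process Python sketch of that pattern for word counting (illustrative only; the frameworks distribute each stage across a cluster):

```python
# Single-process sketch of the map -> shuffle -> reduce pattern that
# Hadoop and Spark distribute across a cluster. Illustrative only.
from collections import defaultdict

lines = ["big data big plans", "data pipelines move data"]

# Map: emit (word, 1) pairs from every input line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group values by key, as the framework does between stages.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: aggregate each key's grouped values into a final count.
counts = {word: sum(vals) for word, vals in groups.items()}
```

In Spark the same three stages would be `flatMap`, the implicit shuffle, and `reduceByKey`; the sketch just makes the data flow explicit.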
Hiring Process:
1. One phone screening round to gauge your interest and knowledge of fundamentals
2. An assignment to test your skills and your ability to come up with solutions within a set time
3. Interview 1 with our Data Engineer lead
4. Final Interview with our Data Engineer Lead and the Business Teams
Immediate joiners preferred
Artifex HR is looking to hire an HR recruiter to manage our recruitment cycle. The job involves identifying potential hires, evaluating and interviewing candidates, and conducting post-recruitment checks.
JOB RESPONSIBILITIES:
- Sourcing candidate CVs from various job portals by posting ads and following up
- Placing job advertisements
- Using the company’s database, references, networks, and teams
- Pre-screening activities before scheduling interviews
- Coordinating with potential candidates during subsequent rounds
- Making referral checks for new hires before they are placed with the company
- Finalizing salaries and sending out offer letters to selected candidates
- Ensuring that the candidates join and are given a date of joining
JOB REQUIREMENTS:
- 6 months to 2 years of work experience as an IT Recruiter or similar role
- Experience with IT recruitment is preferred
- Degree in Human Resources Management, Organizational Psychology or relevant field
- Experience with sourcing techniques and familiarity with handling job portals
- Excellent verbal and written communication skills
EXPERIENCE: 6 months to 2 years
SALARY: Up to 20,000 per month
● Prepare Job Descriptions
● Work on all open positions
● Learn Sourcing through Naukri
● Evaluate CVs for the role
● Conduct the telephonic interviews
● Conduct Candidate Assessments
● Coordinate Hiring Process for shortlisted candidates
● Coordinate the Offer to Joining process
● Arrange Campus Placement Drives
● Understand Staffing & Placement Agencies Onboarding