50+ SQL Jobs in Mumbai | SQL Job openings in Mumbai

Apply to 50+ SQL Jobs in Mumbai on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Wissen Technology

Posted by Pankhuri Shayad
Mumbai, Pune
6 - 10 yrs
Best in industry
React.js
.NET
C#
SQL

Position: Full-Stack Developer – React / C# / Python / SQL

Location: Mumbai / Pune

Experience: 6–8 Years

Employment Type: Full-time


About the Role

We are looking for a versatile Full-Stack Developer with working experience in React, C# or Python, and SQL. The candidate doesn’t need to be an expert in all of these technologies but should be comfortable taking end-to-end ownership of features with the support of modern AI tools.


Key Responsibilities

  • Develop, test, and maintain scalable frontend applications using React.
  • Build and integrate backend services using C# (.NET) or Python.
  • Write and optimize SQL queries, procedures, and data models.
  • Work closely with product and design teams to deliver high-quality features.
  • Use AI-assisted development tools (like GitHub Copilot / ChatGPT) to speed up coding, debugging, documentation, and solution design.
  • Participate in code reviews, troubleshooting, and performance improvements.
  • Ensure best practices in code quality, security, and deployment.


Required Skill Set

  • Frontend

React.js (Hooks, components, state management, API integration)


  • Backend (any one or both)

C# (.NET Core)

Python (FastAPI / Django / Flask)


  • Database

SQL (MySQL / PostgreSQL / MSSQL)

Experience writing queries, joins, stored procedures, and handling schemas
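To make the database requirement above concrete, here is a rough, hypothetical sketch of the kind of join-plus-aggregation query the role expects. Table names and data are invented for illustration, and SQLite (via Python's built-in sqlite3 module) stands in for MySQL/PostgreSQL/MSSQL:

```python
import sqlite3

# Hypothetical two-table schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# An inner join with aggregation: total order value per customer.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Asha', 350.0), ('Ravi', 75.0)]
```

The same join/aggregate pattern carries over to the listed engines; only dialect details (stored procedures, schema management) differ.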


Good to Have

  • REST API development
  • Basic DevOps understanding (CI/CD, version control – Git)
  • Familiarity with cloud platforms (AWS/Azure/GCP)
  • Ability to learn quickly with AI tools and follow best practices
  • Problem-solving and ownership mindset


What We Are Looking For

  • Someone who can handle full-stack tasks with confidence
  • Not necessary to be an expert in everything
  • Curious, adaptable, and open to using AI tools to deliver faster
  • Strong communication skills and team collaboration
Service Co

Agency job via Vikash Technologies by Rishika Teja
Bengaluru (Bangalore), Mumbai, Pune, Hyderabad, Chennai, Gurugram
5 - 7 yrs
₹10L - ₹15L / yr
Java
Spring Boot
REST API
Microservices
SQL
  • 5+ years of experience in Java backend development
  • Strong proficiency in Core Java (Java 8+)
  • Hands-on experience with multithreading, concurrency, and performance tuning
  • Strong understanding of data structures and algorithms
  • Experience with Spring Boot and REST API development
  • Experience in microservices architecture
  • Good understanding of SQL/NoSQL databases
  • Strong debugging and problem-solving skills
Service Co

Agency job via Vikash Technologies by Rishika Teja
Bengaluru (Bangalore), Mumbai, Pune, Hyderabad, Chennai, Gurugram
5 - 6 yrs
₹10L - ₹13L / yr
Python
Object Oriented Programming (OOPs)
RESTful APIs
SQL
  • Demonstrated experience building production-grade applications with an emphasis on scalability, maintainability, and performance
  • Strong expertise in concurrency and parallelism, including: 
  • Multithreading and multiprocessing
  • Synchronous and asynchronous programming (e.g., async/await)
  • Designing for throughput, latency, and safe shared-state handling
  • Proven experience integrating with external systems via application interfaces, including:
  • Building and consuming RESTful APIs
  • Authentication/authorization patterns (e.g., API keys, OAuth where applicable)
  • Reliable integration patterns (timeouts, retries, idempotency, error handling)
  • Strong SQL skills, including the ability to write efficient, complex queries (joins, aggregations, window functions) and optimize performance where needed. 
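The reliable-integration bullets above (timeouts, retries, idempotency, error handling) can be sketched in a few lines. This is an illustrative, stdlib-only example, not any particular company's implementation; the flaky() dependency is invented:

```python
import time

def call_with_retries(fn, *, attempts=3, base_delay=0.01):
    """Retry a callable with exponential backoff.

    Safe to apply only to idempotent operations (e.g. a GET, or a PUT
    keyed by a client-generated idempotency token), since the request
    may execute more than once.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except TimeoutError:
            if attempt == attempts - 1:
                raise                                    # retries exhausted: surface the error
            time.sleep(base_delay * (2 ** attempt))      # exponential backoff between attempts

# Hypothetical flaky dependency: times out twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("upstream timed out")
    return "ok"

print(call_with_retries(flaky))  # ok
```

In a real service the same shape is wrapped around HTTP calls with per-request timeouts, and the backoff is usually jittered to avoid thundering herds.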


Wissen Technology

Posted by Shrutika SaileshKumar
Bengaluru (Bangalore), Mumbai
4 - 8 yrs
Best in industry
Snowflake
Data Transformation Tool (DBT)
SQL
Snowflake schema
Python

Job Description

We are looking for a strong Data Engineer with hands-on experience building pipelines using Snowflake and DBT.

Key Responsibilities:

  • Develop, maintain, and optimize data pipelines using DBT and SQL on Snowflake DB.
  • Collaborate with data analysts, QA and business teams to build scalable data models.
  • Implement data transformations, testing, and documentation within the DBT framework.
  • Work on Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning.
  • Use Python (preferred) for automation, scripting, and additional data processing as needed.

Required Skills:

  • 6+ years of experience in building data engineering pipelines.
  • Strong hands-on expertise with DBT and advanced SQL.
  • Experience working with modern columnar/MPP data warehouses, preferably Snowflake.
  • Knowledge of Python for data manipulation and workflow automation (preferred).
  • Good understanding of data modeling concepts, ETL/ELT processes, and best practices.
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Chennai, Ahmedabad
4 - 6 yrs
₹8L - ₹15L / yr
ASP.NET
.NET Core
MVC
C#
SQL

Position: Microsoft .NET Full Stack Developer

Experience: 4–6 Years

Open Positions: 10

Location: PAN India (Final Round – Face-to-Face Interview)

Budget: Up to 15 LPA

Notice Period: Immediate joiners preferred

Key Responsibilities:

· Work on highly distributed and scalable system architecture

· Design, develop, test, and maintain high-quality software solutions

· Ensure performance, security, and maintainability of applications

· Collaborate with cross-functional teams and stakeholders

· Perform system testing and resolve technical issues


Required Skills:

· Strong experience in ASP.NET, C#, .NET Core, MVC

· Hands-on experience with SQL Server / PostgreSQL

· Experience in Angular / React (Frontend technologies)

· Knowledge of microservices architecture & RESTful APIs

· Familiarity with CQRS pattern

· Exposure to AWS / Docker / Kubernetes

· Experience with CI/CD pipelines (Azure DevOps, Jenkins)

· Knowledge of Node.js is an added advantage

· Understanding of Agile methodology

· Good exposure to cybersecurity and compliance


Technology Stack:

· Microsoft .NET technologies (primary)

· Cloud platforms: AWS (SaaS/PaaS/IaaS)

· Databases: MSSQL, MongoDB, PostgreSQL

· Caching: Redis, Memcached

· Messaging queues: RabbitMQ, Kafka, SQS

 

LearnTube.ai

Posted by Vinayak Sharan
Remote, Mumbai
3 - 6 yrs
₹14L - ₹32L / yr
Python
FastAPI
Docker
Amazon Web Services (AWS)
SQL

Role Overview:


As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.


Skip the wait and get noticed faster by completing our AI-powered screening. Click this link to start your quick interview; it only takes a few minutes and could be your shortcut to landing the job: https://bit.ly/LT_Python


What You'll Do:


At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:

  • Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95).
  • Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
  • Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
  • Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
  • Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
  • Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
  • Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
  • Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.


What makes you a great fit?


Must-Haves:

  • 3+ yrs Python back-end experience (FastAPI)
  • Strong with Docker & container orchestration
  • Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
  • SQL/NoSQL (Postgres, MongoDB); you’ve built systems from scratch and have solid system-design fundamentals

Nice-to-Haves

  • Kubernetes at scale, Terraform
  • Experience with AI/ML inference services (LLMs, vector DBs)
  • Go / Rust for high-perf services
  • Observability: Prometheus, Grafana, OpenTelemetry


About Us: 


At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.


Meet the Founders: 


LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? 


At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.
Deqode

Posted by Purvisha Bhavsar
Mumbai, Bengaluru (Bangalore)
4 - 6 yrs
₹3L - ₹11L / yr
.NET
ASP.NET
C#
Docker
Microservices

🚀 Hiring: .NET Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Mumbai and Bangalore

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)



We are looking for a skilled .NET Developer to design and develop scalable microservices and enterprise-grade applications. The role involves building secure REST APIs, writing clean and testable code, working with Docker-based deployments, and collaborating with cross-functional teams.


Key Responsibilities:

  • Develop .NET Core microservices
  • Build and secure REST APIs
  • Write unit & integration tests
  • Deploy applications using Docker
  • Ensure performance optimization and code quality


3 Mandatory Skills

  1. .NET Core / ASP.NET Core Web API
  2. Microservices & Docker
  3. REST API development with Unit Testing





Mumbai, Thane, Navi Mumbai
3 - 10 yrs
₹1L - ₹8L / yr
PLC
PLC SCADA
SCADA
HMI
Pharmaceutics

Position: Engineer – Senior Level

Location: Ghatkopar

Department: Automation / Programming


About the Opportunity

We are hiring Automation Engineers to work on end-to-end industrial automation projects in the pharma and food processing industries, involving PLC, HMI, and SCADA systems from design to commissioning.

Qualification

Degree or Diploma in:

  • Mechanical Engineering
  • Electronics Engineering
  • Instrumentation Engineering
  • Electrical Engineering

Required Skills & Competencies

  • Hands-on experience in PLC, HMI, and SCADA programming
  • Knowledge of industrial automation in pharma/process industries
  • Basic understanding of electrical & instrumentation wiring
  • Ability to read and interpret technical drawings and schematics
  • Experience in programming languages such as .NET, VB/VB.NET, SQL/T-SQL (preferred)
  • Familiarity with AutoCAD Electrical, EPLAN, or similar tools (added advantage)
  • Strong problem-solving and analytical skills
  • Good communication and interpersonal skills
  • Ability to work independently and within a team
  • Flexible to travel and work extended hours when required

Key Responsibilities

  • Program, test, and commission industrial control systems
  • Select appropriate PLC, HMI, and SCADA systems based on customer URS
  • Develop I/O lists as per P&ID and project requirements
  • Design and implement control logic for automation projects
  • Manage project timelines and ensure timely execution
  • Coordinate with project managers on scope changes and updates
  • Support FAT (Factory Acceptance Testing) and commissioning activities
  • Interpret electrical schematics, wiring diagrams, and P&ID drawings
  • Assist in troubleshooting electrical and instrumentation systems
  • Ensure smooth project execution through effective coordination

Arcis India

Posted by Sarita Jena
Mumbai
6 - 8 yrs
₹12L - ₹20L / yr
Java
Spring Boot
Quarkus
Microservices
Web services

  • 6+ years of hands-on development experience and in-depth knowledge of Java, Spring, Spring Boot, and Quarkus; front-end technologies like Angular or React JS are nice to have
  • Excellent engineering skills in designing and implementing scalable solutions
  • Good knowledge of CI/CD pipelines with a strong focus on TDD
  • Strong communication skills and ownership
  • Exposure to Cloud, Kubernetes, Docker, and Microservices is highly desired
  • Experience working on public cloud environments like AWS, Azure, and GCP w.r.t. solutions development, deployment, and adoption of cloud-based technology components like IaaS/PaaS offerings
  • Proficiency in PL/SQL and database development
  • Strong in J2EE & OOPS design patterns

Searce Inc

Posted by Srishti Dani
Mumbai, Pune, Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Data migration
Data warehousing
ETL
SQL
Google Cloud Platform (GCP)

Lead Data Engineer


What are we looking for

real solver?

Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.


Your Responsibilities

What you will wake up to solve.

  • Lead Technical Design & Data Architecture: Architect and lead the end-to-end development of scalable, cloud-native data platforms. You’ll guide the squad on critical architectural decisions—choosing between Batch vs. Streaming or ETL vs. ELT—while remaining 100% hands-on, contributing high-quality, production-grade code.
  • Build High-Velocity Data Pipelines: Drive the implementation of robust data transports and ingestion frameworks using Python, SQL, and Spark. You will build integration layers that connect heterogeneous sources (SaaS, RDBMS, NoSQL) into unified, high-availability environments like BigQuery, Snowflake, or Redshift.
  • Mentor & Elevate the Squad: Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling (Star/Snowflake schema), and ensure the squad adopts modern engineering standards like CI/CD for data.
  • Drive AI-Ready Data Strategy: Be the expert in designing data foundations optimized for AI and Machine Learning. You will champion the use of GCP (Dataflow, Pub/Sub, BigQuery) and AWS (Lambda, Glue, EMR) to create "clean room" environments that fuel advanced analytics and generative AI models.
  • Partner with Clients as a Technical DRI: Act as the Directly Responsible Individual for client success. Translate ambiguous business questions into elegant data services, manage project deliverables using Agile methodologies, and ensure that the data provided is accurate, consistent, and mission-critical.
  • Troubleshoot & Optimize for Scale: Own the reliability of the reporting layer. You will proactively monitor pipelines, troubleshoot complex transformation bottlenecks, and propose ways to improve platform performance and cost-efficiency.
  • Innovate and Build Reusable IP: Spearhead the creation of reusable data frameworks, custom operators, and transformation libraries that accelerate future projects and establish Searce’s unique technical advantage in the market.


Welcome to Searce


The AI-Native tech consultancy that's rewriting the rules.

Searce is an AI-native, engineering-led, modern tech consultancy that empowers clients to futurify their business by delivering intelligent, impactful, real business outcomes. Searce solvers co-innovate with clients as their trusted transformational partners ensuring sustained competitive advantage. Searce clients realize smarter, faster, better business outcomes delivered by AI-native Searce solver squads. 


Functional Skills 

the solver personas.

  • The Data Architect: This persona deconstructs ambiguous business goals into scalable, elegant data blueprints. They don't just move data; they design the foundation—from schema design to partitioning strategies—that allows data scientists and analysts to thrive, foreseeing technical bottlenecks and making pragmatic trade-offs.
  • The Player-Coach: As a hands-on leader, this persona leads from the front by writing exemplary, production-grade SQL and Python while simultaneously mentoring and elevating the skills of the squad. Their success is measured by the team's ability to deliver high-quality, maintainable code and their growth as engineers.
  • The Pragmatic Innovator: This individual balances a passion for modern data tech (like Generative AI and Real-time Streaming) with a sharp focus on business outcomes. They champion new tools where they add real value but are disciplined enough to choose stable, cost-effective solutions to meet deadlines and deliver robust products.
  • The Client-Facing Technologist: This persona acts as the crucial technical bridge between the data squad and the client. They build trust by listening actively, explaining complex data concepts (like data latency or idempotency) in simple terms, and demonstrating how engineering decisions align with the client’s strategic goals.
  • The Quality Craftsman: This individual possesses an unwavering commitment to data integrity and treats data engineering as a craft. They are the guardian of the reporting layer, advocating for robust testing, data validation frameworks, and clean, modular code to ensure the long-term reliability of the data platform.


Experience & Relevance 

  • Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.
  • Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.
  • AI-Native Workflow: You don’t just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.
  • Architectural Portfolio: Evidence of leading 2-3 large-scale transformations—including platform migrations, data lakehouse builds, or real-time analytics architectures.
  • Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO’s business vision into a Lead Engineer’s technical specification without losing anything in translation.


Join the ‘real solvers’

ready to futurify?

If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’. Don’t Just Send a Resume. Send a Statement.


WeAssemble

Posted by Meghal Majithia
Mumbai
3 - 6 yrs
₹5L - ₹8L / yr
Selenium
Playwright
SQL
Test Automation (QA)

We are looking for a highly skilled QA Automation Engineer with at least 3 years of experience to join our dynamic team in Mumbai. The ideal candidate should be proactive, detail-oriented, and ready to hit the ground running.


Company Name: WeAssemble

Website: www.weassemble.team

Location: One International Centre, Prabhadevi, Mumbai

Working Days: Monday – Friday (Sat & Sun fixed off)

*Key Responsibilities:*

* Design, develop, and execute automated test scripts using industry-standard tools and frameworks.

* Collaborate with developers, business analysts, and product managers to ensure product quality.

* Conduct functional, non-functional, API, and integration testing.

* Implement and maintain automation frameworks.

* Contribute to continuous improvement in QA processes.

*Required Skills & Experience:*

* Strong experience in Playwright with JavaScript.

* API Testing Automation (Postman, REST Assured, or equivalent).

* Hands-on experience with CI/CD pipelines (Jenkins, GitHub Actions, GitLab, or similar).

* Solid understanding of software QA methodologies, tools, and processes.

* Ability to identify, log, and track bugs effectively.

* Strong problem-solving and analytical skills.

*Good to Have:*

* Knowledge of performance testing tools.

* Familiarity with cloud platforms (AWS, Azure, or GCP).

Quantiphi

Posted by Nikita Sinha
Bengaluru (Bangalore), Mumbai, Trivandrum
4 - 7 yrs
Best in industry
Google Cloud Platform (GCP)
SQL
ETL
Data warehousing
Data-flow analysis

We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.


Key Responsibilities

  • Collaborate with business users and stakeholders to understand business processes and data requirements
  • Design and implement dimensional data models, including fact and dimension tables
  • Identify, design, and implement data transformation and cleansing logic
  • Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
  • Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
  • Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
  • Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
  • Provide high-level design, research, and effort estimates for data integration initiatives
  • Provide production support for ETL processes to ensure data availability and SLA adherence
  • Analyze and resolve data pipeline and performance issues
  • Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
  • Translate business requirements into well-defined technical data specifications
  • Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
  • Define and document BI usage through use cases, prototypes, testing, and deployment
  • Support and enhance data governance and data quality processes
  • Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
  • Train and support business users, IT analysts, and developers
  • Lead and collaborate with teams spread across multiple locations
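To make the fact/dimension modelling above concrete, here is a minimal, hypothetical star-schema roll-up. The tables and data are invented, and SQLite (via Python's sqlite3 module) stands in for an enterprise warehouse purely so the sketch is self-contained:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (sale_id INTEGER PRIMARY KEY,
                              product_key INTEGER REFERENCES dim_product(product_key),
                              qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'Widgets'), (2, 'Gadgets');
    INSERT INTO fact_sales VALUES (100, 1, 2, 20.0), (101, 1, 1, 10.0), (102, 2, 5, 50.0);
""")

# Typical warehouse query: roll the fact table up by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.qty) AS units, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Gadgets', 5, 50.0), ('Widgets', 3, 30.0)]
```

In practice the surrogate keys, slowly changing dimensions, and load logic live in the ETL/ELT layer; the query shape stays the same on BigQuery or SQL Server.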

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science or a related field, or equivalent work experience
  • 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
  • Strong expertise in data warehousing concepts, tools, and best practices
  • Excellent SQL skills
  • Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
  • Hands-on experience with Google Cloud Platform (GCP) services, including:
  1. BigQuery
  2. Cloud SQL
  3. Cloud Composer (Airflow)
  4. Dataflow
  5. Dataproc
  6. Cloud Functions
  7. Google Cloud Storage (GCS)
  • Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
  • Strong experience integrating data using APIs, XML, JSON, and similar formats
  • In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
  • Solid understanding of SDLC, Agile, and Scrum methodologies
  • Strong problem-solving, multitasking, and organizational skills
  • Experience handling large-scale datasets and database design
  • Strong verbal and written communication skills
  • Experience leading teams across multiple locations

Good to Have

  • Experience with SSRS and SSIS
  • Exposure to AWS and/or Azure cloud platforms
  • Experience working with enterprise BI and analytics tools

Why Join Us

  • Opportunity to work on large-scale, enterprise data platforms
  • Exposure to modern cloud-native data engineering technologies
  • Collaborative environment with strong stakeholder interaction
  • Career growth and leadership opportunities
Service Based Company in Mohali and Noida

Agency job via WITS Innovation Lab by Prabhnoor Kaur
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Mumbai
3 - 6 yrs
₹5L - ₹14L / yr
Spring Boot
Microservices
Java
RESTful APIs
SQL

  • 3+ years of hands-on experience developing and testing highly scalable software
  • Excellent coding skills in Java 17 or above
  • Very good understanding of any RDBMS and/or messaging queues
  • Proficient in Core Java; solid foundation in object-oriented development and design patterns
  • Excellent problem-solving skills and attention to detail
  • Ability to engineer complex features/systems from scratch and drive them to completion
  • Good knowledge of multiple data storage systems
  • Prior experience in microservices and event-driven architecture
  • Experience with Spring Boot and the Spring Security Framework
  • Spring WebFlux understanding is desirable
  • Understanding of OWASP Top 10/CWE, DAST, and SAST

Wissen Technology

Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4.5 - 8 yrs
Best in industry
Python
SQL
FastAPI
REST API
Artificial Intelligence (AI)

1️⃣ Generative AI System Design

  • Architect and implement end-to-end LLM-powered applications
  • Build scalable RAG pipelines (chunking, embeddings, hybrid search, reranking)
  • Design and implement agent-based workflows (tool calling, multi-step reasoning, orchestration)
  • Integrate LLM APIs such as OpenAI and Anthropic, along with open-source models
  • Implement structured output validation, grounding strategies, and hallucination mitigation
  • Optimize inference cost, latency, and token efficiency
  • Design evaluation pipelines for performance, accuracy, and safety

2️⃣ Backend & Microservices Engineering

  • Design scalable backend systems using Python
  • Build REST and async APIs using FastAPI / Django
  • Architect and implement microservices with clear service boundaries
  • Implement service-to-service communication (REST, gRPC, event-driven messaging)
  • Work with message brokers (Kafka / RabbitMQ)
  • Optimize database performance (PostgreSQL, MongoDB)
  • Implement caching strategies (Redis)
  • Build observability: logging, monitoring, distributed tracing

3️⃣ Cloud-Native Architecture & DevOps

  • Design and deploy containerized services using Docker
  • Orchestrate services using Kubernetes
  • Implement CI/CD pipelines
  • Ensure system scalability, resilience, and fault tolerance
  • Apply distributed systems principles:
  • Circuit breakers
  • API gateway patterns
  • Load balancing
  • Horizontal scaling
  • Saga patterns
  • Zero-downtime deployments
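The circuit-breaker principle listed above can be sketched minimally. This is an illustrative toy, not production code; the thresholds are arbitrary and real services typically reach for a hardened library instead:

```python
import time

class CircuitBreaker:
    """Minimal illustrative circuit breaker.

    Opens after `max_failures` consecutive errors and rejects calls
    until `reset_after` seconds pass, then allows a trial call.
    """
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None                      # half-open: allow one trial call
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()      # trip the breaker
            raise
        self.failures = 0                              # success closes the circuit
        return result

breaker = CircuitBreaker(max_failures=2, reset_after=60)
def failing():
    raise ConnectionError("downstream unavailable")

for _ in range(2):
    try:
        breaker.call(failing)
    except ConnectionError:
        pass
# Subsequent calls fail fast without touching the downstream service.
try:
    breaker.call(failing)
except RuntimeError as e:
    print(e)  # circuit open: failing fast
```

Failing fast protects callers from piling up requests against a dead dependency, which is why it pairs naturally with the load-balancing and API-gateway patterns above.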


Mumbai
2 - 5 yrs
₹8L - ₹10L / yr
PHP
SQL
API
Laravel
Information architecture

Software Engineer - Lending Platform

2 - 5 years Experience · Seed Stage · On-site preferred · Mumbai


What Neenv Is

Neenv is a fintech platform building channel finance infrastructure for MSME dealer networks in India. We sit between anchor companies and their dealer ecosystems, providing the credit technology layer while lending partners provide the capital.

The platform powers four supply chain finance products: Channel Financing, Working Capital Loans, Factoring, and Supplier Financing. The lending engine is configuration-driven. New products, rate changes, new anchors, new lenders -- config changes only.


What Problems Are We Solving

India runs on dealer networks. Hundreds of thousands of distributors, resellers, and stockists sit inside large corporate supply chains - buying from anchors, selling downstream, keeping markets liquid. These are creditworthy businesses. Their anchor relationships are essentially proof of cash flow. And yet they are chronically underfinanced.

Banks are too slow. Informal credit is expensive. The anchor relationship that makes a dealer viable is invisible to traditional lenders.

We are building the infrastructure to change that. A configuration-driven lending engine for channel finance - powering working capital credit to dealer networks at scale, with the anchor relationship as the underwriting signal.


Who You'll Be Working With

The founding team brings over 50 years of combined banking and channel finance experience. Founders with 25+ years each in client coverage, trade finance, risk management, and SCF sales across Standard Chartered and IDFC First Bank - having collectively managed over $1Bn in channel finance assets with sub-1% delinquency.

The CTO brings solid supply chain finance fintech experience with a product-first, AI-native approach to lending infrastructure.

You are not joining a first-time experiment. You are joining people who have spent careers building exactly what Neenv is now automating.


What Makes Your Role

We have a production lending infrastructure in place. It handles loan origination, repayment waterfalls, interest accrual, payment processing, ledger management, and multi-product configuration. You will own this platform end to end.

Understand the codebase end to end. Drive every config change, every extension, every integration. Be the person who can answer "can the system do X?" without waiting for anyone.

That is the first act.

The second act: we are building AI-native lending workflows. A credit decisioning agent that processes bureau reports, bank statements, GST data, and ITR. A collections agent that automates follow-up and escalation. Ops agents that handle accruals, month-end, lender reporting, and anomaly detection.

You will design this architecture from day one.


What Works Well Here

Someone who gets uncomfortable when they don't fully understand a system. Who reads error logs with curiosity. Who treats financial logic correctness as non-negotiable. Who can hold a product conversation and a technical conversation in the same breath.

If you have built something non-trivial and can explain every decision you made, that is the signal.


What You Need

  • PHP and Laravel -- solid working proficiency
  • Python -- working proficiency for AI agents, data processing, integrations
  • SQL and relational database design -- financial data where a paisa-level rounding error is a production bug
  • API design and third-party integration patterns -- REST, webhooks, handling flaky vendor APIs
  • LLM and agent workflows -- curiosity or working familiarity. Strong signal if you have built with Claude, GPT, or any agent framework
  • Fintech, NBFC, or any domain where data accuracy has real consequences
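The "paisa-level rounding error is a production bug" point above is why lending code uses `decimal.Decimal`, never binary floats, for money. A small sketch, with an illustrative rate and rounding mode (real rounding rules would come from the loan product configuration):

```python
# Paisa-safe interest accrual with Decimal; floats drift, Decimal does not.
from decimal import Decimal, ROUND_HALF_UP

PAISA = Decimal("0.01")

def daily_interest(principal: str, annual_rate: str, days: int) -> Decimal:
    """Accrue simple interest and round exactly once, at paisa precision."""
    p, r = Decimal(principal), Decimal(annual_rate)
    raw = p * r * days / Decimal(365)
    return raw.quantize(PAISA, rounding=ROUND_HALF_UP)

print(0.1 + 0.2)                               # 0.30000000000000004 (float drift)
print(Decimal("0.1") + Decimal("0.2"))         # 0.3 (exact)
print(daily_interest("100000", "0.18", 30))    # 1479.45
```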


What We Are Offering

Fixed salary, competitive for early-stage fintech in Mumbai. Direct founder access. Ownership over a production lending system and the AI layer being built on top. For the right fit, a clear path to owning the entire technical stack as we scale.

We cannot offer a large team, defined career ladders, or a 500-person safety net. We can offer a genuinely hard problem, speed, and the chance to build something that matters from nearly the beginning.


Read more
Mumbai
8 - 12 yrs
₹20L - ₹25L / yr
SQL
Scripting
Active Directory
RBAC
JSON
+8 more

Role Overview

We are looking for a Saviynt-focused IAM professional at an architecture/engineering level with deep expertise in Identity Governance and Administration (IGA). The candidate will drive end-to-end Saviynt solution design, implementation, and optimization, ensuring scalable, secure, and compliant identity ecosystems across enterprise environments.

Key Responsibilities

  • Saviynt Architecture & Platform Engineering:
  • Design and implement scalable Saviynt architecture, including tenant setup, data model design, and performance optimization
  • Develop and manage advanced rules, workflows, and business logic within Saviynt
  • Drive platform customization, plugin development, and REST/API-based integrations
  • IGA Solution Design:
  • Architect and implement end-to-end IGA solutions including Access Request System (ARS), SoD (Segregation of Duties), and Certification/Recertification frameworks
  • Define RBAC models, entitlement governance strategies, and lifecycle management processes
  • Identity Integration & Ecosystem:
  • Lead integrations with enterprise applications, directories, and cloud platforms using connectors, APIs, and event-driven mechanisms
  • Work closely with cross-functional teams to enable application onboarding and automated provisioning
  • AD / Azure AD / Multi-Tenant Expertise:
  • Architect identity models across Active Directory (AD) and Azure Active Directory (AAD) environments
  • Design group structures, OU strategies, and identity lifecycle flows
  • Leverage Multi-Tenant Organization (MTO) capabilities for cross-tenant identity governance
  • Governance, Risk & Compliance:
  • Implement and optimize SoD policies, access certifications, and audit controls
  • Ensure compliance with security standards and regulatory frameworks
  • Automation & Optimization:
  • Enhance self-service capabilities, workflow automation, and access request efficiencies
  • Continuously improve performance, scalability, and operational stability of the Saviynt platform
  • Code Quality & Delivery Excellence:
  • Maintain high-quality code standards, documentation, and deployment practices
  • Support production environments, troubleshoot issues, and ensure platform reliability

Required Skills & Experience

  • 8+ years of hands-on experience in Saviynt IGA implementation and engineering
  • Strong expertise in: Saviynt EIC platform architecture & configuration; ARS, SoD, Recertification, RBAC; REST APIs, JSON, SQL, and scripting
  • Deep understanding of: Active Directory (AD) & Azure AD (AAD); Identity lifecycle management & provisioning workflows
  • Experience in enterprise integrations and large-scale deployments
  • Exposure to Multi-Tenant Organization (MTO) is a strong plus

Good to Have

  • Experience with other IAM tools (e.g., SailPoint, Okta)
  • Knowledge of cloud platforms (Azure, AWS)
  • Understanding of security frameworks (ISO, SOX, GDPR)


Read more
AI Industry

AI Industry

Agency job
via Peak Hire Solutions by Dharati Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
6 - 10 yrs
₹32L - ₹42L / yr
ETL
SQL
Google Cloud Platform (GCP)
Data engineering
ELT
+17 more

Role & Responsibilities:

We are looking for a strong Data Engineer to join our growing team. The ideal candidate brings solid ETL fundamentals, hands-on pipeline experience, and cloud platform proficiency — with a preference for GCP / BigQuery expertise.


Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows
  • Work with Dataform or DBT to implement transformation logic and data models
  • Develop and optimize data solutions on GCP (BigQuery, GCS) or AWS/Azure
  • Support data migration initiatives and data mesh architecture patterns
  • Collaborate with analysts, scientists, and business stakeholders to deliver reliable data products
  • Apply data governance and quality best practices across the data lifecycle
  • Troubleshoot pipeline issues and drive proactive monitoring and resolution


Ideal Candidate:

  • Strong Data Engineer Profile
  • Must have 6+ years of hands-on experience in Data Engineering, with strong ownership of end-to-end data pipeline development.
  • Must have strong experience in ETL/ELT pipeline design, transformation logic, and data workflow orchestration.
  • Must have hands-on experience with any one of the following: Dataform, dbt, or BigQuery, with practical exposure to data transformation, modeling, or cloud data warehousing.
  • Must have working experience on any cloud platform: GCP (preferred), AWS, or Azure, including object storage (GCS, S3, ADLS).
  • Must have strong SQL skills with experience in writing complex queries and optimizing performance.
  • Must have programming experience in Python and/or SQL for data processing.
  • Must have experience in building and maintaining scalable data pipelines and troubleshooting data issues.
  • Exposure to data migration projects and/or data mesh architecture concepts.
  • Experience with Spark / PySpark or large-scale data processing frameworks.
  • Experience working in product-based companies or data-driven environments.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.


NOTE:

  • An interview drive is scheduled for 28th and 29th March 2026; shortlisted candidates are expected to be available on these dates. Only immediate joiners will be considered.
Read more
TalentXO
tabbasum shaikh
Posted by tabbasum shaikh
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
8 - 14 yrs
₹30L - ₹40L / yr
Data Modelling
SQL
Snowflake

Role & Responsibilities

The company drives large-scale data modernization and AI readiness for global enterprises. We are looking for an experienced Data Modeler to design, standardize, and maintain enterprise data models across our modernization initiatives, ensuring consistency, quality, and business alignment across cloud data platforms.

The person will be responsible for translating business requirements and data flows into robust conceptual, logical, and physical data models across multiple domains (Customer, Product, Finance, Supply Chain, etc.). You will work closely with Data Architects, Engineers, and Governance teams to ensure data is structured, traceable, and optimized for analytics and interoperability across platforms like Snowflake, Dremio, and Databricks.

Key Responsibilities-

  • Develop conceptual, logical, and physical data models aligned with enterprise architecture standards.
  • Engage with Business Stakeholders: Collaborate with business teams, business analysts and SMEs to understand business processes, data lifecycles, and key metrics that drive value and outcomes.
  • Value Chain Understanding: Analyze end-to-end customer and product value chains to identify critical data entities, relationships, and dependencies that should be represented in the data model.
  • Conceptual and Logical Modeling: Translate business concepts and data requirements into conceptual and logical data models that capture enterprise semantics and support analytical and operational needs.
  • Physical Data Modeling: Design and implement physical data models optimized for performance and scalability
  • Semantic Layer Design: Create semantic models that enable business access to data via BI tools and data discovery platforms.
  • Data Standards and Governance: Ensure models comply with enterprise data standards, naming conventions, lineage tracking, and governance practices.
  • Implement naming conventions, data standards, and metadata definitions across all models.
  • Collaboration with Data Engineering: Work closely with data engineers to align data pipelines with the logical and physical models, ensuring consistency and accuracy from ingestion to consumption.
  • Manage version control, lineage tracking, and change documentation for models.
  • Participate in data quality and governance initiatives to ensure trusted and consistent data definitions across domains.
  • Create and maintain a business glossary in collaboration with the governance team.

Ideal Candidate

  • Strong Enterprise Data Modeller profile (Modern Data Platforms)
  • Mandatory (Experience 1) – Must have 7+ years of experience in Data Modeling or Enterprise Data Architecture, with strong hands-on expertise in designing conceptual, logical, and physical data models for enterprise data platforms
  • Mandatory (Experience 2) – Must have strong hands-on experience with enterprise data modeling tools such as Erwin, ER/Studio, PowerDesigner, SQLDBM, or similar
  • Mandatory (Experience 3) – Must have a deep understanding of dimensional modeling (Kimball / Inmon methodologies), normalization techniques, and schema design for modern data warehouse environments.
  • Mandatory (Experience 4) – Proven experience designing data models for modern data platforms such as Snowflake, Databricks, Redshift, Dremio, or similar cloud data warehouse / lakehouse systems.
  • Mandatory (Experience 5) – Must have strong SQL expertise and schema design skills, with the ability to validate data model implementations and collaborate closely with data engineering teams
  • Mandatory (Education) – Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
  • Preferred (Experience 1) – Should have familiarity with data governance, metadata management, lineage, and business glossary tools such as Collibra, Alation, or Microsoft Purview.
  • Preferred (Experience 2) – Exposure to data integration pipelines and ETL frameworks such as Informatica, DBT, Airflow, or similar tools.
  • Preferred (Data Management) – Understanding of master data management (MDM) and reference data management principles.
  • Preferred (Domain) – Experience working with high-tech or manufacturing data domains, including customer, product, or supply chain data models


Read more
25 years old leading fintech company

25 years old leading fintech company

Agency job
via Apidel Technologies by Neha Dhoot
Mumbai
7 - 12 yrs
₹2L - ₹24L / yr
SQL


Full Stack Developer – React + Node.js + SQL + Team Lead + Excellent comms

Job Title: Full Stack Developer (React + Node.js + SQL)

Experience: 7+ years

Job Summary:

We are seeking a highly capable full stack developer proficient in React, Node.js, and SQL. The ideal candidate will be responsible for developing scalable web applications and APIs, integrating with relational databases, and delivering high-quality, maintainable code.

Key Responsibilities:

  • Design and develop front-end interfaces using React with Redux/Context API.
  • Build RESTful APIs and backend services using Node.js (Express.js).
  • Integrate and optimize SQL queries with PostgreSQL, MySQL, or MS SQL Server.
  • Ensure responsive design and cross-browser compatibility.
  • Collaborate with UI/UX designers, testers, and backend engineers.
  • Write unit and integration tests to ensure code quality and reliability.

Required Skills:

  • Strong hands-on experience with React.js and Node.js.
  • Proficiency in SQL and experience with relational database systems.
  • Knowledge of RESTful API design and microservices architecture.
  • Familiarity with version control systems (Git), CI/CD pipelines.
  • Strong understanding of HTML, CSS, and JavaScript (ES6+).

Good to Have:

  • Experience with GraphQL, TypeScript, or NoSQL databases.
  • Cloud deployment experience (AWS/Azure).
  • Containerization knowledge (Docker/Kubernetes).
  • Agile/Scrum project methodology experience.


Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Mumbai, Bengaluru (Bangalore)
4 - 6 yrs
₹4.5L - ₹12L / yr
skill icon.NET
SQL
skill iconC#
skill iconHTML/CSS
skill iconDocker
+1 more

🚀 Hiring: .NET Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Mumbai and Bangalore

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


We are looking for a skilled .NET Developer to join our growing team. The ideal candidate should have strong experience in developing, testing, and maintaining applications using the .NET framework.


🎗️ Key Responsibilities:

✅ Develop and maintain web applications using .NET / .NET Core

✅ Write clean, scalable, and efficient code

✅ Troubleshoot, debug, and upgrade existing applications

✅ Work with databases and APIs for application integration


💫 Requirements:

✅ Experience with C#, ASP.NET, .NET Core

✅ Knowledge of SQL Server

✅ Familiarity with REST APIs

✅ Understanding of HTML, CSS, JavaScript

✅ Strong problem-solving and communication skills

Read more
Recruiting Bond

at Recruiting Bond

2 candid answers
Pavan Kumar
Posted by Pavan Kumar
Mumbai, Navi Mumbai
10 - 15 yrs
₹55L - ₹80L / yr
Distributed Systems
Systems design
Systems architecture
High-level design
LLD
+77 more

Location: Mumbai, Maharashtra, India

Sector: Technology, Information & Media

Company Size: 500 - 1,000 Employees

Employment: Full-Time, Permanent

Experience: 10 - 14 Years (Engineering Leadership)

Level: Engineering Manager / Group EM


ABOUT THIS MANDATE :


Recruiting Bond has been exclusively retained by one of India's most prominent and well-established digital platform organisations operating at the intersection of Technology, Information, and Media to identify and place an exceptional Engineering Manager who can lead engineering teams through an enterprise-wide AI adoption and digital transformation agenda.


This is a high-impact, hands-on leadership role at the nexus of people, product, and technology. The organisation is executing one of the most ambitious AI transformation programmes in its sector and this Engineering Manager will be a core driver of that change. You will lead multiple squads, own engineering delivery end-to-end, embed AI tooling and practices into the team's DNA, and shape the engineering culture of tomorrow.


We are seeking leaders who code when it matters, who build systems and teams with equal conviction, and who view AI not as a trend but as a fundamental shift in how great software is built.


THE OPPORTUNITY AT A GLANCE :


AI-First Engineering Culture :

  • Own AI adoption across your squads - from LLM tooling integration to automation-first delivery workflows. Make AI a default, not an afterthought.


Hands-On Engineering Leadership :

  • Stay close to the code. Lead architecture reviews, unblock engineers, and set the technical bar - not just the management agenda.


People & Org Builder :

  • Grow engineers into leaders. Build squads of 6-15 across functions. Drive hiring, career frameworks, and a culture of psychological safety.


KEY RESPONSIBILITIES :


1. Hands-On Technical Engagement :

  • Remain deeply embedded in the technical work: participate in design reviews, architecture decisions, and critical code reviews
  • Set and uphold the engineering quality bar: performance benchmarks, security standards, test coverage, and release quality
  • Provide technical direction on backend platform strategy, API design, service decomposition, and data architecture
  • Identify and resolve systemic technical debt and architectural risks across team-owned services
  • Unblock engineers by diving into complex problems: debugging, pair programming, and system analysis when it matters
  • Own key technical decisions in collaboration with Tech Leads and Principal Engineers; balance pragmatism with long-term sustainability


2. AI Adoption, Integration & Transformation (2026 Mandate) :

  • Define and execute the team's AI adoption roadmap - from developer tooling to product-facing AI features
  • Champion the integration of GenAI tools (GitHub Copilot, Cursor, Claude, ChatGPT) across the full engineering workflow: coding, testing, documentation, incident response
  • Embed LLM-powered capabilities into the product : recommendation engines, intelligent search, conversational interfaces, content generation, and predictive systems
  • Lead evaluation and adoption of AI-assisted SDLC practices : automated code review, AI-generated test suites, intelligent observability, and anomaly detection
  • Partner with Data Science and ML Platform teams to productionise ML models with robust MLOps pipelines
  • Build team literacy in prompt engineering, RAG (Retrieval-Augmented Generation), and AI agent frameworks
  • Create an experimentation culture : run structured AI pilots, measure productivity impact, and scale what works
  • Stay ahead of the AI tooling landscape and advise senior leadership on strategic AI investments and engineering implications
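The RAG literacy item above boils down to a retrieval step plus prompt assembly. A deliberately naive sketch: score chunks by term overlap with the query and prepend the best one. Real systems use embeddings and a vector DB; the chunks here are invented.

```python
# Naive RAG retrieval sketch: term-overlap scoring instead of embeddings.

CHUNKS = [
    "Refunds are processed within 5 business days.",
    "Deployment frequency is tracked per squad.",
    "Password resets require manager approval.",
]

def tokens(text: str) -> set:
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(query: str, chunks=CHUNKS) -> str:
    """Return the chunk sharing the most terms with the query."""
    return max(chunks, key=lambda c: len(tokens(c) & tokens(query)))

def build_prompt(query: str) -> str:
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}"

print(build_prompt("How fast are refunds processed?"))
```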


3. People Leadership & Team Development :

  • Lead, manage, and grow squads of 6 - 15 engineers across seniority levels (L2 through L6 / Junior through Staff)
  • Conduct structured 1:1s, career growth conversations, and development planning with every direct report
  • Design and execute personalised AI upskilling programmes; ensure every engineer develops practical AI fluency by end of 2026
  • Build and maintain a high-performance team culture: clarity of ownership, accountability, fast feedback loops, and psychological safety
  • Drive performance management fairly and rigorously: recognise top performers, manage underperformance constructively
  • Lead technical hiring end-to-end : define job requirements, conduct bar-raising interviews, and make data-driven hire decisions
  • Contribute to engineering career frameworks and level definitions in partnership with the VP / Director of Engineering


4. Engineering Delivery & Execution Excellence :

  • Own end-to-end delivery for multiple product squads, from planning and scoping through production release and post-launch stability
  • Implement and refine agile delivery frameworks (Scrum, Kanban, Shape Up) calibrated to squad needs and product cadence
  • Drive predictable delivery : maintain healthy sprint velocity, manage WIP limits, and ensure dependency resolution across teams.
  • Establish and own engineering KPIs : DORA metrics (deployment frequency, lead time, MTTR, change failure rate), uptime SLOs, and velocity trends
  • Lead incident management : build blameless post-mortem culture, own RCA processes, and drive systemic reliability improvements
  • Balance technical debt repayment with feature velocity; negotiate prioritisation transparently with Product leadership
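The DORA metrics named in the delivery responsibilities above are simple to compute once deploys are logged. A back-of-envelope sketch; the record shape and numbers are invented for illustration.

```python
# Two DORA metrics from a toy deploy log: deployment frequency and
# change failure rate (lead time and MTTR would need more fields).
from datetime import date

DEPLOYS = [
    {"day": date(2026, 3, 2), "failed": False},
    {"day": date(2026, 3, 3), "failed": True},
    {"day": date(2026, 3, 5), "failed": False},
    {"day": date(2026, 3, 9), "failed": False},
]

def deployment_frequency_per_week(deploys, weeks: int) -> float:
    return len(deploys) / weeks

def change_failure_rate(deploys) -> float:
    return sum(d["failed"] for d in deploys) / len(deploys)

print(deployment_frequency_per_week(DEPLOYS, weeks=2))  # 2.0 deploys/week
print(change_failure_rate(DEPLOYS))                     # 0.25
```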


5. Strategic Leadership & Cross-Functional Influence :

  • Serve as the primary engineering partner for Product, Design, Data, and Business stakeholders; translate ambiguity into executable engineering plans
  • Participate in quarterly roadmap planning, capacity forecasting, and OKR definition for engineering teams
  • Represent engineering in leadership forums; articulate technical constraints, risks, and opportunities in business terms
  • Contribute to org-wide engineering strategy : platform investments, build-vs-buy decisions, and shared infrastructure priorities
  • Build relationships across geographies (Mumbai HQ + distributed teams) to maintain alignment and delivery cohesion
  • Act as a culture carrier and ambassador for engineering excellence, innovation, and responsible AI use


AI TRANSFORMATION LEADERSHIP 2026 EXPECTATIONS :


In 2026, Engineering Managers at this organisation are expected to be active architects of AI transformation, not passive observers. The following outlines the specific AI leadership expectations for this role:


AI Developer Productivity

  • Drive measurable uplift in developer velocity through AI tooling adoption. Target: 30%+ reduction in code review cycle time and 40%+ increase in test coverage automation by Q3 2026.


LLM & GenAI Product Features

  • Own delivery of GenAI-powered product capabilities : intelligent content, semantic search, personalisation, and conversational UX in production, at scale.


AI-Augmented Observability

  • Implement AI-driven monitoring and anomaly detection pipelines. Reduce MTTR by leveraging predictive alerting, intelligent runbooks, and auto-remediation scripts.


Team AI Fluency :

  • Build mandatory AI literacy across all engineering levels.
  • Every engineer understands prompt engineering basics, AI ethics guardrails, and responsible AI deployment practices.


Responsible AI Governance :

  • Partner with Security, Legal, and Data Privacy to ensure all AI deployments meet compliance standards, bias mitigation requirements, and explainability benchmarks.


TECHNOLOGY STACK & DOMAIN FAMILIARITY REQUIRED :


  • Languages: Java / Go / Python / Node.js / PHP / Rust (must be hands-on in at least 2)
  • Cloud: AWS / GCP / Azure (multi-cloud exposure strongly preferred)
  • AI & GenAI: OpenAI / Anthropic / Gemini APIs / LangChain / LlamaIndex / RAG / Vector DBs / GitHub Copilot / Cursor / Hugging Face
  • Containers: Docker / Kubernetes / Helm / Service Mesh (Istio / Linkerd)
  • Databases: PostgreSQL / MongoDB / Redis / Cassandra / Elasticsearch / Pinecone (Vector DB)
  • Messaging: Apache Kafka / RabbitMQ / AWS SQS/SNS / Google Pub/Sub
  • MLOps & DataOps: MLflow / Kubeflow / SageMaker / Vertex AI / Airflow / dbt
  • Observability: Datadog / Prometheus / Grafana / OpenTelemetry / Jaeger / ELK Stack
  • CI/CD & IaC: GitHub Actions / ArgoCD / Jenkins / Terraform / Ansible / Backstage (IDP)


QUALIFICATIONS & CANDIDATE PROFILE :

Education :

  • B.E. / B.Tech or M.E. / M.Tech from a Tier-I or Tier-II Institution - CS, IS, ECE, AI/ML streams strongly preferred
  • Demonstrated engineering depth and leadership impact may complement institution pedigree


Experience :

  • 10 to 14 years of progressive engineering experience, with at least 3 years in a formal Engineering Manager or equivalent people-leadership role
  • Proven track record of managing and scaling engineering teams (6-15+ engineers) in a fast-growing SaaS or digital product environment
  • Hands-on backend engineering background: must be able to read, write, and critique production code
  • Direct experience driving AI/ML feature delivery or AI tooling adoption within engineering organisations
  • Exposure across start-up, mid-size, and large-scale product organisations preferred; adaptability is a core requirement
  • Strong CS fundamentals: distributed systems, algorithms, system design, and software architecture
  • Demonstrated career stability: a minimum of 2 years of average tenure per organisation.


The Ideal Engineering Manager in 2026 :

  • Leads with context, not control; empowers engineers while maintaining accountability and quality
  • Is fluent in both people language and technical language; switches registers naturally with engineers and executives alike
  • Sees AI as a force multiplier for the team, not a threat. Actively experiments with and advocates for AI tooling
  • Measures success by team outcomes, not personal output. Takes pride in what the team ships, not what they build alone
  • Creates feedback loops obsessively: between product and engineering, between seniors and juniors, between metrics and decisions
  • Has strong opinions, loosely held; brings conviction to discussions but updates on evidence
  • Invests in engineering excellence as seriously as delivery velocity; knows that quality and speed are not opposites


WHY THIS ROLE STANDS APART :


AI Transformation at Scale :

  • Lead one of the most significant AI adoption programmes in India's digital media sector.
  • Your decisions will shape how hundreds of engineers work in 2026 and beyond.


Hands-On & Strategic Balance :

  • A rare EM role that actively encourages technical depth.
  • Stay close to the code while owning the people agenda - the best of both worlds.


Established Platform, Real Scale :

  • 500-1,000 engineers, proven product-market fit, and the org maturity to execute.
  • This is not a greenfield startup gamble; it is a serious company with serious ambition.


Clear Leadership Growth Path :

  • A visible, direct path toward Director / VP of Engineering.
  • Senior leadership is invested in growing its next generation of technology executives.


Read more
Moolya Software Testing Private Limited
Mumbai
2 - 3 yrs
₹7L - ₹7.5L / yr
GUI
Software Testing (QA)
User Interface (UI) Design
SQL
Databases
  • Exploratory tester with 2–3 years of experience in software testing.
  • The candidate should be an expert in GUI and functional testing of web applications.
  • Good communication is a must; the candidate should be capable of collaborating with cross-functional teams.
  • Should be self-driven and capable of handling responsibilities independently.
  • Should have good knowledge of SQL and Jira.
  • Strong proficiency in Microsoft Excel is required for test analysis and reporting.
  • Should be able to understand application architecture to effectively design and execute test scenarios.
  • Experience with Playwright automation is an added advantage but not mandatory.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Pune, Mumbai, Bengaluru (Bangalore)
3 - 12 yrs
Best in industry
skill iconPython
pandas
Object Oriented Programming (OOPs)
SQL

JOB DESCRIPTION:


Location: Pune, Mumbai, Bangalore

Mode of Work : 3 days from Office


* Python: Strong expertise in data workflows and automation

* Pandas: For detailed data analysis and validation

* SQL: Querying and performing operations on Delta tables

* AWS Cloud: Compute and storage services

* OOPs concepts

Read more
Foyforyou
Mumbai
1 - 3 yrs
₹2L - ₹15L / yr
SQL
MS-Excel
Microsoft Excel
Operations management
skill iconData Analytics

Product Manager (Data & Operations)

Experience: 2+ years

Must-Have: Candidate must have prior experience in a product-based company

Role Summary

We are looking for a highly analytical Product Manager to drive business growth through data analysis, operational efficiency, and structured experimentation.

This role will focus on identifying growth opportunities, reducing operational inefficiencies, improving unit economics, and building strong reporting systems across the ecommerce and AI-led product ecosystem.

You will work closely with Engineering, Marketing, Catalog, Operations, Finance, and Data teams to ensure decisions are backed by data and execution is operationally strong.

Key Responsibilities

Data Analysis & Business Insights

  • Own end-to-end funnel analysis (Impressions → CTR → ATC → Checkout → Purchase → Repeat)
  • Identify drop-offs, leakages, and revenue gaps using SQL, GA, Clevertap
  • Perform cohort analysis (new vs repeat, prepaid vs COD, personalized vs non-personalized users)
  • Track and improve core metrics:
  • Conversion Rate
  • GMV & Revenue
  • AOV
  • Repeat Rate
  • Cancellation & RTO %
  • Margin contribution
  • Build and maintain weekly/monthly dashboards for leadership visibility
  • Translate raw data into clear, actionable insights
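The funnel analysis described above reduces to step-to-step conversion rates over event counts. A sketch with invented counts; in practice these come from SQL, GA, or CleverTap.

```python
# Funnel conversion sketch: percent surviving each step vs the previous one.
FUNNEL = [
    ("Impressions", 100_000),
    ("Clicks",       12_000),
    ("ATC",           3_000),
    ("Checkout",      1_200),
    ("Purchase",        900),
]

def step_conversions(funnel):
    """(step label, conversion % from the previous step) for each transition."""
    out = []
    for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
        out.append((f"{prev_name} -> {name}", round(100 * n / prev_n, 1)))
    return out

for step, pct in step_conversions(FUNNEL):
    print(f"{step}: {pct}%")
# Impressions -> Clicks: 12.0%, ..., Checkout -> Purchase: 75.0%
```

The biggest step-to-step drop is where experimentation effort should go first.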

Operational Excellence

  • Identify operational bottlenecks impacting conversion and fulfillment
  • Analyze cancellation drivers & reduce COD RTO risk
  • Improve payment success rates and checkout efficiency
  • Work with logistics teams to optimize delivery timelines
  • Collaborate with catalog & brand teams to improve SKU performance
  • Monitor inventory health, sell-through rate, and stock rotation
  • Drive pricing and margin optimization initiatives

Experimentation & Performance Improvement

  • Run structured A/B tests to improve funnel performance
  • Define clear hypotheses, success metrics, and impact measurement
  • Analyze experiment results and recommend rollouts
  • Build scalable processes for experimentation cadence

Cross-Functional Execution

  • Convert insights into PRDs and operational roadmaps
  • Partner with engineering for sprint-based delivery
  • Align marketing, catalog, and operations on metric ownership
  • Ensure every feature launch has measurable business KPIs

Must-Have Skills

  • Strong analytical mindset and comfort with large datasets
  • Advanced Excel / Google Sheets
  • Strong SQL proficiency (mandatory)
  • Experience with GA, Clevertap, Mixpanel or similar tools
  • Experience working on ecommerce funnels
  • Understanding of unit economics (GMV, margins, CAC, LTV)
  • Strong problem-solving and structured thinking
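The unit-economics item above is simple arithmetic once definitions are fixed. A rough sketch using a contribution-margin LTV; formula variants differ by business, and all numbers are invented.

```python
# Unit-economics sketch: contribution-margin LTV and the LTV/CAC ratio.
def ltv(aov: float, margin: float, orders_per_year: float, years: float) -> float:
    """Lifetime value as contribution margin accumulated over the lifetime."""
    return aov * margin * orders_per_year * years

def ltv_cac_ratio(ltv_value: float, cac: float) -> float:
    return ltv_value / cac

v = ltv(aov=800, margin=0.25, orders_per_year=4, years=2)
print(v, ltv_cac_ratio(v, cac=400))   # 1600.0 4.0
```

A ratio comfortably above 1 is the usual sanity check that acquisition spend pays back.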

Bonus Skills

  • Experience in ecommerce marketplace or D2C
  • Experience working with logistics, payments, or inventory systems
  • Exposure to AI-led recommendation systems
  • Experience building business dashboards

What Success Looks Like (First 6 Months)

  • Clear dashboard visibility across all core business metrics
  • 10–15% improvement in funnel conversion
  • Reduction in cancellation & RTO rates
  • Improved operational turnaround time
  • Data-backed roadmap prioritization


Read more
AI Industry

Agency job
via Peak Hire Solutions by Dharati Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio
Data engineering
Business Intelligence (BI)
Tableau
PowerBI

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Mumbai, Pune
3 - 6 yrs
Best in industry
Python
PySpark
pandas
SQL
ADF

* Python (3 to 6 years): Strong expertise in data workflows and automation

* Spark (PySpark): Hands-on experience with large-scale data processing

* Pandas: For detailed data analysis and validation

* Delta Lake: Managing structured and semi-structured datasets at scale

* SQL: Querying and performing operations on Delta tables

* Azure Cloud: Compute and storage services

* Orchestrator: Good experience with either ADF or Airflow
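As an illustration of the validation work pandas is listed for above, here is a minimal plain-Python sketch of two common data-quality checks (missing required fields and duplicate keys); the field names and rows are invented.

```python
def validate_rows(rows, key, required):
    """Return indices of rows with data-quality issues:
    missing required fields, and duplicate primary keys."""
    issues = {"missing": [], "duplicates": []}
    seen = set()
    for i, row in enumerate(rows):
        if any(row.get(f) in (None, "") for f in required):
            issues["missing"].append(i)
        k = row.get(key)
        if k in seen:
            issues["duplicates"].append(i)
        seen.add(k)
    return issues

# Hypothetical records: one null price, one duplicate id
rows = [
    {"id": 1, "price": 10.0},
    {"id": 2, "price": None},
    {"id": 1, "price": 12.5},
]
print(validate_rows(rows, key="id", required=["price"]))
```

At Spark/Delta scale the same checks would be expressed as DataFrame operations, but the logic is identical.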

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Mumbai
2 - 6 yrs
₹2L - ₹8L / yr
Linux/Unix
Linux administration
Apache
Apache Tomcat
JBoss

Job Title : System Support Engineer – L1

Experience : 2.5+ Years

Location : Mumbai (Powai)

Shift : Rotational


Role Summary :

Provide first-level technical and functional support for enterprise applications and infrastructure. Handle user issues, troubleshoot systems, and ensure timely resolution while following support processes.


Key Responsibilities :

  • Provide phone/email support and own user issues end-to-end.
  • Log, track, and update tickets in Jira/Freshdesk.
  • Troubleshoot Linux/UNIX systems, web servers, and databases.
  • Escalate unresolved issues and communicate during downtimes.
  • Create knowledge base articles and support documentation.


Mandatory Skills :

Linux/UNIX administration, Apache/Tomcat/JBoss, basic SQL databases (MySQL/SQL Server/Oracle), scripting knowledge, and ticketing tools experience.


Preferred :

  • Banking/Financial Services domain exposure and client-site support experience.
  • Strong communication skills, customer-focused mindset, and willingness to work in rotational shifts are essential.
Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹12L / yr
Data Analytics
SQL

Must have strong SQL skills (queries, optimization, stored procedures, triggers), with hands-on experience automating workflows through SQL.

Looking for candidates with 2+ years of experience working on large datasets (1 crore records or more).

Comfortable handling data challenges and breaking down complex problems.

Must have Advanced Excel skills

Should have 3+ years of relevant experience

Should have Reporting + dashboard creation experience

Should have Database development & maintenance experience

Must have Strong communication for client interactions

Should have Ability to work independently

Willingness to work from client locations.

Read more
Matchmaking platform

Agency job
via Peak Hire Solutions by Dharati Thakkar
Mumbai
2 - 5 yrs
₹21L - ₹28L / yr
Data Science
Python
Natural Language Processing (NLP)
MySQL
Machine Learning (ML)

Review Criteria

  • Strong Data Scientist / Machine Learning / AI Engineer profile
  • 2+ years of hands-on experience as a Data Scientist or Machine Learning Engineer building ML models
  • Strong expertise in Python with the ability to implement classical ML algorithms including linear regression, logistic regression, decision trees, gradient boosting, etc.
  • Hands-on experience in a minimum of 2+ use cases out of recommendation systems, image data, fraud/risk detection, price modelling, and propensity models
  • Strong exposure to NLP, including text generation and text classification, embeddings, similarity models, user profiling, and feature extraction from unstructured text
  • Experience productionizing ML models through APIs/CI/CD/Docker and working on AWS or GCP environments
  • Preferred (Company) – Must be from product companies

 

Job Specific Criteria

  • CV Attachment is mandatory
  • What's your current company?
  • Which use cases do you have hands-on experience with?
  • Are you ok for Mumbai location (if candidate is from outside Mumbai)?
  • Reason for change (if candidate has been in current company for less than 1 year)?
  • Reason for hike (if greater than 25%)?

 

Role & Responsibilities

  • Partner with Product to spot high-leverage ML opportunities tied to business metrics.
  • Wrangle large structured and unstructured datasets; build reliable features and data contracts.
  • Build and ship models to:
  • Enhance customer experiences and personalization
  • Boost revenue via pricing/discount optimization
  • Power user-to-user discovery and ranking (matchmaking at scale)
  • Detect and block fraud/risk in real time
  • Score conversion/churn/acceptance propensity for targeted actions
  • Collaborate with Engineering to productionize via APIs/CI/CD/Docker on AWS.
  • Design and run A/B tests with guardrails.
  • Build monitoring for model/data drift and business KPIs


Ideal Candidate

  • 2–5 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
  • Proven, hands-on success in at least two (preferably 3–4) of the following:
  • Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
  • Fraud/risk detection (severe class imbalance, PR-AUC)
  • Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
  • Propensity models (payment/churn)
  • Programming: strong Python and SQL; solid git, Docker, CI/CD.
  • Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
  • ML breadth: recommender systems, NLP or user profiling, anomaly detection.
  • Communication: clear storytelling with data; can align stakeholders and drive decisions.
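As a toy illustration of the classical propensity modelling named above, here is logistic regression trained by plain-Python gradient descent on invented churn data. Production work would use scikit-learn or similar; this only shows the mechanics.

```python
from math import exp

def sigmoid(z):
    # numerically safe logistic function
    if z >= 0:
        return 1.0 / (1.0 + exp(-z))
    ez = exp(z)
    return ez / (1.0 + ez)

def train_logreg(X, y, lr=0.1, epochs=300):
    """Logistic regression via stochastic gradient descent, from scratch."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                       # gradient of log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Invented churn-propensity data: [days_since_last_visit, orders]
X = [[1, 5], [2, 4], [8, 0], [9, 1]]
y = [0, 0, 1, 1]  # 1 = churned
w, b = train_logreg(X, y)
preds = [int(sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) > 0.5) for xi in X]
print(preds)
```

The resulting scores would feed the targeted actions mentioned above (e.g. retention nudges for high churn propensity).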


Read more
Wama Technology

at Wama Technology

2 candid answers
HR Wama
Posted by HR Wama
Mumbai
5 - 8 yrs
₹9L - ₹12L / yr
Python
Django
Flask
MySQL
PostgreSQL

Job Title: Python Developer (5–8+ Years Experience)

Location: Mumbai (Onsite)

Experience: 5–8+ Years

Salary: ₹9,00,000 – ₹12,00,000 per Annum (depending on experience & skill set)

Employment Type: Full-time


Job Description

We are looking for an experienced Python Developer to join our growing team in Mumbai. The ideal candidate will have strong hands-on experience in Python development, building scalable backend systems, and working with databases and APIs.


Key Responsibilities

  • Design, develop, test, and maintain Python-based applications
  • Build and integrate RESTful APIs
  • Work with frameworks such as Django / Flask / FastAPI
  • Write clean, reusable, and efficient code
  • Collaborate with frontend developers, QA, and project managers
  • Optimize application performance and scalability
  • Debug, troubleshoot, and resolve technical issues
  • Participate in code reviews and follow best coding practices
  • Work with databases and ensure data security and integrity
  • Deploy and maintain applications in staging/production environments
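To illustrate the REST API work described above without assuming a framework, here is a minimal stdlib-only sketch of a JSON `GET /items/<id>` endpoint; the resource name and data are invented, and Django/Flask/FastAPI would express the same route far more tersely.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In-memory store; resource name and schema are illustrative only.
ITEMS = {1: {"id": 1, "name": "widget"}}

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /items/<id> -> 200 with JSON body, 404 if unknown
        try:
            item_id = int(self.path.rsplit("/", 1)[-1])
            body = json.dumps(ITEMS[item_id]).encode()
            self.send_response(200)
        except (ValueError, KeyError):
            body = b'{"error": "not found"}'
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def serve(port=0):
    """Bind to 127.0.0.1 (port 0 = ephemeral) and return the server object."""
    return HTTPServer(("127.0.0.1", port), ItemHandler)
```

`serve(8080).serve_forever()` would run it; frameworks add routing, validation, and serialization on top of exactly this request/response cycle.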


Required Skills & Qualifications

  • 5–8+ years of hands-on experience in Python development
  • Strong experience with Django / Flask / FastAPI
  • Good understanding of REST APIs
  • Experience with MySQL / PostgreSQL / MongoDB
  • Familiarity with Git and version control workflows
  • Knowledge of OOP concepts and design principles
  • Experience with Linux-based environments
  • Understanding of basic security and performance optimization
  • AI tool integration: GitHub Copilot, Windsurf, Cursor, AIDE, etc.
  • Ability to work independently as well as in a team


Good to Have (Preferred Skills)

  • Experience with AWS / cloud services
  • Knowledge of Docker / CI-CD pipelines
  • Good understanding of prompt engineering
  • Exposure to Microservices Architecture
  • Basic frontend knowledge (HTML, CSS, JavaScript)
  • Experience working in an Agile/Scrum environment
  • Experience working with AI APIs such as the OpenAI (ChatGPT), Gemini, or Claude APIs
  • Integrating AI APIs into web applications
  • Experience using AI for automation, content generation, data processing, or workflow optimization


Experience:

  • Total: 5+ years (Required)
  • Python: 5 years (Required)
Read more
Ganit Business Solutions

at Ganit Business Solutions

3 recruiters
Agency job
via hirezyai by HR Hirezyai
Bengaluru (Bangalore), Chennai, Mumbai
5.5 - 12 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
PySpark
SQL

Roles & Responsibilities

  • Data Engineering Excellence: Design and implement data pipelines using formats like JSON, Parquet, CSV, and ORC, utilizing batch and streaming ingestion.
  • Cloud Data Migration Leadership: Lead cloud migration projects, developing scalable Spark pipelines.
  • Medallion Architecture: Implement Bronze, Silver, and Gold tables for scalable data systems.
  • Spark Code Optimization: Optimize Spark code to ensure efficient cloud migration.
  • Data Modeling: Develop and maintain data models with strong governance practices.
  • Data Cataloging & Quality: Implement cataloging strategies with Unity Catalog to maintain high-quality data.
  • Delta Live Table Leadership: Lead the design and implementation of Delta Live Tables (DLT) pipelines for secure, tamper-resistant data management.
  • Customer Collaboration: Collaborate with clients to optimize cloud migrations and ensure best practices in design and governance.
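The Bronze/Silver/Gold flow above can be sketched at toy scale: raw rows land in Bronze, Silver deduplicates and types them, Gold aggregates for the business. SQLite stands in here for Spark/Databricks purely for illustration, and the schema is invented.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- Bronze: raw ingested records, duplicates and bad values included
CREATE TABLE bronze_orders (order_id INTEGER, amount TEXT, status TEXT);
INSERT INTO bronze_orders VALUES
  (1, '100.0', 'PAID'),
  (1, '100.0', 'PAID'),        -- duplicate raw record
  (2, 'oops',  'PAID'),        -- malformed amount, dropped at Silver
  (3, '250.5', 'CANCELLED');

-- Silver: deduplicated, typed, bad rows filtered out
CREATE TABLE silver_orders AS
SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount, status
FROM bronze_orders
WHERE amount GLOB '[0-9]*';

-- Gold: business-level aggregate
CREATE TABLE gold_revenue AS
SELECT status, SUM(amount) AS revenue, COUNT(*) AS orders
FROM silver_orders GROUP BY status;
""")
gold = db.execute("SELECT * FROM gold_revenue ORDER BY status").fetchall()
print(gold)
```

In Databricks the same layers would be Delta tables, with Delta Live Tables expressing the Bronze→Silver→Gold dependencies declaratively.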

Educational Qualifications

  • Experience: Minimum 5 years of hands-on experience in data engineering, with a proven track record in complex pipeline development and cloud-based data migration projects.
  • Education: Bachelor’s or higher degree in Computer Science, Data Engineering, or a related field.
  • Skills
  • Must-have: Proficiency in Spark, SQL, Python, and other relevant data processing technologies. Strong knowledge of Databricks and its components, including Delta Live Table (DLT) pipeline implementations. Expertise in on-premises to cloud Spark code optimization and Medallion Architecture.

Good to Have

  • Familiarity with AWS services (experience with additional cloud platforms like GCP or Azure is a plus).

Soft Skills

  • Excellent communication and collaboration skills, with the ability to work effectively with clients and internal teams.
  • Certifications
  • AWS/GCP/Azure Data Engineer Certification.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
pandas
NumPy
SQL
HTML/CSS

Specific Knowledge/Skills


  1. 4-6 years of experience
  2. Proficiency in Python programming.
  3. Basic knowledge of front-end development.
  4. Basic knowledge of data manipulation and analysis libraries.
  5. Code versioning and collaboration. (Git)
  6. Knowledge of libraries for extracting data from websites (web scraping).
  7. Knowledge of SQL and NoSQL databases
  8. Familiarity with RESTful APIs
  9. Familiarity with Cloud (Azure /AWS) technologies
Read more
Foyforyou
Hardika Bhansali
Posted by Hardika Bhansali
Mumbai
2 - 6 yrs
₹2L - ₹15L / yr
SQL
MS-Excel
Business-to-consumer marketing
Electronic commerce

Experience: 2+ years

Must-Have: Candidate must have prior experience in a product-based company

Role Summary: We are looking for a passionate Product Manager / APM to own and enhance the end-to-end product experience for both FOY Store (India & Global) and Personalise Me (Skin AI & Makeup Try-On). You will drive conversion, revenue, personalization, customer experience, and operational efficiency while collaborating closely with cross-functional teams including engineering, marketing, catalog, operations, and data/ML teams.

Key Responsibilities:

FOY Store:

  • Own the full customer journey: CTR → ATC → Checkout → Purchase → Repeat
  • Define assortment strategy, navigation, product discovery, search, filters, PLPs, PDPs
  • Collaborate with brand, catalog, marketing, and operations for pricing, availability, and content accuracy
  • Run rapid A/B experiments to optimize funnel and conversion
  • Build scalable product integrations with payments, logistics, loyalty, and subscriptions
  • Define product roadmap and write PRDs / user stories for engineering
  • Track and improve store GMV, margins, retention, cancellations, COD risk

Personalise Me (Skin AI + Makeup Try-On):

  • Own the hyper-personalized beauty experience: Skin AI test, Virtual Try-On, BeautyGPT
  • Collaborate with data/ML teams to improve recommendation accuracy
  • Understand beauty user profiles, concerns, and preferences deeply
  • Integrate personalized recommendations into the shopping journey to boost conversion
  • Drive metrics: activation → profile completion → recommendation clicks → purchase
  • Work with brand and catalog teams to tag inventory for personalization

Must-Have Skills:

  • Strong analytical mindset + customer psychology understanding
  • UI/UX intuition for ecommerce and personalization best practices
  • Strong Google Sheets & Excel skills
  • SQL proficiency
  • Experience with funnels, CleverTap/GA, A/B testing tools
  • Customer empathy, problem-solving, and curiosity for beauty tech and AI

Bonus Skills:

  • Experience with ecommerce marketplaces, D2C, or AI-driven recommendation systems
  • Experience with personalization, gamification, or form-based flows
  • Knowledge of AI tools and product integrations

Why Join Us: Be part of a dynamic team shaping the future of beauty commerce, blending cutting-edge AI with customer-first product experiences.

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Indore, Pune, Bhopal, Mumbai, Nagpur, Kolkata, Bengaluru (Bangalore), Chennai
4 - 6 yrs
₹4.5L - ₹18L / yr
Java
Spring Boot
Microservices
SQL

🚀 Hiring: Java Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Requirements

✅ Strong proficiency in Java (Java 8/11/17)

✅ Experience with Spring / Spring Boot

✅ Knowledge of REST APIs, Microservices architecture

✅ Familiarity with SQL/NoSQL databases

✅ Understanding of Git, CI/CD pipelines

✅ Problem-solving skills and attention to detail


Read more
AI-First Company

Agency job
via Peak Hire Solutions by Dharati Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data engineering
Data architecture
SQL
Data modeling
GCS

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Read more
Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
PHP
Python
Data Visualization
Data Structures

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
.NET
C#
SQL

Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.

Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Read more
Matchmaking platform

Agency job
via Peak Hire Solutions by Dharati Thakkar
Mumbai
2 - 5 yrs
₹15L - ₹28L / yr
Data Science
Python
Natural Language Processing (NLP)
MySQL
Machine Learning (ML)

Review Criteria

  • Strong Data Scientist / Machine Learning / AI Engineer profile
  • 2+ years of hands-on experience as a Data Scientist or Machine Learning Engineer building ML models
  • Strong expertise in Python with the ability to implement classical ML algorithms including linear regression, logistic regression, decision trees, gradient boosting, etc.
  • Hands-on experience in a minimum of 2+ use cases out of recommendation systems, image data, fraud/risk detection, price modelling, and propensity models
  • Strong exposure to NLP, including text generation and text classification, embeddings, similarity models, user profiling, and feature extraction from unstructured text
  • Experience productionizing ML models through APIs/CI/CD/Docker and working on AWS or GCP environments
  • Preferred (Company) – Must be from product companies

 

Job Specific Criteria

  • CV Attachment is mandatory
  • What's your current company?
  • Which use cases do you have hands-on experience with?
  • Are you ok for Mumbai location (if candidate is from outside Mumbai)?
  • Reason for change (if candidate has been in current company for less than 1 year)?
  • Reason for hike (if greater than 25%)?

 

Role & Responsibilities

  • Partner with Product to spot high-leverage ML opportunities tied to business metrics.
  • Wrangle large structured and unstructured datasets; build reliable features and data contracts.
  • Build and ship models to:
  • Enhance customer experiences and personalization
  • Boost revenue via pricing/discount optimization
  • Power user-to-user discovery and ranking (matchmaking at scale)
  • Detect and block fraud/risk in real time
  • Score conversion/churn/acceptance propensity for targeted actions
  • Collaborate with Engineering to productionize via APIs/CI/CD/Docker on AWS.
  • Design and run A/B tests with guardrails.
  • Build monitoring for model/data drift and business KPIs


Ideal Candidate

  • 2–5 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
  • Proven, hands-on success in at least two (preferably 3–4) of the following:
  • Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
  • Fraud/risk detection (severe class imbalance, PR-AUC)
  • Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
  • Propensity models (payment/churn)
  • Programming: strong Python and SQL; solid git, Docker, CI/CD.
  • Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
  • ML breadth: recommender systems, NLP or user profiling, anomaly detection.
  • Communication: clear storytelling with data; can align stakeholders and drive decisions.




Read more
Navi Mumbai
4 - 8 yrs
₹8L - ₹10L / yr
Oracle SQL Developer
MySQL
ETL
Database Design
SQL

Company Name : Enlink Managed Services

Company Website : https://enlinkit.com/

Location : Turbhe, Navi Mumbai

Shift Time : 12 pm to 9:30 pm

Working Days : 5 Days Working (Sat-Sun Fixed Off)

SQL Developer 

Roles & Responsibilities :

Designing Database, writing stored procedures, complex and dynamic queries in SQL

Creating Indexes, Views, complex Triggers, effective Functions, and appropriate stored procedures to facilitate efficient data manipulation and data consistency

Implementing database architecture, ETL and development activities

Troubleshooting data load, ETL and application support related issues

Demonstrates ability to communicate effectively in both technical and business environments

Troubleshooting failed batch jobs, correcting outstanding issues and resubmitting scheduled jobs to ensure completion

Troubleshoot, optimize, and tune SQL processes and complex SQL queries
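A minimal sketch of the trigger-and-index work described above, using SQLite as a stand-in (SQLite has no stored procedures, so only a trigger and an index are shown; the schema is invented): an AFTER UPDATE trigger maintains an audit trail, and an index supports history lookups.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- Trigger keeps an audit trail on every balance change
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
  INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;

-- Index to speed up lookups of an account's audit history
CREATE INDEX idx_audit_account ON audit_log(account_id);

INSERT INTO accounts VALUES (1, 500.0);
UPDATE accounts SET balance = 350.0 WHERE id = 1;
""")
log = db.execute("SELECT * FROM audit_log").fetchall()
print(log)
```

In MySQL the trigger body would use `DELIMITER`/`BEGIN ... END` syntax, and procedures would wrap the update logic, but the pattern is the same.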

Required Qualifications/Experience

4+ years of experience in the design and optimization of MySQL databases

General database development using MySQL

Advanced level of writing stored procedures, reading query plans, tuning indexes and troubleshooting performance bottlenecks

Troubleshoot, optimize, and tune SQL processes and complex SQL queries

Experienced in creating sophisticated MySQL databases that can quickly handle complex queries

Problem-solving, analytical skills, and fluent communication

Read more
Banking Industry

Agency job
via Peak Hire Solutions by Dharati Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
Data Analytics
SQL
Relational Database (RDBMS)
Java
Python

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have Strong SQL skills (queries, optimization, procedures, triggers)

· Must have Advanced Excel skills

· Should have 3+ years of relevant experience

· Should have Reporting + dashboard creation experience

· Should have Database development & maintenance experience

· Must have Strong communication for client interactions

· Should have Ability to work independently

· Willingness to work from client locations.

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
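As a small example of the window functions called out above: a per-region running total via `SUM(...) OVER (PARTITION BY ... ORDER BY ...)`, shown through SQLite (requires SQLite ≥ 3.25; the table and data are invented).

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('West','2024-01',100),('West','2024-02',150),
  ('East','2024-01',200),('East','2024-02',120);
""")
# Running total per region, month over month
rows = db.execute("""
SELECT region, month, amount,
       SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM sales ORDER BY region, month
""").fetchall()
for r in rows:
    print(r)
```

The same query runs unchanged on most engines (Oracle, SQL Server, PostgreSQL), which is what makes window functions a portable reporting tool.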

 

Read more
Banking Industry

Agency job
via Peak Hire Solutions by Dharati Thakkar
Kochi (Cochin), Mumbai, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹17L / yr
Project Management
Data Analytics
Program Management
SQL
Client Management

Required Skills: Project Management, Data Analysis, SQL queries, Client Engagement

 

Criteria:

  • Must have 3+ years of project/program management experience in Financial Services/Banking/NBFC/Fintech companies only.
  • Hands-on proficiency in data analysis and SQL querying, with the ability to work on large datasets.
  • Ability to lead end-to-end implementation projects and manage cross-functional teams effectively.
  • Experience in process analysis, optimization, and mapping for operational efficiency.
  • Strong client-facing communication and stakeholder management capabilities.
  • Good expertise in financial operations processes and workflows with proven implementation experience.

 

Description

Position Overview:

We are seeking a dynamic and experienced Technical Program Manager to join our team. The successful candidate will be responsible for managing the implementation of the company’s solutions at existing and new clients. This role requires a deep understanding of financial operation processes, exceptional problem-solving skills, and the ability to analyze large volumes of data. The Technical Program Manager will drive process excellence and ensure outstanding customer satisfaction throughout the implementation lifecycle and beyond.

 

Key Responsibilities:

● Client Engagement: Serve as the primary point of contact for assigned clients, understanding their unique operation processes and requirements. Build and maintain strong relationships to facilitate successful implementations.

● Project Management: Lead the end-to-end implementation of company’s solutions, ensuring projects are delivered on time, within scope, and within budget. Coordinate with cross-functional teams to align resources and objectives.

● Process Analysis and Improvement: Evaluate clients' existing operation workflows, identify inefficiencies, and recommend optimized processes leveraging company’s platform. Utilize process mapping and data analysis to drive continuous improvement.

● Data Analysis: Analyze substantial datasets to ensure accurate configuration and integration of company’s solutions. Employ statistical tools and SQL-based queries to interpret data and provide actionable insights.

● Problem Solving: Break down complex problems into manageable components, developing effective solutions in collaboration with clients and internal teams.

● Process Excellence: Advocate for and implement best practices in process management, utilizing methodologies such as Lean Six Sigma to enhance operational efficiency.

● Customer Excellence: Ensure a superior customer experience by proactively addressing client needs, providing training and support, and promptly resolving any issues that arise.

 

Qualifications:

● Minimum of 3 years of experience in project management, preferably in financial services, software implementation, consulting or analytics.

● Strong analytical skills with experience in data analysis, SQL querying, and handling large datasets.

● Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.

● Demonstrated ability to lead cross-functional teams and manage multiple projects concurrently.

● Proven expertise in financial operation processes and related software solutions is a plus.

● Proficiency in developing business intelligence solutions or with low-code tools is a plus.

 

Why join the company?

● Opportunity to work with a cutting-edge financial technology company.

● Collaborative and innovative work environment.

● Competitive compensation and benefits package.

● Professional development and growth opportunities.

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
Java
Spring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience: 4+ Years

Required Skills -

  1. 3+ years of Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dharati Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS
+21 more

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
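
The governance and access-control point above can be sketched in miniature. The roles, datasets, and privileges below are invented for illustration; a real deployment would delegate this to the platform's own privilege model (e.g. Dremio's grants on spaces and datasets) rather than application code:

```python
# Minimal sketch of RBAC-style access control over curated datasets.
ROLE_GRANTS = {
    "analyst":  {"sales_curated": {"SELECT"}},
    "engineer": {"sales_raw": {"SELECT"}, "sales_curated": {"SELECT", "ALTER"}},
}

def can(role: str, privilege: str, dataset: str) -> bool:
    """Return True if `role` holds `privilege` on `dataset`."""
    return privilege in ROLE_GRANTS.get(role, {}).get(dataset, set())

# Analysts see only the curated layer; engineers can also alter it.
print(can("analyst", "SELECT", "sales_curated"))  # True
print(can("analyst", "SELECT", "sales_raw"))      # False
```

The same grant-lookup shape generalizes to row- and column-level policies once a governance catalog supplies the grants.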


Ideal Candidate

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
Read more
Banking Industry

Banking Industry

Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
Data Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?


As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency
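
The query-tuning and indexing responsibility above can be illustrated with a before/after comparison. The sketch below uses Python's built-in sqlite3 module and an invented table; the exact plan text varies by SQLite version, but the shift from a full scan to an index search is the point:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, total REAL)")
conn.executemany("INSERT INTO orders (status, total) VALUES (?, ?)",
                 [("OPEN" if i % 10 else "CLOSED", float(i)) for i in range(1000)])

query = "SELECT COUNT(*) FROM orders WHERE status = 'CLOSED'"

# Without an index, SQLite scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_orders_status ON orders(status)")

# With the index, the planner switches to an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[-1][-1])  # e.g. 'SCAN orders'
print(plan_after[-1][-1])   # e.g. 'SEARCH orders USING COVERING INDEX ...'
```

On production engines (Oracle, SQL Server), the analogous workflow is reading the execution plan before and after adding or adjusting indexes.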


What do we expect from you?


For the SQL/Oracle Developer role, we are seeking candidates with the following skills and expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations


Read more
shaadi.com

at shaadi.com

3 recruiters
Agency job
via hirezyai by Aardra Suresh
Mumbai
2 - 8 yrs
₹24L - ₹30L / yr
Machine Learning (ML)
Python
SQL
Neural networks

What We’re Looking For

  • 3-5 years of Data Science & ML experience in consumer internet / B2C products.
  • Degree in Statistics, Computer Science, or Engineering (or certification in Data Science).
  • Machine Learning wizardry: recommender systems, NLP, user profiling, image processing, anomaly detection.
  • Statistical chops: finding meaningful insights in large data sets.
  • Programming ninja: R, Python, SQL + hands-on with Numpy, Pandas, scikit-learn, Keras, TensorFlow (or similar).
  • Visualization skills: Redshift, Tableau, Looker, or similar.
  • A strong problem-solver with curiosity hardwired into your DNA.
  • Brownie Points
  • Experience with big data platforms: Hadoop, Spark, Hive, Pig.
  • Extra love if you’ve played with BI tools like Tableau or Looker.
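
As a toy illustration of the recommender-system skill listed above (users, items, and ratings are all invented), here is a minimal user-based scoring sketch in pure Python; real systems would use scikit-learn or dedicated libraries at scale:

```python
from math import sqrt

# Toy user-item ratings matrix, stored sparsely as nested dicts.
ratings = {
    "alice": {"i1": 5.0, "i2": 3.0},
    "bob":   {"i1": 4.0, "i2": 2.0, "i3": 5.0},
    "carol": {"i2": 4.0, "i3": 4.0},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two users' rating vectors."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

def recommend(user: str) -> list:
    """Rank items the user has not rated by similarity-weighted scores."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, r in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # ['i3']
```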


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
.NET
C#
ASP.NET
SQL
Amazon Web Services (AWS)

Company Name – Wissen Technology

Location: Pune / Bangalore / Mumbai (based on candidate preference)

Work mode: Hybrid 

Experience: 5+ years


Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.
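
The daily summarization routine mentioned above might look like the following minimal sketch, shown in Python for brevity even though the role itself targets C#/.NET; the symbols and ticks are invented:

```python
from collections import defaultdict

# Hypothetical intraday ticks: (symbol, timestamp, price, size).
ticks = [
    ("ABC", "09:30:00", 100.0, 10),
    ("ABC", "11:00:00", 102.5, 5),
    ("ABC", "15:59:59",  99.5, 20),
    ("XYZ", "10:15:00",  50.0, 8),
]

def summarize(ticks):
    """Collapse a day's ticks into per-symbol open/high/low/close/volume."""
    by_symbol = defaultdict(list)
    for sym, ts, price, size in ticks:
        by_symbol[sym].append((ts, price, size))
    summary = {}
    for sym, rows in by_symbol.items():
        rows.sort()  # order by timestamp so open/close are well-defined
        prices = [p for _, p, _ in rows]
        summary[sym] = {
            "open": prices[0],
            "high": max(prices),
            "low": min(prices),
            "close": prices[-1],
            "volume": sum(s for _, _, s in rows),
        }
    return summary

print(summarize(ticks)["ABC"])
```

In the actual pipeline the input would arrive as messages or S3 objects rather than an in-memory list, but the normalize-then-aggregate shape is the same.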


Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Education

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.


Read more
Navi Mumbai, Pune, Bengaluru (Bangalore), Hyderabad, Mohali, Panchkula, Dehradun, Gurugram, Chennai
5 - 9 yrs
₹8L - ₹14L / yr
.NET
MongoDB
Entity Framework
C#
SQL
+4 more

Job Title: Mid-Level .NET Developer (Agile/SCRUM)


Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun


Night Shift from 6:30 pm to 3:30 am IST


Experience: 5+ Years


Job Summary:

We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.


Key Responsibilities:

  • Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
  • Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
  • Develop and integrate RESTful APIs for multi-tier, distributed systems.
  • Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
  • Write clean, efficient, and maintainable code following industry best practices.
  • Conduct code reviews to ensure high-quality and consistent deliverables.
  • Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
  • Troubleshoot, debug, and resolve application issues effectively.
  • Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
  • Support release planning and deployment activities.


Required Skills & Qualifications:

  • 4–6 years of professional experience in .NET development.
  • Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
  • Working knowledge of NoSQL databases like MongoDB.
  • Solid understanding of JavaScript/TypeScript and the Angular framework.
  • Experience in developing and integrating RESTful APIs.
  • Familiarity with Agile/SCRUM methodologies.
  • Basic knowledge of CI/CD pipelines and Git version control.
  • Hands-on experience with AWS cloud services.
  • Strong analytical, problem-solving, and debugging skills.
  • Excellent communication and collaboration skills.


Preferred / Nice-to-Have Skills:

  • Advanced experience with AWS services.
  • Knowledge of Kubernetes or other container orchestration platforms.
  • Familiarity with IIS web server configuration and management.
  • Experience in the healthcare domain.
  • Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
  • Experience with application security and code quality tools such as Snyk or SonarQube.
  • Strong understanding of SOLID principles and clean architecture patterns.


Technical Proficiencies:

  • ASP.NET Core, ASP.NET MVC
  • C#, Entity Framework, Razor Pages
  • SQL Server, MongoDB
  • REST API, jQuery, AJAX
  • HTML, CSS, JavaScript, TypeScript, Angular
  • Azure Services, Azure Functions, AWS
  • Visual Studio
  • CI/CD, Git


Read more
Mumbai
4 - 8 yrs
₹10L - ₹17L / yr
.NET
C#
RMS
Financial risk management
Risk Management
+1 more

Job Description: Senior Software Engineer – C# (RMS – Risk Management Systems)

Location: [Insert Location]

Job Type: [Full-time / Contract]

Experience: 4–8 years

Domain: Capital Markets / Risk Management / Trading Applications

Job Description:

We are looking for an experienced Senior Software Engineer with deep expertise in C# and distributed systems to design and maintain mission-critical Risk Management Systems (RMS) used in trading environments. The role requires a strong understanding of real-time order flow, risk checks, queue management, and multi-threaded processing.

Key Responsibilities:

RMS Development:

● Design, develop, and optimize real-time RMS components using C# and .NET Framework (4.0/4.7.2).

● Implement rule-based and exposure-based pre-trade and post-trade risk checks.

● Develop in-memory data structures to handle millions of order and trade records efficiently.

● Build high-throughput queues and modules to handle burst loads during market open and spikes.

● Debug multi-threaded modules and ensure accurate and timely risk validation.

● Build alerting, threshold evaluation, and notification modules for risk violations.

● Collaborate with product and trading teams to translate risk rules into executable modules.
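
As a rough illustration of an exposure-based pre-trade check like those described above, here is a sketch in Python for readability; the limits and accounts are invented, and a production RMS in C#/.NET would apply many more checks (price bands, margin, position limits) with careful concurrency control:

```python
# Per-account gross notional exposure limit (hypothetical values).
LIMITS = {"ACCT1": 100_000.0}
# Running exposure, updated as orders are accepted.
exposure = {"ACCT1": 95_000.0}

def pre_trade_check(account: str, qty: int, price: float) -> bool:
    """Accept the order only if it stays within the account's exposure limit."""
    notional = qty * price
    limit = LIMITS.get(account, 0.0)
    if exposure.get(account, 0.0) + notional > limit:
        return False  # risk violation: order rejected
    exposure[account] = exposure.get(account, 0.0) + notional
    return True

print(pre_trade_check("ACCT1", 10, 400.0))  # True: 95k + 4k <= 100k
print(pre_trade_check("ACCT1", 10, 400.0))  # False: 99k + 4k > 100k
```

The real systems run this check on every inbound order under microsecond budgets, which is why the posting stresses in-memory structures and multi-threaded queue handling.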

Tools & Technologies:

● Version control: Git or TFS.

● Database: SQL Server or in-memory cache (Redis) for real-time exposure tracking.

● Experience with messaging systems or queues (e.g., MSMQ, ZeroMQ, Kafka) preferred.

● Proficiency with AI-powered tools such as GitHub Copilot and ChatGPT.

● Prompt engineering skills to utilize AI for test case generation, debugging, and optimization.

Domain Knowledge (Must-Have):

● Strong understanding of capital markets, especially equity and derivative segments.

● Working knowledge of Order Management Systems (OMS), RMS policies, and market behavior.

● Experience with exchange protocols (e.g., FIX, TCP) and market data processing.

● Ability to handle peak load conditions and large-scale order bursts.

Preferred Qualifications:

● Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

● Prior experience working on RMS or surveillance systems in the broking or exchange domain.

● Familiarity with trading APIs and pre-trade/post-trade workflows.

Read more
Wissen Technology
Mumbai, Pune
5 - 9 yrs
₹10L - ₹20L / yr
Functional testing
Integration testing
Oracle Fusion
SQL
E2E

Key Responsibilities:

  • Perform comprehensive Functional and Integration Testing across Oracle modules and connected systems.
  • Conduct detailed End-to-End (E2E) Testing to ensure business processes function seamlessly across applications.
  • Collaborate with cross-functional teams, including Business Analysts, Developers, and Automation teams, to validate business requirements and deliver high-quality releases.
  • Identify, document, and track functional defects, ensuring timely closure and root cause analysis.
  • Execute and validate SQL queries for backend data verification and cross-system data consistency checks.
  • Participate in regression cycles and support continuous improvement initiatives through data-driven analysis.
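
The backend data-verification bullet above can be sketched as a simple reconciliation query. The example below uses Python's built-in sqlite3 with invented source/target tables; against Oracle the same row-count and LEFT JOIN pattern applies:

```python
import sqlite3

# Toy backend check: verify a target table matches its source after a
# (hypothetical) integration run. Table names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_invoices (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE tgt_invoices (id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO src_invoices VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_invoices VALUES (1, 10.0), (2, 20.0);
""")

# Row-count reconciliation plus a LEFT JOIN to list rows missing downstream.
src_count = conn.execute("SELECT COUNT(*) FROM src_invoices").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_invoices").fetchone()[0]
missing = conn.execute("""
SELECT s.id FROM src_invoices s
LEFT JOIN tgt_invoices t ON t.id = s.id
WHERE t.id IS NULL
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3,)]
```

A count mismatch plus the list of missing keys is usually enough to open a precise, reproducible defect against the integration flow.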

Required Skills & Competencies:

  • Strong knowledge of Functional Testing processes and methodologies.
  • Oracle Fusion knowledge is good to have.
  • Solid understanding of Integration Flows between Oracle and peripheral systems.
  • Proven ability in E2E Testing, including scenario design, execution, and defect management.
  • Excellent Analytical and Logical Reasoning skills with attention to detail.
  • Hands-on experience with SQL for data validation and analysis.
  • Effective communication, documentation, and coordination skills.

Preferred Qualifications:

  • Exposure to automation-assisted functional testing and cross-platform data validation.
  • Experience in identifying test optimization opportunities and improving testing efficiency.


Read more
Enpointeio
sanath shetty
Posted by sanath shetty
Mumbai
2 - 5 yrs
₹8L - ₹12L / yr
skill iconNodeJS (Node.js)
TypeScript
SQL
skill iconAmazon Web Services (AWS)

Position Overview

We're seeking a skilled Full Stack Developer to build and maintain scalable web applications using modern technologies. You'll work across the entire development stack, from database design to user interface implementation.


Key Responsibilities

  • Develop and maintain full-stack web applications using Node.js and TypeScript
  • Design and implement RESTful APIs and microservices
  • Build responsive, user-friendly front-end interfaces
  • Design and optimize SQL databases and write efficient queries
  • Collaborate with cross-functional teams on feature development
  • Participate in code reviews and maintain high code quality standards
  • Debug and troubleshoot application issues across the stack

Required Skills

  • Backend: 3+ years of experience with Node.js and TypeScript
  • Database: Proficient in SQL (PostgreSQL, MySQL, or similar)
  • Frontend: Experience with modern JavaScript frameworks (React, Vue, or Angular)
  • Version Control: Git and collaborative development workflows
  • API Development: RESTful services and API design principles

Preferred Qualifications

  • Experience with cloud platforms (AWS, Azure, or GCP)
  • Knowledge of containerization (Docker)
  • Familiarity with testing frameworks (Jest, Mocha, or similar)
  • Understanding of CI/CD pipelines

What We Offer

  • Competitive salary and benefits
  • Flexible work arrangements
  • Professional development opportunities
  • Collaborative team environment


Read more
Get to hear about interesting companies hiring right now
Follow Cutshort on LinkedIn
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs