
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

ARDEM Incorporated
Remote only
8 - 12 yrs
₹9L - ₹11L / yr
Python
.NET
JavaScript
NodeJS (Node.js)
SQL
+11 more

Senior Project Owner / Project Manager - Technology


Department - Technology / Software Development

Work Mode - Work From Home (WFH), Full Time

Experience - Minimum 10 Years (Development Background)

Location - Tier-1 Cities Only (Mumbai, Delhi, Bengaluru, Hyderabad, Chennai, Pune, Kolkata)

Time Zone - Candidate should be comfortable working in US time zone overlap and attending client calls accordingly.


ABOUT ARDEM

ARDEM Incorporated is a leading Business Process Outsourcing (BPO) and Automation company serving US-based clients across diverse industries. Our Technology Team builds and maintains in-house applications that power data processing pipelines, automation workflows, internal platforms, and domain-specific training modules, all engineered to deliver operational excellence at scale. To our clients, we provide cloud-based platforms that assist in their day-to-day business analytics. Our cloud services focus on finance, logistics, and utility management.


ROLE SUMMARY

We are looking for a seasoned Senior Project Owner / Project Manager with a strong development foundation to lead our technology initiatives. This role bridges client management and technical execution: you will own end-to-end delivery of multiple concurrent projects while supporting a high-performing remote team.


KEY RESPONSIBILITIES

Project & Delivery Management

  • Own and manage multiple concurrent technology projects from initiation to production release
  • Define project scope, timelines, milestones, and resource allocation plans
  • Distribute tasks effectively across a team of developers, QA, and support engineers
  • Track assigned work daily, follow up on progress, and proactively remove blockers
  • Ensure all projects meet deadlines and quality benchmarks without compromise
  • Participate actively in production activities and take full accountability for live deployments


US Client Management

  • Serve as the Technology single point of contact for all assigned US clients
  • Attend and lead client calls focused on ARDEM technical solutions; these may involve prospective as well as existing clients (US time zone overlap required)
  • Resolve client queries, manage escalations, and ensure high client satisfaction
  • Showcase company-developed applications and software demos confidently to clients
  • Translate complex client requirements into clear technical deliverables for the team


Team Leadership

  • Lead, mentor, and performance-manage a distributed remote team of technical members
  • Foster accountability, ownership, and a high-delivery culture within the team
  • Conduct sprint planning, stand-ups, retrospectives, and performance reviews
  • Identify skill gaps and work with HR/training teams to bridge them


Process & Operations

  • Deeply understand ARDEM's internal processes and align project execution accordingly
  • Ensure development standards and best practices are followed across all projects
  • Manage crisis situations with composure, identify root causes and drive swift resolution
  • Coordinate with cross-functional teams including HR, Operations, Training, and QA
  • Maintain project documentation, status reports, and risk registers


REQUIRED EXPERIENCE

  • 10+ years of total experience in software development and project management
  • 5–7 years of hands-on coding experience in one or more technologies listed below
  • 2–3 years in a team management or tech lead role overseeing 5+ members
  • Proven experience managing multiple simultaneous projects in a remote/WFH environment
  • Prior experience working with US-based clients, with a strong understanding of US work culture and expectations


TECHNICAL SKILLS

  • Python: scripting, automation, data processing, backend services
  • JavaScript / Node.js: server-side development, REST APIs, async workflows
  • .NET Core: enterprise application development and service integration
  • SQL Databases: query optimization, schema design, stored procedures
  • Familiarity with CI/CD pipelines, Git workflows, and deployment processes
  • Ability to review code, understand architectural decisions, and guide the team technically
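
As a rough illustration of the SQL optimization skill listed above, here is a minimal sketch using Python's built-in sqlite3 module (the `orders` table and its data are invented for the example) showing how adding an index changes the query plan from a full scan to an index search:

```python
import sqlite3

# In-memory database with a hypothetical orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on customer_id scans the whole table
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()
print(plan)  # the plan detail reports a full-table SCAN

# Adding an index lets the optimizer switch to an index search
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer_id = 42"
).fetchall()
print(plan)  # the plan detail now reports SEARCH ... USING INDEX
```

The same habit of checking the plan before and after a schema change carries over to larger engines, though the plan output format differs.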


SKILLS & COMPETENCIES

  • Exceptional verbal and written communication skills in English; client-facing confidence is a must
  • Strong crisis management and conflict resolution ability under tight deadlines
  • Highly organized with a structured approach to planning, prioritization, and execution
  • Self-driven and accountable, capable of operating independently in a remote environment
  • Strong presentation skills: able to demo software to non-technical stakeholders
  • Empathetic leadership style with the ability to motivate and align diverse team members


QUALIFICATIONS

  • Bachelor's or Master's degree in Computer Science
  • PMP Certification: Preferred (candidates without PMP must demonstrate equivalent project management rigor)
  • Agile / Scrum certifications (CSM, PMI-ACP) are an added advantage


LOCATION PREFERENCE

  • Candidates must be based in a Tier-1 city: Mumbai, Delhi NCR, Bengaluru, Hyderabad, Chennai, Pune, or Kolkata
  • This is a full-time Work From Home role: reliable internet, a dedicated workspace, and availability during US business hours are mandatory
Pune
3 - 10 yrs
₹1L - ₹10L / yr
Java
J2EE
API
Java Developer
agile
+15 more

We have an immediate requirement for a Java Developer role in the Pune location. Please find the details below:

Role: Java Developer

Experience: 3–4 Years (Mandatory)

Location: Pune

Joining: Immediate joiners only


Key Responsibilities:

  • Develop and maintain scalable and robust J2EE applications
  • Follow and implement coding standards within the project
  • Integrate with third-party APIs and services
  • Work in an Agile environment to design and implement new features
  • Support team members in resolving technical issues
  • Debug and resolve production issues (code/infrastructure)
  • Communicate effectively with team members and product management

Mandatory Skills:

  • Strong knowledge of Java and JEE internals (Class Loading, Memory Management, Transaction Management, etc.)
  • Expertise in OOPs/OOAD concepts and design patterns
  • Hands-on experience with Spring Framework and Web Services
  • Basic knowledge of JavaScript, jQuery, AJAX, and DOM
  • Good understanding of SQL, relational databases, and ORM (Hibernate/DAO)
  • Strong problem-solving skills and communication abilities

Important Note:

  • Interview is scheduled for Monday
  • Selected candidates are expected to join by Tuesday or Wednesday
Searce Inc
Posted by Srishti Dani
Mumbai, Pune, Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Data migration
Datawarehousing
ETL
SQL
Google Cloud Platform (GCP)
+7 more

Lead Data Engineer


What are we looking for?

real solver?

Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.


Your Responsibilities

What you will wake up to solve.

  • Lead Technical Design & Data Architecture: Architect and lead the end-to-end development of scalable, cloud-native data platforms. You’ll guide the squad on critical architectural decisions—choosing between Batch vs. Streaming or ETL vs. ELT—while remaining 100% hands-on, contributing high-quality, production-grade code.
  • Build High-Velocity Data Pipelines: Drive the implementation of robust data transports and ingestion frameworks using Python, SQL, and Spark. You will build integration layers that connect heterogeneous sources (SaaS, RDBMS, NoSQL) into unified, high-availability environments like BigQuery, Snowflake, or Redshift.
  • Mentor & Elevate the Squad: Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling (Star/Snowflake schema), and ensure the squad adopts modern engineering standards like CI/CD for data.
  • Drive AI-Ready Data Strategy: Be the expert in designing data foundations optimized for AI and Machine Learning. You will champion the use of GCP (Dataflow, Pub/Sub, BigQuery) and AWS (Lambda, Glue, EMR) to create "clean room" environments that fuel advanced analytics and generative AI models.
  • Partner with Clients as a Technical DRI: Act as the Directly Responsible Individual for client success. Translate ambiguous business questions into elegant data services, manage project deliverables using Agile methodologies, and ensure that the data provided is accurate, consistent, and mission-critical.
  • Troubleshoot & Optimize for Scale: Own the reliability of the reporting layer. You will proactively monitor pipelines, troubleshoot complex transformation bottlenecks, and propose ways to improve platform performance and cost-efficiency.
  • Innovate and Build Reusable IP: Spearhead the creation of reusable data frameworks, custom operators, and transformation libraries that accelerate future projects and establish Searce’s unique technical advantage in the market.
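
As a toy illustration of the ingest-then-transform (ELT) pattern described above (not Searce's actual stack; the source rows, table, and column names are invented), raw source records can be loaded into the warehouse first and then transformed with SQL:

```python
import sqlite3

# Hypothetical raw events pulled from a SaaS source
raw_events = [
    {"user": "a", "region": "EU", "value": "10"},
    {"user": "b", "region": "US", "value": "7"},
    {"user": "a", "region": "EU", "value": "3"},
]

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_events (user TEXT, region TEXT, value INTEGER)")

# Load raw rows first (the "EL" of ELT), casting types on the way in
warehouse.executemany(
    "INSERT INTO fact_events VALUES (?, ?, ?)",
    [(e["user"], e["region"], int(e["value"])) for e in raw_events],
)

# Transform inside the warehouse with SQL: a per-region rollup for reporting
rollup = warehouse.execute(
    "SELECT region, SUM(value) FROM fact_events GROUP BY region ORDER BY region"
).fetchall()
print(rollup)  # [('EU', 13), ('US', 7)]
```

In production the same shape appears with BigQuery, Snowflake, or Redshift as the warehouse and an orchestrator scheduling the loads.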


Welcome to Searce


The AI-Native tech consultancy that's rewriting the rules.

Searce is an AI-native, engineering-led, modern tech consultancy that empowers clients to futurify their business by delivering intelligent, impactful, real business outcomes. Searce solvers co-innovate with clients as their trusted transformational partners, ensuring sustained competitive advantage. Searce clients realize smarter, faster, better business outcomes, delivered by AI-native Searce solver squads.


Functional Skills 

the solver personas.

  • The Data Architect: This persona deconstructs ambiguous business goals into scalable, elegant data blueprints. They don't just move data; they design the foundation—from schema design to partitioning strategies—that allows data scientists and analysts to thrive, foreseeing technical bottlenecks and making pragmatic trade-offs.
  • The Player-Coach: As a hands-on leader, this persona leads from the front by writing exemplary, production-grade SQL and Python while simultaneously mentoring and elevating the skills of the squad. Their success is measured by the team's ability to deliver high-quality, maintainable code and their growth as engineers.
  • The Pragmatic Innovator: This individual balances a passion for modern data tech (like Generative AI and Real-time Streaming) with a sharp focus on business outcomes. They champion new tools where they add real value but are disciplined enough to choose stable, cost-effective solutions to meet deadlines and deliver robust products.
  • The Client-Facing Technologist: This persona acts as the crucial technical bridge between the data squad and the client. They build trust by listening actively, explaining complex data concepts (like data latency or idempotency) in simple terms, and demonstrating how engineering decisions align with the client’s strategic goals.
  • The Quality Craftsman: This individual possesses an unwavering commitment to data integrity and treats data engineering as a craft. They are the guardian of the reporting layer, advocating for robust testing, data validation frameworks, and clean, modular code to ensure the long-term reliability of the data platform.


Experience & Relevance 

  • Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.
  • Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.
  • AI-Native Workflow: You don’t just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.
  • Architectural Portfolio: Evidence of leading 2-3 large-scale transformations—including platform migrations, data lakehouse builds, or real-time analytics architectures.
  • Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO’s business vision into a Lead Engineer’s technical specification without losing anything in translation.


Join the ‘real solvers’

ready to futurify?

If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’. Don’t Just Send a Resume. Send a Statement.





Risosu Consulting LLP
Remote only
2 - 4 yrs
₹6L - ₹9L / yr
Data Analytics
Artificial Intelligence (AI)
Machine Learning (ML)
Python
API
+1 more

Job Title: Data Analyst (AI/ML Exposure)

Experience: 1–3 Years

Location: Mumbai

Job Description:

We are looking for a Data Analyst with strong experience in data handling, analysis, and visualization, along with exposure to AI/ML concepts. The role involves working with structured and unstructured data (SQL, CSV, JSON), building data pipelines, performing EDA, and deriving actionable insights. Candidates should have hands-on experience with Python (Pandas, NumPy), data visualization tools, and basic knowledge of NLP/LLMs. Exposure to APIs, data-driven applications, and client interaction will be an added advantage.
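
A minimal sketch of the kind of EDA described above, using only the standard library for illustration (in practice Pandas/NumPy would be the tools of choice, and the inline data here is invented):

```python
import csv
import io
import statistics

# A tiny inline CSV standing in for a real structured dataset
csv_text = "order_id,amount\n1,120\n2,80\n3,200\n4,80\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
amounts = [float(r["amount"]) for r in rows]

# Basic EDA: central tendency and spread before any modelling
summary = {
    "n": len(amounts),
    "mean": statistics.mean(amounts),
    "median": statistics.median(amounts),
    "stdev": round(statistics.stdev(amounts), 2),
}
print(summary)
```

The same summary in Pandas would be a one-liner (`df["amount"].describe()`), but the steps are the same: parse, type-cast, then profile the distribution.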

Skills Required: Python, SQL, Data Analysis, EDA, Visualization, APIs

Apply: Share your resume or connect with us.


Appiness Interactive
Chennai
6 - 12 yrs
₹10L - ₹24L / yr
Python
PowerBI
SQL
databricks
Data Warehouse (DWH)
+1 more

Overview


We are looking for a highly skilled Lead Data Engineer with strong expertise in Data Warehousing & Analytics to join our team. The ideal candidate will have extensive experience in designing and managing data solutions, advanced SQL proficiency, and hands-on expertise in Python and Power BI.


Skills : Python, Databricks, SQL


Key Responsibilities:


  • Design, develop, and maintain scalable data warehouse solutions.
  • Write and optimize complex SQL queries for data extraction, transformation, and reporting.
  • Develop and automate data pipelines using Python.
  • Work with AWS cloud services for data storage, processing, and analytics.
  • Collaborate with cross-functional teams to provide data-driven insights and solutions.
  • Ensure data integrity, security, and performance optimization.
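
One example of the kind of complex SQL this role involves: deduplicating a staging table down to the latest record per key with a window function. This is a sketch using Python's sqlite3 as a stand-in warehouse; the staging table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO stg_customers VALUES (?, ?, ?)",
    [
        (1, "Asha", "2024-01-01"),
        (1, "Asha K", "2024-02-01"),   # later version of the same customer
        (2, "Ravi", "2024-01-15"),
    ],
)

# Keep only the most recent row per id: a common warehouse dedup pattern
latest = conn.execute("""
    SELECT id, name FROM (
        SELECT id, name,
               ROW_NUMBER() OVER (PARTITION BY id ORDER BY loaded_at DESC) AS rn
        FROM stg_customers
    ) WHERE rn = 1
    ORDER BY id
""").fetchall()
print(latest)  # [(1, 'Asha K'), (2, 'Ravi')]
```

The identical `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` idiom works in Databricks SQL and most warehouses.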

 


Required Skills & Experience:


  • 6-10 years of experience in Data Warehousing & Analytics (mandatory)
  • Strong experience in Databricks (mandatory)
  • Strong proficiency in writing complex SQL queries with deep understanding of query optimization, stored procedures, and indexing.
  • Hands-on experience with Python for data processing and automation.
  • Experience working with AWS cloud services.
  • Hands-on experience with reporting tools like Power BI or Tableau.
  • Ability to work independently and collaborate with teams across different time zones.


CAW.Tech
Posted by Ranjana Singh
Hyderabad
4 - 6 yrs
Best in industry
PHP
Laravel
Object Oriented Programming (OOPs)
MVC Framework
Design patterns
+4 more

We are looking for a Staff Engineer - PHP to join one of our engineering teams at our office in Hyderabad.


What would you do?

  • Design, build, and maintain backend systems and APIs from requirements to production.
  • Own feature development, bug fixes, and performance optimizations.
  • Ensure code quality, security, testing, and production readiness.
  • Collaborate with frontend, product, and QA teams for smooth delivery.
  • Diagnose and resolve production issues and drive long-term fixes.
  • Contribute to technical discussions and continuously improve engineering practices.


Who Should Apply?

  • 4–6 years of hands-on experience in backend development using PHP.
  • Strong proficiency with Laravel or similar PHP frameworks, following OOP, MVC, and design patterns.
  • Solid experience in RESTful API development and third-party integrations.
  • Strong understanding of SQL databases (MySQL/PostgreSQL); NoSQL exposure is a plus.
  • Comfortable with Git-based workflows and collaborative development.
  • Working knowledge of HTML, CSS, and JavaScript fundamentals.
  • Experience with performance optimization, security best practices, and debugging.
  • Nice to have: exposure to Docker, CI/CD pipelines, cloud platforms, and automated testing.


NeoGenCode Technologies Pvt Ltd
Gurugram, Vadodara
5 - 10 yrs
₹6L - ₹20L / yr
Manual testing
Test Automation (QA)
Crypto Exchange
Selenium
cypress
+8 more

Job Title : Senior QA Engineer (Crypto Exchange Platform)

Experience : 5+ Years

Location : Gurugram & Vadodara

Employment Type : Full-Time


About the Company :

We are a fast-growing crypto exchange platform building secure, scalable, and high-performance trading systems with real-time data and wallet infrastructure.


Role Overview :

We are looking for a Senior QA Engineer to ensure the quality, reliability, and security of our platform. You’ll work on web, mobile, and backend systems, focusing on APIs, trading engines, and real-time systems in a fast-paced agile environment.


Mandatory Skills :

5+ years in QA with strong manual & automation testing, experience in Selenium/Cypress/Playwright, API testing (Postman/REST Assured), CI/CD (Jenkins/GitHub Actions), SQL, and real-time/WebSocket testing.


Key Responsibilities :

  • Create and execute test plans, cases, and strategies
  • Perform functional, regression, integration & API testing
  • Build and maintain automation frameworks
  • Test trading systems, wallets, and real-time data (WebSockets)
  • Track bugs using Jira and collaborate with teams
  • Integrate testing into CI/CD pipelines
  • Ensure performance, stability, and security


Required Skills :

  • Strong experience in automation + functional testing
  • Hands-on with Selenium/Cypress/Playwright
  • Good knowledge of API testing & microservices
  • Experience with CI/CD tools
  • Strong SQL & database validation skills
  • Understanding of Agile & SDLC


Good to Have :

  • Experience in crypto/fintech/trading platforms
  • Knowledge of blockchain, wallets, smart contracts
  • Performance testing (JMeter, K6)
  • Basic security testing knowledge


What We’re Looking For :

  • Strong problem-solving skills
  • Attention to detail
  • Ability to work in a fast-paced environment
  • Good communication & ownership mindset
Service Co

Agency job
via Vikash Technologies by Rishika Teja
Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹10L - ₹13L / yr
ETL QA
IBM Datastage
SQL
Datawarehousing

Key Skills:

  • ETL Testing (Functional, Regression, Integration)
  • IBM DataStage
  • SQL (Joins, Subqueries, Aggregations, Data Validation)
  • Data Warehousing Concepts (Star/Snowflake Schema)
  • Test Case Design & Execution
  • Defect Management (JIRA, HP ALM)
  • Agile & Waterfall Methodologies
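
A sketch of the kind of SQL data validation an ETL tester scripts, reconciling a source table against a loaded target (sqlite3 stands in for the actual source and warehouse; the tables and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
data = [(1, 10.0), (2, 20.5), (3, 5.5)]
conn.executemany("INSERT INTO src VALUES (?, ?)", data)
conn.executemany("INSERT INTO tgt VALUES (?, ?)", data)

# Typical reconciliation checks: row counts, aggregate sums, and
# keys present in source but missing from target
count_src, = conn.execute("SELECT COUNT(*) FROM src").fetchone()
count_tgt, = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()
sum_src, = conn.execute("SELECT SUM(amount) FROM src").fetchone()
sum_tgt, = conn.execute("SELECT SUM(amount) FROM tgt").fetchone()
missing = conn.execute("SELECT id FROM src EXCEPT SELECT id FROM tgt").fetchall()

assert count_src == count_tgt and sum_src == sum_tgt and not missing
print("source and target reconcile")
```

Count, sum, and `EXCEPT` (minus) comparisons are the bread and butter of validating a DataStage load before deeper column-level checks.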


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore)
4 - 10 yrs
₹10L - ₹30L / yr
Python
SQL
Spark
Amazon Web Services (AWS)
Amazon S3
+13 more

Job Title : AWS Data Engineer

Experience : 4+ Years

Location : Bengaluru (HSR – Hybrid, 3 Days WFO)

Notice Period : Immediate Joiner


💡 Role Overview :

We are looking for a skilled AWS Data Engineer to design, build, and scale modern data platforms. The role involves working with AWS-native services, Python, Spark, and DBT to deliver secure, scalable, and high-performance data solutions in an Agile environment.


🔥 Mandatory Skills :

Python, SQL, Spark, AWS (S3, Glue, EMR, Redshift, Athena, Lambda), DBT, ETL/ELT pipeline development, Airflow/Step Functions, Data Lake (Parquet/ORC/Iceberg), Terraform & CI/CD, Data Governance & Security


🚀 Key Responsibilities :

  • Design, build, and optimize ETL/ELT pipelines using Python, DBT, and AWS services
  • Develop and manage scalable data lakes on S3 using formats like Parquet, ORC, and Iceberg
  • Build end-to-end data solutions using Glue, EMR, Lambda, Redshift, and Athena
  • Implement data governance, security, and metadata management using Glue Data Catalog, Lake Formation, IAM, and KMS
  • Orchestrate workflows using Airflow, Step Functions, or AWS-native tools
  • Ensure reliability and automation via CloudWatch, CloudTrail, CodePipeline, and Terraform
  • Collaborate with data analysts and data scientists to deliver actionable insights
  • Work in an Agile environment to deliver high-quality data solutions

✅ Mandatory Skills :

  • Strong Python (including AWS SDKs), SQL, Spark
  • Hands-on experience with AWS data stack (S3, Glue, EMR, Redshift, Athena, Lambda)
  • Experience with DBT and ETL/ELT pipeline development
  • Workflow orchestration using Airflow / Step Functions
  • Knowledge of data lake formats (Parquet, ORC, Iceberg)
  • Exposure to DevOps practices (Terraform, CI/CD)
  • Strong understanding of data governance and security best practices
  • Minimum 4–7 years in Data Engineering (3+ years on AWS)

➕ Good to Have :

  • Understanding of Data Mesh architecture
  • Experience with platforms like Data.World
  • Exposure to Hadoop / HDFS ecosystems

🤝 What We’re Looking For :

  • Strong problem-solving and analytical skills
  • Ability to work in a collaborative, cross-functional environment
  • Good communication and stakeholder management skills
  • Self-driven and adaptable to fast-paced environments

📝 Interview Process :

  1. Online Assessment
  2. Technical Interview
  3. Fitment Round
  4. Client Round
Vikgol
Remote only
3 - 5 yrs
₹15L - ₹18L / yr
SQL
Python
Linux/Unix
Large Language Models (LLM) tuning
Machine Learning (ML)
+1 more

Python Developer (Performance Optimization Focus)

Experience: 3–5 Years

Location: Remote (India-based candidates only)

Employment Type: Full-time


Role Overview

We are seeking a Python Developer with a strong focus on performance optimization and system efficiency. In this role, you will identify bottlenecks, enhance system performance, and contribute to building scalable, high-performance applications in a Linux-based environment.


Key Responsibilities

  • Analyze and troubleshoot performance bottlenecks in applications and systems
  • Optimize code, database queries, and architecture for scalability and speed
  • Design, develop, test, and maintain robust Python applications
  • Work with large datasets and improve data processing efficiency
  • Collaborate with cross-functional teams to improve system reliability and performance
  • Monitor system performance and implement proactive improvements
  • Write clean, maintainable, and efficient code following best practices
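
As a small example of the optimization work described above, a classic algorithmic fix: replacing repeated list membership scans with a set lookup (the dataset sizes here are illustrative):

```python
import timeit

data = list(range(5_000))
needles = list(range(0, 5_000, 7))

# Naive version: each membership test scans the list -> O(n) per lookup
def slow_lookup():
    return sum(1 for n in needles if n in data)

# Optimized version: build a set once -> O(1) average per lookup
data_set = set(data)

def fast_lookup():
    return sum(1 for n in needles if n in data_set)

assert slow_lookup() == fast_lookup()  # same result, very different cost
t_slow = timeit.timeit(slow_lookup, number=5)
t_fast = timeit.timeit(fast_lookup, number=5)
print(f"list scan: {t_slow:.4f}s, set lookup: {t_fast:.4f}s")
```

Measuring before and after with `timeit` (or `cProfile` for whole programs) is the discipline this role asks for: find the bottleneck, change the data structure or query, and prove the win with numbers.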


Required Skills & Qualifications

  • 3–5 years of hands-on experience in Python development
  • Strong expertise in performance tuning and optimization techniques
  • Experience with debugging and profiling tools
  • Solid understanding of data structures and algorithms
  • Experience with REST APIs and backend development
  • Strong analytical and problem-solving skills


Linux & System Knowledge (Must-Have)

  • Comfortable working in Linux/Unix environments
  • Command-line proficiency, including:
  • File editing (vi, nano)
  • File permissions (chmod, chown)
  • File downloads (wget, curl)
  • Basic file and directory operations


Basic Python Knowledge (Interview Scope)

  • Writing simple scripts and reusable functions
  • String manipulation and data handling
  • Example task: Count words in a file/string efficiently
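
The example task above might be sketched like this, streaming the file line by line so large inputs never sit fully in memory (function names are ours, not part of the job spec):

```python
from collections import Counter

def count_words_in_string(text):
    """Whitespace-tokenized, case-insensitive word counts."""
    return Counter(text.lower().split())

def count_words_in_file(path):
    """Stream the file line by line so large files are never fully loaded."""
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            counts.update(line.lower().split())
    return counts

counts = count_words_in_string("the quick brown fox jumps over the lazy dog")
print(counts.most_common(2))  # 'the' appears twice, everything else once
```

`Counter` keeps the solution short while the per-line loop keeps memory flat, which is usually the point of the "efficiently" in the prompt.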


Good to Have

  • Familiarity with AI/ML concepts or tools
  • Experience optimizing data-intensive or distributed systems
  • Exposure to cloud platforms (AWS, GCP, Azure)


Why Join Us

  • Work on performance-critical systems with real-world impact
  • Fully remote work environment
  • Opportunity to work with modern, scalable technologies
  • Collaborative, growth-focused team culture


WeAssemble
Posted by Meghal Majithia
Mumbai
3 - 6 yrs
₹5L - ₹8L / yr
Selenium
Playwright
SQL
Test Automation (QA)

We are looking for a highly skilled QA Automation Engineer with at least 3 years of experience to join our dynamic team in Mumbai. The ideal candidate should be proactive, detail-oriented, and ready to hit the ground running.


Company Name: WeAssemble

Reach us: www.weassemble.team

Location: One International Centre, Prabhadevi, Mumbai

Working days: Monday–Friday (Sat & Sun fixed off)

*Key Responsibilities:*

* Design, develop, and execute automated test scripts using industry-standard tools and frameworks.

* Collaborate with developers, business analysts, and product managers to ensure product quality.

* Conduct functional, non-functional, API, and integration testing.

* Implement and maintain automation frameworks.

* Contribute to continuous improvement in QA processes.

*Required Skills & Experience:*

* Strong experience in Playwright with JavaScript.

* API Testing Automation (Postman, REST Assured, or equivalent).

* Hands-on experience with CI/CD pipelines (Jenkins, GitHub Actions, GitLab, or similar).

* Solid understanding of software QA methodologies, tools, and processes.

* Ability to identify, log, and track bugs effectively.

* Strong problem-solving and analytical skills.

*Good to Have:*

* Knowledge of performance testing tools.

* Familiarity with cloud platforms (AWS, Azure, or GCP).

Cglia Solutions LLP
Posted by Rajana Harika
Hyderabad
1 - 3 yrs
₹2.4L - ₹4L / yr
Linux administration
Amazon Web Services (AWS)
Docker
SQL
PL/SQL
+2 more



Experience: 1–3 Years

Qualification: B.Tech (Computer Science / IT or related field)

Shift Timing: 5:00 PM – 2:00 AM (Late Evening Shift)

Location: Hyderabad 


Job Summary


We are seeking a proactive and detail-oriented Application Support Engineer with 1–3 years of experience in Linux/Windows environments, application servers, and monitoring tools. The candidate will be responsible for ensuring the stability, performance, and availability of applications, along with providing L2/L3 support in a fast-paced production environment.

Key Responsibilities :

  • Provide application support and incident management for production systems.
  • Monitor system performance using hardware/software monitoring and trending tools.
  • Troubleshoot issues in Linux and Windows environments.
  • Manage and support Apache and Tomcat servers.
  • Analyze logs and debug application/system issues.
  • Work on SQL/Oracle databases for query execution, troubleshooting, and performance tuning.
  • Handle deployments and support CI/CD pipelines using tools like Docker and Jenkins.
  • Ensure SLA adherence and timely resolution of incidents and service requests.
  • Coordinate with development, infrastructure, and database teams for issue resolution.
  • Maintain documentation for incidents, processes, and knowledge base articles.
  • Support SaaS applications hosted in data center environments. 

Required Skills :

  • Strong knowledge of Linux and Windows OS administration
  • Experience with Apache and Tomcat servers
  • Hands-on experience with monitoring and alerting tools
  • Good understanding of log analysis and troubleshooting techniques
  • Working knowledge of SQL / Oracle databases
  • Familiarity with Docker and Jenkins (CI/CD pipelines)
  • Understanding of ITIL processes (Incident, Problem, Change Management)
  • Knowledge of SaaS applications and data center operations


Preferred Skills :

  • Experience with automation/scripting (Shell, Python, etc.)
  • Exposure to cloud platforms (AWS/Azure/GCP) is a plus
  • Basic networking knowledge


Soft Skills :

  • Strong analytical and problem-solving abilities
  • Good communication skills
  • Ability to work in night shifts and handle production support
  • Team player with a proactive attitude

Deqode
Posted by purvisha Bhavsar
Pune
4 - 6 yrs
₹5L - ₹12L / yr
Angular
SQL
JavaScript
HTML/CSS

👉 Job Title: Angular Developer

🌟 Experience: 4 Years

💡 Location: Pune (Hybrid)

👉 Notice Period: Immediate joiners

(Candidates serving notice period are preferred)

💫 Interview Mode: Walk-in interview (Baner location)


Job Overview

We are looking for a skilled Angular Developer with 4 years of experience to join our dynamic development team. The ideal candidate will have strong expertise in Angular, JavaScript, and SQL, with the ability to build high-performance, scalable web applications. This is a hybrid role based in Pune.


Key Responsibilities

  • Develop and maintain responsive web applications using Angular.
  • Write clean, scalable, and efficient JavaScript code.
  • Collaborate with cross-functional teams including designers, backend developers, and product managers.
  • Integrate frontend applications with backend services and APIs.
  • Optimize applications for maximum speed and scalability.
  • Troubleshoot, debug, and upgrade existing applications.
  • Work with SQL databases for data querying and manipulation.
  • Ensure code quality through best practices, code reviews, and testing.


Required Skills & Qualifications

  • 4 years of hands-on experience in Angular development.
  • Strong proficiency in JavaScript (ES6+).
  • Solid understanding of HTML5, CSS3, and responsive design.
  • Experience working with SQL databases.
  • Familiarity with RESTful APIs and web services.
  • Knowledge of version control systems like Git.
  • Strong problem-solving and analytical skills.
  • Good communication and teamwork abilities.


Read more
Educational Co

Educational Co

Agency job
via Vikash Technologies by Rishika Teja
Bengaluru (Bangalore)
4 - 6 yrs
₹15L - ₹30L / yr
skill iconNodeJS (Node.js)
TypeScript
skill iconJavascript
SQL

Requirements :


4-6 years of experience as a backend developer.


Strong in Node.js, TypeScript/JavaScript, and SQL (MySQL or similar RDBMS); SQL is mandatory.


Proven track record of independently building and owning features or modules in production systems.


Strong grasp of web fundamentals: HTTP, REST APIs, authentication, request-response lifecycle.


Experience working on Linux-based environments and Git/GitHub workflows.


Awareness of how their module fits into the larger product architecture and business goals.

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Remote only
5 - 10 yrs
₹6.5L - ₹22L / yr
Tableau
SQL
Business Intelligence (BI)
Thoughtspot


Hiring: Senior BI Architect

Experience: 5+ Years

Work Mode :- Remote

Notice Period :- Immediate Joiners

( Or Candidate Serving Notice period )


About the Role

We are looking for a seasoned Senior Business Intelligence Architect who can bridge the gap between deep technical engineering and functional business insight delivery. This role demands an architect who doesn't just build dashboards, but designs enterprise-grade BI ecosystems that drive executive decision-making at scale.


4 Mandatory Skills

These are non-negotiable. Candidates without proficiency in all four will not be considered.

  1. Tableau — Advanced dashboard design, REST API integration, JavaScript SDK embedding, extract optimization, and CI/CD deployment via Git.
  2. SQL — Expert-level query tuning, complex joins, incremental refresh strategies, and performance optimization for large-scale data models.
  3. Power BI — with the ability to architect migration paths or co-existence strategies for self-serve and executive reporting.
  4. ThoughtSpot — with the ability to architect migration paths or co-existence strategies for self-serve and executive reporting.
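The "incremental refresh strategies" called out above usually come down to tracking a high-water mark so each load pulls only new or changed rows. A minimal sketch using Python's built-in sqlite3; the table and column names are illustrative, not from any specific stack:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales_src (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE sales_extract (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_ts TEXT);
    INSERT INTO sales_src VALUES (1, 100.0, '2024-01-01'), (2, 250.0, '2024-01-03');
    INSERT INTO etl_watermark VALUES ('sales_src', '2024-01-02');
""")

def incremental_refresh(con):
    # Read the last high-water mark, pull only newer rows, then advance the mark.
    (last_ts,) = con.execute(
        "SELECT last_ts FROM etl_watermark WHERE table_name = 'sales_src'"
    ).fetchone()
    rows = con.execute(
        "SELECT id, amount, updated_at FROM sales_src WHERE updated_at > ?",
        (last_ts,),
    ).fetchall()
    con.executemany("INSERT OR REPLACE INTO sales_extract VALUES (?, ?, ?)", rows)
    if rows:
        con.execute(
            "UPDATE etl_watermark SET last_ts = ? WHERE table_name = 'sales_src'",
            (max(r[2] for r in rows),),
        )
    return len(rows)

print(incremental_refresh(con))  # prints 1: only the row newer than the watermark is copied
```

The same pattern underlies Tableau incremental extracts: the extract engine stores the last refresh key and filters the source query against it.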


Key Responsibilities


  • Lead end-to-end design and publishing of sophisticated Tableau and Cognos solutions with a focus on interactivity and executive-grade narrative storytelling.
  • Own the full optimization lifecycle — advanced query tuning, incremental extract strategies, and dashboard load-time reduction.
  • Architect seamless "Analytics as a Service" integrations by embedding BI content into external applications via Tableau REST APIs and JavaScript SDKs.
  • Provide expert guidance on Power BI and ThoughtSpot migration strategies and co-existence models for enterprise reporting.
  • Elevate data storytelling through UX principles, accessibility standards, and custom visualizations using D3.js for high-impact mapping and charting.
  • Implement CI/CD pipelines for BI releases using Git, ensuring rigorous version control and deployment governance for all dashboard assets.
  • Define and track performance analytics and usage metrics to measure dashboard ROI and drive organization-wide adoption.


Required Skills & Qualifications

  • 5–10 years of hands-on experience in BI development and architecture.
  • Deep expertise in Tableau, Power BI, ThoughtSpot, and SQL.
  • Proficiency in Tableau REST API and JavaScript SDK for embedded analytics.
  • Strong understanding of DataOps frameworks and scalable data pipeline design.
  • Experience with Git-based CI/CD workflows for BI asset management.
  • Familiarity with D3.js or other custom visualization libraries.
  • Excellent communication skills to translate complex data into compelling business narratives for executive stakeholders.



Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Hyderabad
6 - 10 yrs
₹10L - ₹30L / yr
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconReact Native
skill iconAngular (2+)
SQL
+14 more

Key Requirements / Skills

  • 6+ years of overall experience in software development with strong expertise in building scalable web applications.
  • 2+ years of experience as a Technical Lead, managing development teams and driving project delivery.
  • Strong technical decision-making ability, including architecture design, technology selection, and implementation of best practices.
  • Front-end expertise: Strong experience in React, JavaScript, TypeScript, and building responsive and user-friendly UI/UX.
  • Back-end development: Hands-on experience with Node.js, RESTful APIs, API design, and server-side architecture.
  • AI/ML knowledge: Experience in implementing AI/ML models or integrating AI-based solutions to solve business problems.
  • Cloud & DevOps exposure: Experience with AWS/Azure, understanding of CI/CD pipelines, and cloud-based deployments.
  • Code quality & best practices: Experience in code reviews, Git version control, and ensuring maintainable and secure code.
  • Team leadership: Ability to mentor developers, guide technical discussions, and collaborate across teams.
  • Strong communication skills to effectively interact with technical and non-technical stakeholders.
  • Experience working in high-compliance environments such as healthcare systems is a plus.


Education Qualifications:

  • B.Tech/M.Tech in CSE/IT/AI/ML from a good university
Read more
Quantiphi

at Quantiphi

3 candid answers
1 video
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore), Mumbai, Trivandrum
4 - 7 yrs
Upto ₹30L / yr (Varies)
Google Cloud Platform (GCP)
SQL
ETL
Datawarehousing
Data-flow analysis

We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.


Key Responsibilities

  • Collaborate with business users and stakeholders to understand business processes and data requirements
  • Design and implement dimensional data models, including fact and dimension tables
  • Identify, design, and implement data transformation and cleansing logic
  • Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
  • Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
  • Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
  • Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
  • Provide high-level design, research, and effort estimates for data integration initiatives
  • Provide production support for ETL processes to ensure data availability and SLA adherence
  • Analyze and resolve data pipeline and performance issues
  • Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
  • Translate business requirements into well-defined technical data specifications
  • Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
  • Define and document BI usage through use cases, prototypes, testing, and deployment
  • Support and enhance data governance and data quality processes
  • Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
  • Train and support business users, IT analysts, and developers
  • Lead and collaborate with teams spread across multiple locations

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science or a related field, or equivalent work experience
  • 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
  • Strong expertise in data warehousing concepts, tools, and best practices
  • Excellent SQL skills
  • Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
  • Hands-on experience with Google Cloud Platform (GCP) services, including:
  1. BigQuery
  2. Cloud SQL
  3. Cloud Composer (Airflow)
  4. Dataflow
  5. Dataproc
  6. Cloud Functions
  7. Google Cloud Storage (GCS)
  • Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
  • Strong experience integrating data using APIs, XML, JSON, and similar formats
  • In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
  • Solid understanding of SDLC, Agile, and Scrum methodologies
  • Strong problem-solving, multitasking, and organizational skills
  • Experience handling large-scale datasets and database design
  • Strong verbal and written communication skills
  • Experience leading teams across multiple locations

Good to Have

  • Experience with SSRS and SSIS
  • Exposure to AWS and/or Azure cloud platforms
  • Experience working with enterprise BI and analytics tools

Why Join Us

  • Opportunity to work on large-scale, enterprise data platforms
  • Exposure to modern cloud-native data engineering technologies
  • Collaborative environment with strong stakeholder interaction
  • Career growth and leadership opportunities
Read more
MNC Company

MNC Company

Agency job
via Techno Wise by Chanchal Amin
Pune
5 - 9 yrs
₹7L - ₹30L / yr
MVC Framework
skill iconAngular (2+)
Microsoft Windows Azure
SQL
Microservices
+11 more

Essential Functions/Responsibilities

• Provide hands-on development in the application development, unit test, and rollout of strategic web and mobile initiatives.

• Develop both front-end and back-end for web/mobile applications, working with a hybrid internal/vendor team, to support various lines of business and functional areas.

• Work with Business Owners and Business Analysis teams to create business requirements.

• Document technical requirements and technical specifications for web/mobile applications (and related integrated solutions) and provide technical solutions to support those needs.

• Provide feedback (and approval) on technical designs and methods to support business requirements.

• Effectively communicate relevant project planning and status information to leadership/management.

• Deliver engaging, informative, well-organized demos/presentations that are effectively tailored to the intended audience, as needed.

Read more
LogIQ Labs Pvt.Ltd.
Remote only
4 - 5 yrs
₹8L - ₹14L / yr
skill icon.NET
skill iconReact.js
SQL

Job Summary: We are seeking a highly skilled and experienced Senior .NET Developer to join our dynamic development team. The ideal candidate will have a strong background in developing robust, scalable, and high-performance applications using the Microsoft .NET framework, coupled with significant expertise in SQL Server. You will be instrumental in designing, developing, and maintaining complex software solutions, collaborating with cross-functional teams, and mentoring junior developers.

Responsibilities:

  • Design, develop, test, deploy, and maintain high-quality, scalable, and secure applications using C#, .NET/.NET Core, and related technologies.
  • Lead the development of key modules and features, ensuring adherence to coding standards, best practices, and architectural guidelines.
  • Collaborate with product owners, business analysts, and other stakeholders to understand requirements, translate them into technical specifications, and propose effective solutions.
  • Develop and optimize complex SQL queries, stored procedures, functions, and database schemas for optimal performance and data integrity.
  • Perform code reviews, provide constructive feedback, and ensure the quality and maintainability of the codebase.
  • Troubleshoot, debug, and resolve software defects and production issues in a timely manner.
  • Actively participate in the entire software development life cycle (SDLC), including requirements gathering, design, development, testing, deployment, and support.
  • Mentor and guide junior developers, fostering their growth and ensuring best practices are followed.
  • Stay up-to-date with emerging technologies and industry trends, evaluating and recommending new tools and practices to improve development efficiency and product quality.
  • Contribute to the continuous improvement of development processes and methodologies.

Required Skills and Experience:

  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
  • 4+ years of professional experience in software development with a strong focus on the Microsoft .NET ecosystem.
  • Proficiency in C# and extensive experience with .NET Framework (4.x) and/or .NET Core/.NET 5+.
  • Solid understanding and practical experience with ASP.NET MVC and Web API.
  • Strong SQL skills with proven experience in designing databases, writing complex queries, stored procedures, functions, and optimizing database performance in Microsoft SQL Server.
  • Experience with front-end technologies such as HTML5, CSS3, JavaScript, jQuery, and at least one modern JavaScript framework (e.g., Angular, React, Vue.js) is a plus.
  • Familiarity with ORM frameworks such as Entity Framework or Dapper.
  • Experience with version control systems, particularly Git.
  • Understanding of object-oriented programming (OOP) principles, design patterns, and software development best practices.
  • Excellent problem-solving, analytical, and debugging skills.
  • Strong communication and interpersonal skills.


Read more
AtDrive Group
Shreya Pareek
Posted by Shreya Pareek
Remote only
1 - 2 yrs
₹3L - ₹4.2L / yr
ASP.NET MVC
RESTful APIs
SQL
ASP.NET AJAX
skill iconHTML/CSS
+4 more


About Us

AtDrive Infotech is a forward-thinking IT company delivering innovative and scalable software solutions using modern technologies. We focus on building reliable, secure, and intelligent applications that enhance business performance.

We are looking for a motivated Software Development Engineer (ASP.NET) who is eager to learn, contribute to development projects, and work on modern technologies including AI-integrated applications.

Key Responsibilities

  • Assist in designing, developing, and maintaining ASP.NET-based applications.
  • Participate in the software development lifecycle, including coding, testing, debugging, and deployment.
  • Develop and maintain ASP.NET MVC and Web API applications.
  • Work on frontend development using HTML5, CSS3, JavaScript, and jQuery.
  • Develop and integrate REST APIs and third-party services.
  • Write efficient SQL queries and assist in database development.
  • Debug technical issues and optimize application performance.
  • Perform real-time testing during development to ensure code quality.
  • Collaborate with team members and follow coding standards.
  • Support deployment and maintenance activities.

Qualifications

  • Bachelor’s degree in Computer Science, IT, or related field.
  • 1–2 years of experience in ASP.NET development.

Required Technical Skills

  • Good knowledge of C# and .NET Framework.
  • Experience with ASP.NET MVC and Web API development.
  • Basic knowledge of Web Services and API integration.
  • Strong understanding of HTML5 and CSS3.
  • Experience with JavaScript and jQuery.
  • Familiarity with AJAX-based client-server communication.
  • Basic knowledge of SQL Server and Stored Procedures.
  • Familiarity with Entity Framework or LINQ.
  • Understanding of JSON and XML data formats.
  • Basic knowledge of Object-Oriented Programming (OOP) concepts.
  • Familiarity with Git or version control systems.

AI Skills (Desired)

  • Basic understanding of Artificial Intelligence (AI) concepts.
  • Familiarity with using AI-based APIs such as chatbots or automation tools.
  • Willingness to learn and work on AI-integrated applications.

Additional Skills (Preferred)

  • Exposure to AngularJS or similar frontend frameworks.
  • Familiarity with WPF / WinForms concepts.
  • Basic knowledge of n-tier architecture.
  • Understanding of debugging and performance optimization techniques.

Personal Attributes

  • Strong problem-solving and troubleshooting skills.
  • Ability to work in a fast-paced development environment.
  • Good communication and teamwork skills.
  • Self-motivated and eager to learn new technologies.
  • Ability to prioritize tasks and meet deadlines.
  • Strong ownership and accountability mindset.

Roles Expectations

  • Deliver assigned development tasks within deadlines.
  • Maintain clean and maintainable code.
  • Follow development and quality standards.
  • Continuously learn new technologies and frameworks.
  • Support senior developers in technical implementations.

Job Details

Job Title: Software Development Engineer – ASP.NET

Job Type: Full-time

Work Location: Remote

Salary Range:

₹25,000 – ₹35,000 per month

(Negotiable based on skills and technical performance)

Benefits

  • Paid sick leave
  • Paid time off
  • Work from home

Schedule

  • Day shift
  • Monday to Friday (Mon-Sat during Probation)


Read more
httpswwwicloudemscomvlog
Remote only
3 - 6 yrs
₹3L - ₹6L / yr
skill iconPHP
SQL
edtech
Learning Management System (LMS)
skill iconMongoDB
+5 more

Software Engineer – EdTech (PHP)

Experience: 3+ Years

Work Mode: Permanent Work From Home

Role Summary

We are seeking a highly skilled software developer with strong experience in EdTech platforms and education ERP systems. The ideal candidate will have expertise in core PHP/Laravel and database technologies, with hands-on experience in building and scaling education-focused modules such as LMS, online examination systems, admissions, and fee management.

This role focuses on developing scalable, secure, and high-performance solutions for schools, colleges, and online learning platforms.

Key Responsibilities

  • Design, develop, and maintain Education ERP and EdTech platform modules.
  • Build and enhance systems for LMS (Learning Management System), online exams, admissions, fee management, HR, and finance.
  • Develop and optimize REST APIs/GraphQL services for seamless integration with web and mobile platforms.
  • Ensure high performance, scalability, and security for large-scale student and institutional data.
  • Work closely with product, QA, and implementation teams to deliver EdTech features.
  • Conduct code reviews, maintain coding standards, and mentor junior developers.
  • Continuously improve platform capabilities based on EdTech trends and user needs.

Required Skills & Qualifications

  • Strong expertise in Core PHP (Laravel Framework).
  • Solid experience with MySQL, MongoDB, PostgreSQL (database design & optimization).
  • Understanding of EdTech workflows like student lifecycle, course management, and assessments.
  • Frontend basics: JavaScript, jQuery, HTML, CSS (React/Vue is a plus).
  • Experience with REST APIs, GraphQL, and third-party integrations (payment gateways, SMS, and email services).
  • Familiarity with Git/GitHub, Docker, and CI/CD pipelines.
  • Knowledge of cloud platforms (AWS, Azure, GCP) is an advantage.
  • Minimum 3+ years of development experience, with at least 2 years in education ERP/EdTech systems.

Preferred Experience

  • Prior experience working in EdTech companies or education ERP platforms.
  • Deep understanding of LMS, online examination systems, admissions, fees, HR, and finance modules.
  • Experience handling high-traffic educational platforms (e.g., exam portals, live classes, student dashboards).
  • Exposure to scalable architecture for large student/user bases.


Read more
Service Based Company in Mohali and Noida

Service Based Company in Mohali and Noida

Agency job
via WITS Innovation Lab by Prabhnoor Kaur
Noida, Mohali
3 - 6 yrs
₹5L - ₹14L / yr
skill iconSpring Boot
Microservices
skill iconJava
RESTful APIs
SQL
+1 more

•      3+ years of hands-on experience developing and testing highly scalable software.

•      Excellent coding skills in Java 17 or above.

•      Very good understanding of any RDBMS and/or messaging queues

•      Proficient in core Java; solid foundation in object-oriented development and design patterns.

•      Excellent problem-solving skills and attention to detail.

•      Ability to engineer complex features/systems from scratch and drive them to completion.

•      Good knowledge of multiple data storage systems.

•      Prior experience in micro services and event driven architecture.

•      Experience with Spring boot and Spring Security Framework

•      Spring web-flux understanding is desirable

•      Understand OWASP Top 10/CWE, DAST and SAST

Read more
StarApps Studio

at StarApps Studio

2 candid answers
4 products
Shivani Kawade
Posted by Shivani Kawade
Pune
1 - 4 yrs
₹4L - ₹7L / yr
Software Testing (QA)
Test Automation (QA)
SQL
SaaS
Selenium
+4 more

Star Apps is looking for a detail-oriented, technically driven QA Engineer to join our product team in Baner, Pune. In this role, you will move beyond simple bug-finding to focus on the reliability and scalability of our SaaS platform.


The ideal candidate has a "product-first" mindset, having previously worked in SaaS or Product-based environments, and possesses a strong understanding of how data moves through complex systems.


What You’ll Do

  • Test Engineering: Develop and maintain automated test suites (UI and API) to ensure high-quality releases.
  • SaaS Integration: Work closely with Engineering teams to integrate automated tests into the pipeline.
  • Bug Advocacy: Not just reporting bugs, but performing root-cause analysis to help developers fix issues at the source.
  • Collaboration: Participate in sprint planning and design reviews to provide "testability" feedback early in the development cycle.


What We’re Looking For

  • The "Product" Lens: 1–3 years of experience specifically within SaaS or Product companies. You understand that quality isn't just about code, but about the user's journey.
  • Technical Proficiency: Proven experience with automation frameworks (e.g., Selenium, Cypress, Playwright) and API testing tools (e.g., Postman, RestAssured).
  • System Logic: Ability to test backend workflows, including logic involving queues, webhooks, and third-party integrations.
  • On-site Energy: A desire to work out of our Baner office 5 days a week, contributing to a high-collaboration, face-to-face team environment.
  • Education: A degree in Computer Science, IT, or a related technical field.


Preferred Skills

  • Basic understanding of SQL for data verification.
  • Experience with performance testing or security testing basics.
  • Familiarity with cloud platforms (AWS, Azure, or GCP).
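Backend-workflow testing of the kind described above often means verifying third-party webhook payloads rather than clicking through a UI. A self-contained sketch using only Python's standard library; the secret and payload are made up for illustration:

```python
import hashlib
import hmac
import json

SECRET = b"test-secret"  # hypothetical shared secret agreed with the webhook sender

def sign(payload: bytes) -> str:
    # HMAC-SHA256 over the raw body, as many webhook providers do.
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify_webhook(payload: bytes, signature: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(payload), signature)

body = json.dumps({"event": "order.created", "id": 42}).encode()
good = sign(body)

print(verify_webhook(body, good))          # True
print(verify_webhook(body + b"x", good))   # False: a tampered payload fails
```

An automated suite would assert both the happy path and the tampered-payload path, exactly as the two prints above do.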


Why Star Apps?

  • Impact: Your work directly affects our core product and global customer base.
  • Growth: A fast-paced environment where your 1–3 years of experience will quickly scale through mentorship and ownership.
  • Location: Work from the heart of Pune's tech hub in Baner.
Read more
Gradera AI Technologies
Hyderabad
3 - 9 yrs
₹8L - ₹30L / yr
SQL
skill iconPython
Spark
Databricks

Data Quality Engineer

Engineering - Hyderabad, Telangana


About Gradera — Digital Twin & Physical AI Platform 

At Gradera, we are building a next-generation Digital Twin and Physical AI platform that enables enterprises to model, simulate, and optimize complex real-world systems. Our work brings together strategy, architecture, data, simulation, and experience design to power decision-making across large-scale operational environments such as manufacturing, logistics, and supply chain networks. 

 

This platform-led initiative applies AI-native execution, advanced simulation, and governed orchestration to help organizations test scenarios, predict outcomes, and continuously improve performance. We operate with an enterprise-first mindset prioritizing reliability, transparency, and measurable business impact as we build intelligent systems that scale beyond a single industry or use case. 

Data Quality Engineer 


Overview 

We are seeking a detail-oriented Data Quality Engineer to ensure the integrity, accuracy, and reliability of data powering our digital twin and AI platforms. You will design and implement data quality frameworks, build automated validation pipelines, and establish quality metrics that enable trusted, simulation-ready data products. This role is critical to ensuring that operational decisions and ML models are built on a foundation of high-quality, governed data. 

Our core data quality stack includes: 

Data Quality Frameworks 

  • Delta Live Tables expectations for declarative quality enforcement 
  • Great Expectations for comprehensive data validation 
  • Databricks data profiling and quality monitoring 

Platform & Tools 

  • Databricks SQL and PySpark for quality checks at scale 
  • Unity Catalog for lineage tracking and governance compliance 
  • Python for custom validation logic and anomaly detection 

Observability 

  • Quality metrics dashboards and alerting 
  • Data profiling and statistical analysis 
  • Anomaly detection and drift monitoring 

Key Responsibilities 

  • Design and implement data quality frameworks using Delta Live Tables expectations and Great Expectations 
  • Build automated data validation pipelines that enforce quality standards at ingestion and transformation stages 
  • Develop data profiling processes to understand data distributions, patterns, and anomalies 
  • Define and track data quality metrics (completeness, accuracy, consistency, timeliness, validity) 
  • Implement anomaly detection mechanisms to identify data drift and quality degradation 
  • Create quality dashboards and alerting systems for proactive issue identification 
  • Collaborate with data engineers to embed quality checks into ETL/ELT pipelines 
  • Partner with data architects to establish data quality standards and governance policies 
  • Investigate and perform root cause analysis for data quality issues 
  • Document data quality rules, thresholds, and remediation procedures 
  • Support data certification processes for simulation-ready and ML-ready datasets 
  • Drive continuous improvement in data quality practices and tooling 

Preferred Qualifications 

  • 6+ years of experience in data engineering or data quality roles, with 3+ years focused on data quality 
  • Track record of implementing enterprise-scale data quality frameworks 
  • Experience with Lakehouse architectures (Delta Lake, Iceberg) 
  • Familiarity with real-time data quality monitoring for streaming pipelines 
  • Experience working in agile, cross-functional teams 

Highly Desirable 

  • Experience with data quality for digital twin or simulation platforms 
  • Familiarity with operational state data validation and temporal consistency checks 
  • Experience with graph data quality validation (Neo4j or similar) 
  • Exposure to ML data quality (feature validation, training data quality) 
  • Experience with data observability platforms 
  • Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is a plus 

 

Location: Hyderabad, Telangana 

Department: Engineering 

Employment Type: Full-Time 


Read more
Snabbit
Shweta Vyas
Posted by Shweta Vyas
Bengaluru (Bangalore)
3 - 7 yrs
₹25L - ₹45L / yr
Mobile App Development
Backend
skill iconPython
skill iconJava
skill iconGo Programming (Golang)
+2 more

We are looking for a strong Mobile Engineer with backend exposure who can own end-to-end feature development. This is a mobile-heavy fullstack role where you will primarily build scalable mobile applications while contributing to backend services and APIs.

Key Responsibilities

  • Design and develop high-quality mobile applications (primary focus)
  • Build and integrate RESTful APIs and backend services
  • Collaborate with product and design teams to ship features end-to-end
  • Ensure performance, scalability, and reliability of mobile apps
  • Write clean, maintainable, and testable code
  • Participate in architecture discussions and technical decision-making

Must Have Skills

  • Strong experience in mobile development (Flutter / React Native / iOS / Android)
  • Solid understanding of backend development (Node.js / Java / Python / Go)
  • Experience with API design, microservices, and databases
  • Good understanding of system design and app performance optimization
  • Familiarity with cloud platforms (AWS/GCP)

Good to Have

  • Experience working in startup environments
  • Exposure to CI/CD pipelines and DevOps practices
  • Understanding of real-time systems or scalable architectures
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4.5 - 8 yrs
Best in industry
skill iconPython
SQL
FastAPI
Restapi
Artificial Intelligence (AI)
+4 more

1️⃣ Generative AI System Design

  • Architect and implement end-to-end LLM-powered applications
  • Build scalable RAG pipelines (chunking, embeddings, hybrid search, reranking)
  • Design and implement agent-based workflows (tool calling, multi-step reasoning, orchestration)
  • Integrate LLM APIs such as OpenAI and Anthropic, along with open-source models
  • Implement structured output validation, grounding strategies, and hallucination mitigation
  • Optimize inference cost, latency, and token efficiency
  • Design evaluation pipelines for performance, accuracy, and safety
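The RAG pipeline stages above (chunking, embedding, retrieval) can be sketched without any model at all. Here a toy bag-of-words vector stands in for a real embedding, so only the pipeline shape is meaningful; the corpus text is invented:

```python
import math
from collections import Counter

def chunk(text: str, size: int = 8) -> list[str]:
    # Naive fixed-size chunking by word count; real pipelines overlap chunks.
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text: str) -> Counter:
    # Stand-in for a real embedding model: a term-frequency vector.
    return Counter(w.strip(".,?!:;").lower() for w in text.split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = chunk(
    "Invoices are processed nightly by the billing service. "
    "Refunds require manager approval before posting. "
    "The search cluster reindexes documents every six hours."
)
index = [(c, embed(c)) for c in corpus]  # the "vector store"

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank chunks by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [c for c, _ in ranked[:k]]

print(retrieve("who approves refunds?"))  # the refund-approval chunk ranks first
```

A production pipeline swaps in a real embedding model and vector database, adds hybrid search and reranking, and feeds the retrieved chunks into the LLM prompt.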

2️⃣ Backend & Microservices Engineering

  • Design scalable backend systems using Python
  • Build REST and async APIs using FastAPI / Django
  • Architect and implement microservices with clear service boundaries
  • Implement service-to-service communication (REST, gRPC, event-driven messaging)
  • Work with message brokers (Kafka / RabbitMQ)
  • Optimize database performance (PostgreSQL, MongoDB)
  • Implement caching strategies (Redis)
  • Build observability: logging, monitoring, distributed tracing

3️⃣ Cloud-Native Architecture & DevOps

  • Design and deploy containerized services using Docker
  • Orchestrate services using Kubernetes
  • Implement CI/CD pipelines
  • Ensure system scalability, resilience, and fault tolerance
  • Apply distributed systems principles:
  • Circuit breakers
  • API gateway patterns
  • Load balancing
  • Horizontal scaling
  • Saga patterns
  • Zero-downtime deployments


Read more
Redpin

at Redpin

1 candid answer
Lakshman Dornala
Posted by Lakshman Dornala
Hyderabad
5 - 10 yrs
Best in industry
skill iconPython
Data Transformation Tool (DBT)
Apache Airflow
Terraform
skill iconKubernetes
+5 more

Senior Data (Platform) Engineer 

Location: Hyderabad | Department: Technology, Data 


About the Role 


Are you passionate about building reliable, scalable data platforms that make analytics and AI development easier? As a Senior Data Platform Engineer, you will be hands-on in building, operating, and improving our core data platform and AI/LLM enablement tooling. 


You’ll focus on infrastructure, orchestration, CI/CD, and reusable frameworks that support analytics engineering and AI-driven use cases. You’ll work closely with Analytics Engineering and Insights teams and support other departments as they integrate with our data systems. 


What You'll Do 


Data Platform & Infrastructure

  • Build, deploy, and operate cloud infrastructure for data and AI workloads using Infrastructure as Code (Terraform).
  • Provision and manage cloud resources across development, staging, and production environments.
  • Develop and maintain CI/CD pipelines for data transformations, orchestration workflows, and platform services.
  • Operate and scale containerized workloads on Kubernetes, including Airflow, internal APIs, and AI/LLM services.
  • Troubleshoot and resolve infrastructure, pipeline, and orchestration failures to ensure platform reliability.
  • Maintain and support existing ML services and pipelines to ensure stability and reliability (No expectation to design or develop new ML models or training pipelines).
  • Continuously monitor and optimize platform performance and cost. 

Framework, Tooling and Enablement

  • Build and maintain reusable frameworks and patterns for dbt, Airflow, cloud data warehouses (Snowflake, BigQuery, Redshift, Databricks, etc.), and internal data and AI APIs
  • Build and support infrastructure and pipelines for AI/LLM-based use cases, including orchestration, integration, and serving.
  • Improve developer experience for Analytics Engineering and Insights teams by reducing friction in local development, deployments, and production workflows.
  • Create and maintain technical documentation and examples to support self-service analytics and data development. 


What You’ll Need 

Technical Skills & Experience

  • 5+ years of experience in data engineering, platform engineering, or similar hands-on roles.
  • Strong programming skills in Python and SQL.
  • Hands-on experience with:
  • Terraform
  • Airflow
  • dbt
  • Kubernetes
  • Cloud platforms (AWS, Google Cloud, or Microsoft Azure)
  • CI/CD pipelines (GitHub Actions, GitLab CI, CircleCI, etc.)
  • Cloud data warehouses (Snowflake, BigQuery, Redshift, Databricks, etc.) 
  • Strong understanding of analytical data models and how analytics teams consume data.
  • Experience integrating and operating LLM-based pipelines and services (not model training). 


Soft Skills & Collaboration

  • Strong problem-solving skills and ability to debug complex platform issues.
  • Strong preference for declarative development, with the ability to clearly separate what a system should do from how it is implemented.
  • Clear communicator who can work effectively with both technical and non-technical stakeholders.
  • Pragmatic, ownership-driven mindset with a focus on reliability and simplicity. 


Why Join Us? 


We welcome people from all backgrounds who seek the opportunity to help build a future where we connect the dots for international property payments. If you have the curiosity, passion, and collaborative spirit, work with us, and let’s move the world of PropTech forward, together. 

Redpin, Currencies Direct and TorFX are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, colour, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law.

Read more
Mumbai
2 - 5 yrs
₹8L - ₹10L / yr
skill iconPHP
SQL
API
skill iconLaravel
Information architecture

Software Engineer - Lending Platform

2 - 5 years Experience · Seed Stage · On-site preferred · Mumbai


What Neenv Is

Neenv is a fintech platform building channel finance infrastructure for MSME dealer networks in India. We sit between anchor companies and their dealer ecosystems, providing the credit technology layer while lending partners provide the capital.

The platform powers four supply chain finance products: Channel Financing, Working Capital Loans, Factoring, and Supplier Financing. The lending engine is configuration-driven. New products, rate changes, new anchors, new lenders -- config changes only.


What Problems Are We Solving

India runs on dealer networks. Hundreds of thousands of distributors, resellers, and stockists sit inside large corporate supply chains - buying from anchors, selling downstream, keeping markets liquid. These are creditworthy businesses. Their anchor relationships are essentially proof of cash flow. And yet they are chronically underfinanced.

Banks are too slow. Informal credit is expensive. The anchor relationship that makes a dealer viable is invisible to traditional lenders.

We are building the infrastructure to change that. A configuration-driven lending engine for channel finance - powering working capital credit to dealer networks at scale, with the anchor relationship as the underwriting signal.


Who You'll Be Working With

The founding team brings over 50 years of combined banking and channel finance experience. Founders with 25+ years each in client coverage, trade finance, risk management, and SCF sales across Standard Chartered and IDFC First Bank - having collectively managed over $1Bn in channel finance assets with sub-1% delinquency.

The CTO brings solid supply chain finance fintech experience with a product-first, AI-native approach to lending infrastructure.

You are not joining a first-time experiment. You are joining people who have spent careers building exactly what Neenv is now automating.


What Makes Your Role

We have a production lending infrastructure in place. It handles loan origination, repayment waterfalls, interest accrual, payment processing, ledger management, and multi-product configuration. You will own this platform end to end.

Understand the codebase end to end. Drive every config change, every extension, every integration. Be the person who can answer "can the system do X?" without waiting for anyone.

That is the first act.

The second act: we are building AI-native lending workflows. A credit decisioning agent that processes bureau reports, bank statements, GST data, and ITR. A collections agent that automates follow-up and escalation. Ops agents that handle accruals, month-end, lender reporting, and anomaly detection.

You will design this architecture from day one.


What Works Well Here

Someone who gets uncomfortable when they don't fully understand a system. Who reads error logs with curiosity. Who treats financial logic correctness as non-negotiable. Who can hold a product conversation and a technical conversation in the same breath.

If you have built something non-trivial and can explain every decision you made, that is the signal.


What You Need

  • PHP and Laravel -- solid working proficiency
  • Python -- working proficiency for AI agents, data processing, integrations
  • SQL and relational database design -- financial data where a paisa-level rounding error is a production bug
  • API design and third-party integration patterns -- REST, webhooks, handling flaky vendor APIs
  • LLM and agent workflows -- curiosity or working familiarity. Strong signal if you have built with Claude, GPT, or any agent framework
  • Fintech, NBFC, or any domain where data accuracy has real consequences
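
On the paisa-level point above: one common defence, shown here as an illustrative sketch rather than Neenv's actual convention, is to store money as integer paise so arithmetic stays exact (binary floats cannot represent 0.10 exactly):

```python
# Illustrative only: non-negative amounts, two decimal places.

def to_paise(rupees_str: str) -> int:
    """Parse a decimal rupee string into integer paise."""
    rupees, _, paise = rupees_str.partition(".")
    paise = (paise + "00")[:2]  # pad or truncate to exactly 2 digits
    return int(rupees) * 100 + int(paise)

def to_rupees(paise: int) -> str:
    return f"{paise // 100}.{paise % 100:02d}"

# Splitting Rs 1000.10 three ways: the 2-paise remainder is carried
# explicitly instead of being silently lost to float rounding.
total = to_paise("1000.10")      # 100010 paise
share = total // 3               # 33336 paise each
remainder = total - share * 3    # 2 paise left over
assert share * 3 + remainder == total
```

The same idea applies at the schema level: a BIGINT paise column (or DECIMAL/NUMERIC) instead of FLOAT, so ledger entries always reconcile to the paisa.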


What We Are Offering

Fixed salary, competitive for early-stage fintech in Mumbai. Direct founder access. Ownership over a production lending system and the AI layer being built on top. For the right fit, a clear path to owning the entire technical stack as we scale.

We cannot offer a large team, defined career ladders, or a 500-person safety net. We can offer a genuinely hard problem, speed, and the chance to build something that matters from nearly the beginning.


Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
7 - 10 yrs
Best in industry
skill iconNodeJS (Node.js)
NestJS
skill iconExpress
skill iconPostgreSQL
Microservices
+7 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.

We're seeking an experienced Senior Backend Engineer to join our team. As a senior backend engineer, you will be responsible for designing, developing, and deploying scalable backends for the products we build at NonStop. This includes APIs, databases, and server-side logic.


Responsibilities:

● Design, develop, and deploy backend systems, including APIs, databases, and server-side logic

● Write clean, efficient, and well-documented code that adheres to industry standards and best practices

● Code Quality: Ensure code quality through code reviews, adherence to best practices, and continuous improvement

● Mentorship: Guide and mentor team members, fostering growth and innovation

● Collaboration: Work closely with stakeholders to align technical goals with business objectives

● Problem-Solving: Analyze and resolve technical challenges promptly

● Innovation: Stay updated with the latest technology trends and integrate them into solutions


Requirements:

● At least 7 years of experience building scalable and reliable backend systems

● Strong expertise in NodeJS/NestJS, Express, PostgreSQL

● Experience with microservices architecture and distributed systems

● Proficiency in database design (SQL and NoSQL)

● Knowledge of cloud platforms (AWS, Azure, or GCP) and CI/CD pipelines

● Deep understanding of design patterns, data structures, and algorithms

● Hands-on experience with containerization technologies like Docker and orchestration tools like Kubernetes

● Exceptional communication and leadership skills

● Strong understanding of object-oriented programming principles and design patterns

● Familiarity with automated testing frameworks and methodologies

● Excellent problem-solving skills and attention to detail

● Strong communication skills and ability to effectively lead and maintain a collaborative team environment

Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
3 - 5 yrs
Best in industry
skill iconNodeJS (Node.js)
NestJS
skill iconExpress
skill iconPostgreSQL
SQL
+5 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment. We're seeking an experienced Backend Software Engineer to join our team. As a backend engineer, you will be responsible for designing, developing, and deploying scalable backends for the products we build at NonStop. This includes APIs, databases, and server-side logic.


Responsibilities:

● Design, develop, and deploy backend systems, including APIs, databases, and server-side logic

● Write clean, efficient, and well-documented code that adheres to industry standards and best practices

● Participate in code reviews and contribute to the improvement of the codebase

● Debug and resolve issues in the existing codebase

● Develop and execute unit tests to ensure high code quality

● Work with DevOps engineers to ensure seamless deployment of software changes

● Monitor application performance, identify bottlenecks, and optimize systems for better scalability and efficiency

● Stay up-to-date with industry trends and emerging technologies; advocate for best practices and new ideas within the team

● Collaborate with cross-functional teams to identify and prioritize project requirements


Requirements:

● At least 3 years of experience building scalable and reliable backend systems

● Strong expertise in NodeJS/NestJS, Express, PostgreSQL

● Experience with microservices architecture and distributed systems

● Proficiency in database design (SQL and NoSQL)

● Knowledge of cloud platforms (AWS, Azure, or GCP) and CI/CD pipelines

● Deep understanding of design patterns, data structures, and algorithms

● Hands-on experience with containerization technologies like Docker and orchestration tools like Kubernetes

● Exceptional communication and leadership skills

● Strong understanding of object-oriented programming principles and design patterns

● Familiarity with automated testing frameworks and methodologies

● Excellent problem-solving skills and attention to detail

● Strong communication skills and ability to effectively lead and maintain a collaborative team environment

Read more
Healthcare Product
Chennai
6 - 10 yrs
₹16L - ₹25L / yr
skill icon.NET
ASP.NET
skill iconReact.js
SQL
Object Oriented Programming (OOPs)

Required Qualifications:

• OOPS - In-depth understanding of Object Oriented Programming principles

• Solid - In-depth knowledge and practical knowledge of applying SOLID design principles

• Architectural Patterns - In-depth understanding of design patterns and have experience in designing and building complex architecture solutions

• React + Typescript - Extensive hands-on experience in React + Typescript, building high-performance complex frontends

• Unit testing - Ability to write Unit Testing using Jest framework

• Dotnet core webapi - Extensive hands-on experience in writing WebAPIs using dotnet core. Should have advanced knowledge on middlewares, auth flow, etc.

• C# - Solid knowledge on advanced C# language features like Lambda functions, Generics, etc.

• Entity Framework - Solid knowledge on Entity framework database first and code first approach

• Unit testing - Ability to write Unit Testing using popular mock frameworks and XUnit framework

• SQL - Advanced - Extensive hands-on experience in Microsoft SQL (DDL, DML, aggregates, functions, stored procedures, etc.)

• Query Performance Tuning - Ability to understand query plans and tune complex queries to improve performance

• Core services - Advanced knowledge and hands-on experience in building applications hosted in AWS using ECS containers, API Gateway, Lambda, Postgres/DynamoDB

• IaC - Hands-on experience in writing Terraform scripts; CI/CD with GitHub pipelines

Read more
Mumbai
8 - 12 yrs
₹20L - ₹25L / yr
SQL
Scripting
Active Directory
RBAC
JSON
+8 more

Role Overview

We are looking for a Saviynt-focused IAM professional at an architecture/engineering level with deep expertise in Identity Governance and Administration (IGA). The candidate will drive end-to-end Saviynt solution design, implementation, and optimization, ensuring scalable, secure, and compliant identity ecosystems across enterprise environments.

Key Responsibilities

  • Saviynt Architecture & Platform Engineering:
  • Design and implement scalable Saviynt architecture, including tenant setup, data model design, and performance optimization
  • Develop and manage advanced rules, workflows, and business logic within Saviynt
  • Drive platform customization, plugin development, and REST/API-based integrations
  • IGA Solution Design:
  • Architect and implement end-to-end IGA solutions including Access Request System (ARS), SoD (Segregation of Duties), and Certification/Recertification frameworks
  • Define RBAC models, entitlement governance strategies, and lifecycle management processes
  • Identity Integration & Ecosystem:
  • Lead integrations with enterprise applications, directories, and cloud platforms using connectors, APIs, and event-driven mechanisms
  • Work closely with cross-functional teams to enable application onboarding and automated provisioning
  • AD / Azure AD / Multi-Tenant Expertise:
  • Architect identity models across Active Directory (AD) and Azure Active Directory (AAD) environments
  • Design group structures, OU strategies, and identity lifecycle flows
  • Leverage Multi-Tenant Organization (MTO) capabilities for cross-tenant identity governance
  • Governance, Risk & Compliance:
  • Implement and optimize SoD policies, access certifications, and audit controls
  • Ensure compliance with security standards and regulatory frameworks
  • Automation & Optimization:
  • Enhance self-service capabilities, workflow automation, and access request efficiencies
  • Continuously improve performance, scalability, and operational stability of the Saviynt platform
  • Code Quality & Delivery Excellence:
  • Maintain high-quality code standards, documentation, and deployment practices
  • Support production environments, troubleshoot issues, and ensure platform reliability

Required Skills & Experience

  • 8+ years of hands-on experience in Saviynt IGA implementation and engineering
  • Strong expertise in: Saviynt EIC platform architecture & configuration; ARS, SoD, Recertification, RBAC; REST APIs, JSON, SQL, and scripting
  • Deep understanding of: Active Directory (AD) & Azure AD (AAD); Identity lifecycle management & provisioning workflows
  • Experience in enterprise integrations and large-scale deployments
  • Exposure to Multi-Tenant Organization (MTO) is a strong plus

Good to Have

  • Experience with other IAM tools (e.g., SailPoint, Okta)
  • Knowledge of cloud platforms (Azure, AWS)
  • Understanding of security frameworks (ISO, SOX, GDPR)


Read more
Nirmitee.io

Gitashri K
Posted by Gitashri K
Pune
3 - 5 yrs
₹5L - ₹11L / yr
MERN Stack
SQL
skill iconAmazon Web Services (AWS)
Fullstack Developer
skill iconHTML/CSS
+1 more

We are looking for a highly skilled Full Stack Developer (MERN Stack) with 3–5 years of experience to join our growing team. You will have the opportunity to work on cutting-edge technology solutions, build products from scratch, and contribute to scalable systems handling large volumes of data.


Key Responsibilities:

  • Design, develop, and maintain scalable full-stack applications
  • Build responsive and high-performance user interfaces using modern frontend frameworks
  • Develop robust backend services and APIs
  • Ensure seamless system performance while handling large-scale data without slowdowns
  • Collaborate with cross-functional teams (product, design, QA) to meet business goals
  • Optimize applications for maximum speed, scalability, and reliability
  • Participate in architecture discussions and contribute to technical decisions


Required Skills & Qualifications:

Frontend

  • Strong experience in React.js
  • Hands-on experience with Next.js (mandatory)
  • Good understanding of UI/UX principles and responsive design

Backend

  • Solid experience in Node.js
  • Experience with Python or Java is a plus
  • Strong knowledge of RESTful APIs and microservices architecture

Databases

  • Strong experience with SQL (mandatory)
  • Experience with MongoDB is a plus

Caching & Messaging

  • Experience with at least one: Redis, Kafka, or Cassandra

Other Requirements


  • Strong problem-solving and analytical skills
  • Ability to work in a fast-paced, collaborative environment
  • Good communication and stakeholder management skills

Good to Have:

  • Cloud certifications (AWS / Azure / GCP)
  • Experience working on high-scale or distributed systems
  • Exposure to DevOps practices and CI/CD pipelines


Why Join Us:

  • Opportunity to work on cutting-edge tech and greenfield projects
  • Ownership and freedom to build solutions from scratch
  • Collaborative and growth-focused work environment
Read more
TalentXO
tabbasum shaikh
Posted by tabbasum shaikh
Pune
5 - 8 yrs
₹16L - ₹20L / yr
.NET MAUI Developer
skill iconC#
SQL
Rest API
skill iconSwift
+5 more

Responsibilities:

  • Develop and maintain cross-platform applications using .NET MAUI.
  • Convert UI/UX designs into responsive and pixel-perfect layouts using XAML.
  • Implement new features, resolve bugs, and optimize application performance across Android, iOS, Windows, and macOS.
  • Integrate iOS and Android native libraries and handle native library interop.
  • Leverage at least 1 year of native Android and iOS development experience (Kotlin/Java, Swift/Objective-C).
  • Integrate applications with RESTful APIs, third-party services, authentication mechanisms, and local storage solutions such as SQLite.
  • Apply strong knowledge of MVVM architecture, design patterns, and SOLID principles to ensure clean, maintainable, and scalable code.
  • Collaborate closely with product managers, UI/UX designers, QA teams, and backend developers to deliver high-quality releases.
  • Uphold code quality through code reviews, unit testing, and adherence to engineering best practices.
  • Participate actively in sprint planning, technical discussions, and architectural decision-making.
  • Prepare and maintain project documentation, technical specifications, workflows, and implementation guides.
  • Mentor junior developers and provide technical leadership and guidance.



Read more
Gurugram
5 - 10 yrs
₹30L - ₹35L / yr
Software Architect
SaaS
b2b
REST API
SQL
+6 more

Role & Responsibilities

  • Design, develop, and maintain scalable full stack applications using modern backend and frontend technologies.
  • Build and maintain backend services and APIs using technologies such as C#, .NET, Java, or similar backend frameworks.
  • Develop responsive and efficient frontend applications using Angular (14+), TypeScript, and JavaScript or similar frontend framework.
  • Work on applications deployed in on-premise infrastructure environments, ensuring stability and performance.
  • Implement and optimize search capabilities using OpenSearch.
  • Design and maintain database structures using relational databases (SQL) and NoSQL databases such as MongoDB.
  • Collaborate with cross-functional teams to design, implement, test, and deploy new product features.
  • Troubleshoot issues, debug applications, and ensure high reliability and performance of the platform.
  • Participate in Agile/Scrum development processes, collaborating closely with team members throughout the development lifecycle.
  • Contribute to technical discussions, architecture decisions, and engineering best practices.

Ideal Candidate

  • Strong Full stack software engineer having on premise applications development experience
  • Mandatory (Experience 1): Must have 5+ years of experience as a Fullstack developer
  • Mandatory (Experience 2): Must have hands-on experience in developing and supporting applications deployed on on-premise infrastructure (Not cloud)
  • Mandatory (Backend): Must have strong backend development experience using technologies such as C#, .NET, Java, or similar backend frameworks
  • Mandatory (Frontend): Must have strong frontend development experience using technologies such as React, Angular, TypeScript, JavaScript or similar frontend frameworks
  • Mandatory (Core Skill): Must have exposure to OpenSearch
  • Mandatory (DB): Exposure to SQL (Relational DBs) & NoSQL databases like MongoDB
  • Mandatory (Company): B2B SaaS companies
  • Mandatory (Note 1): This is a hybrid role in Udyog Vihar, Gurgaon
  • Mandatory (Note 2): The role will convert into a core team position, so we need candidates with strong intent
  • Preferred (Skill): Experience leading technical design discussions, mentoring engineers, and setting engineering standards or architectural guidelines


Read more
AI Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Remote only
5 - 15 yrs
₹18L - ₹25L / yr
PowerBI
SQL
Mobile App Development
Windows App Development
Scripting
+12 more

Description

As a Power Apps Developer, you will be at the forefront of crafting innovative, low‑code solutions that streamline business processes and empower end‑users across the organization. You will collaborate closely with functional analysts, business stakeholders, and fellow developers to translate complex requirements into intuitive, scalable applications on the Microsoft Power Platform. The role offers a dynamic environment where continuous learning is encouraged, providing access to the latest Power Apps features, Azure services, and integration techniques. You will contribute to a culture of knowledge sharing, participate in code reviews, and mentor junior team members, ensuring high‑quality deliverables that drive operational efficiency and measurable business impact.


Requirements:

  • 5–15 years of experience developing enterprise‑grade solutions using Microsoft Power Apps, Power Automate, and Power BI.
  • Strong proficiency in Canvas and Model‑Driven apps, Common Data Service (Dataverse), and integration with Azure services (e.g., Azure Functions, Logic Apps).
  • Solid understanding of relational databases, SQL, and data modeling concepts.
  • Experience with JavaScript, TypeScript, and RESTful APIs for extending Power Apps functionality.
  • Excellent problem‑solving abilities, strong communication skills, and a collaborative mindset.
  • Relevant certifications such as Microsoft Power Platform Developer Associate (PL‑400) are a plus.


Roles and Responsibilities:

  • Design, develop, and deploy custom Power Apps solutions that meet business requirements and adhere to best practices.
  • Create and maintain automated workflows using Power Automate to streamline repetitive tasks and improve efficiency.
  • Integrate Power Apps with external systems via connectors, APIs, and Azure services to ensure seamless data flow.
  • Perform performance tuning, debugging, and troubleshooting of applications to ensure optimal user experience.
  • Collaborate with business analysts and stakeholders to gather requirements, provide technical guidance, and deliver prototypes.
  • Conduct code reviews, enforce governance standards, and contribute to the development of a reusable component library.
  • Stay updated with the latest Power Platform releases, evaluate new features, and recommend adoption strategies.
  • Provide training and mentorship to junior developers and end‑users to foster platform adoption.


Must have skills

Power Apps - 5 years

Microsoft Power Automate - 1 year


Nice to have skills

Canvas App Development and Scripting - 4 years

Canvas Apps Development - 4 years

SQL - 2 years

SharePoint APIs - 1 year

Power Fx - 2 years

C Sharp - 3 years

RESTful API - 2 years


Read more
Wissen Technology

Shivangi Bhattacharyya
Posted by Shivangi Bhattacharyya
Bengaluru (Bangalore)
2 - 5 yrs
Best in industry
skill iconPython
SQL
Data Visualization

Role- Data Analyst

Experience- 2 to 5 years

Location-Bangalore


Job Role-


● Experience: Minimum of 2 years of professional experience in a data-heavy environment (e-commerce or fintech experience is a plus).

● SQL Mastery: Exceptional ability to write complex joins, window functions, analytical functions, and CTEs. Experience with high-scale databases (e.g., BigQuery, Hive, or Postgres).

● Scripting: Functional knowledge of Python for data manipulation (Pandas, NumPy) and basic automation scripts.

● Systems Thinking: Ability to understand upstream data flows and how they impact downstream reporting.

● Problem-Solving: A "detective" mindset: you enjoy digging into a Rs 600Cr discrepancy until you find the root cause
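
For a flavour of the SQL level implied above, a CTE plus a window function of roughly this shape (table, column, and value names are invented; run here against SQLite via Python):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (user_id INT, amount INT);
    INSERT INTO orders VALUES (1, 100), (1, 300), (2, 50), (2, 250), (2, 200);
""")

# CTE computes per-user totals; the window function ranks each user's
# orders so we can pull the largest one and its share of their total.
query = """
WITH user_totals AS (
    SELECT user_id, SUM(amount) AS total FROM orders GROUP BY user_id
)
SELECT o.user_id,
       o.amount,
       RANK() OVER (PARTITION BY o.user_id ORDER BY o.amount DESC) AS rnk,
       ROUND(1.0 * o.amount / t.total, 2) AS share
FROM orders o JOIN user_totals t USING (user_id)
"""
rows = [r for r in conn.execute(query) if r[2] == 1]  # top order per user
print(rows)
```

The same pattern (CTE for an aggregate, window function for per-group ranking) transfers directly to BigQuery, Hive, and Postgres.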

Read more
AI Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
6 - 10 yrs
₹32L - ₹42L / yr
ETL
SQL
Google Cloud Platform (GCP)
Data engineering
ELT
+17 more

Role & Responsibilities:

We are looking for a strong Data Engineer to join our growing team. The ideal candidate brings solid ETL fundamentals, hands-on pipeline experience, and cloud platform proficiency — with a preference for GCP / BigQuery expertise.


Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows
  • Work with Dataform or DBT to implement transformation logic and data models
  • Develop and optimize data solutions on GCP (BigQuery, GCS) or AWS/Azure
  • Support data migration initiatives and data mesh architecture patterns
  • Collaborate with analysts, scientists, and business stakeholders to deliver reliable data products
  • Apply data governance and quality best practices across the data lifecycle
  • Troubleshoot pipeline issues and drive proactive monitoring and resolution


Ideal Candidate:

  • Strong Data Engineer Profile
  • Must have 6+ years of hands-on experience in Data Engineering, with strong ownership of end-to-end data pipeline development.
  • Must have strong experience in ETL/ELT pipeline design, transformation logic, and data workflow orchestration.
  • Must have hands-on experience with any one of the following: Dataform, dbt, or BigQuery, with practical exposure to data transformation, modeling, or cloud data warehousing.
  • Must have working experience on any cloud platform: GCP (preferred), AWS, or Azure, including object storage (GCS, S3, ADLS).
  • Must have strong SQL skills with experience in writing complex queries and optimizing performance.
  • Must have programming experience in Python and/or SQL for data processing.
  • Must have experience in building and maintaining scalable data pipelines and troubleshooting data issues.
  • Exposure to data migration projects and/or data mesh architecture concepts.
  • Experience with Spark / PySpark or large-scale data processing frameworks.
  • Experience working in product-based companies or data-driven environments.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.


NOTE:

  • An interview drive is scheduled for 28th and 29th March 2026; shortlisted candidates are expected to be available on these dates. Only immediate joiners will be considered.
Read more
Inspiron Labs

Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
8yrs+
Up to ₹13L / yr (varies)
Windows Azure
SQL
PySpark
skill iconPython

Senior Data Engineer (Azure Databricks)


Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines using Azure Databricks and PySpark
  • Work extensively with PySpark notebooks within Databricks for data processing and transformation
  • Build and optimize batch data processing workflows
  • Develop and manage data integrations using Azure Functions and Logic Apps
  • Write efficient and optimized SQL queries for data extraction and transformation

Required Skills:

  • Strong hands-on experience with Azure Databricks, PySpark, and SQL
  • Experience working with batch processing frameworks
  • Proficiency in building and managing data pipelines in Azure ecosystem

Good to Have:

  • Experience with Python

Mandatory Requirement:

  • Candidate must have hands-on experience working with PySpark notebooks in Databricks


Read more
Mango Sciences
Remote only
7 - 12 yrs
₹20L - ₹40L / yr
skill iconPython
SQL
ETL
Data pipeline
Datawarehousing
+12 more

The Mission: We are looking for a visionary Technical Leader to own our healthcare data ecosystem from the first byte to the final dashboard. You won't just be managing a platform; you’ll be the primary architect of a clinical data engine that powers life-changing analytics. If you are an expert in SQL and Python who thrives on solving the "puzzle" of healthcare interoperability (FHIR/HL7) while mentoring a high-performing team, this is your seat at the table.

What You’ll Own

  • Architectural Sovereignty: Define the end-to-end blueprint for our data warehouse (staging, marts, and semantic layers). You choose the frameworks, set the coding standards, and decide how we handle complex dimensional modeling and SCDs.
  • Engineering Excellence: Lead by example. You’ll write production-grade Python for ingestion frameworks and craft advanced, set-based SQL transformations that others use as gold-standard references.
  • The Interoperability Bridge: Turn the chaos of EHR exports, REST APIs, and claims data into clean, FHIR-aligned governed datasets. You ensure our data speaks the language of modern healthcare.
  • Technical Mentorship: Act as the "Engineer’s Engineer." You’ll run design reviews, champion CI/CD best practices, and build the runbooks that keep our small but mighty team efficient.
  • Security by Design: Direct the implementation of HIPAA-compliant data flows, ensuring encryption, auditability, and access controls are baked into the architecture, not bolted on.

The Stack You’ll Command

  • Languages: Expert-level SQL (CTE, Window Functions, Tuning) and Production Python.
  • Databases: Deep polyglot experience across MSSQL, PostgreSQL, Oracle, and NoSQL (MongoDB/Elasticsearch).
  • Orchestration: Advanced Apache Airflow (SLAs, retries, and complex DAGs).
  • Ecosystem: GitHub for CI/CD, Tableau/PowerBI for semantic layers, and Unix/Linux for shell scripting.
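The set-based SQL this stack calls for can be sketched with a CTE plus a window function. The example below is illustrative only: SQLite stands in for the posting's MSSQL/PostgreSQL/Oracle mix so it can run anywhere, and the `encounters` table and its columns are invented:

```python
# Minimal sketch of set-based SQL with a CTE and a window function:
# rank encounters per patient and keep only the latest one, no self-join.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE encounters (patient_id TEXT, encounter_date TEXT, cost REAL);
INSERT INTO encounters VALUES
  ('p1', '2024-01-05', 120.0),
  ('p1', '2024-03-10', 340.0),
  ('p2', '2024-02-01', 90.0);
""")

rows = conn.execute("""
WITH ranked AS (
  SELECT patient_id,
         encounter_date,
         cost,
         ROW_NUMBER() OVER (
           PARTITION BY patient_id
           ORDER BY encounter_date DESC
         ) AS rn
  FROM encounters
)
SELECT patient_id, encounter_date, cost
FROM ranked
WHERE rn = 1               -- latest encounter per patient
ORDER BY patient_id;
""").fetchall()

print(rows)  # → [('p1', '2024-03-10', 340.0), ('p2', '2024-02-01', 90.0)]
```

`PARTITION BY` restarts the ranking for each patient, so the outer `rn = 1` filter does in one pass what would otherwise need a correlated subquery or self-join.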

Who You Are

  • Experienced: You have 8–12+ years in data engineering, with a significant portion spent in a Lead or Architect capacity.
  • Healthcare-Fluent: You understand the stakes of PHI. You’ve worked with FHIR/HL7 and know how to map clinical resources to analytical models.
  • Performance-Obsessed: You don’t just make it work; you make it fast. You’re the person who uses EXPLAIN/ANALYZE to shave minutes off a query.
  • Culture-Builder: You believe in documentation, observability (lineage/freshness), and "leaving the campground cleaner than you found it."
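The EXPLAIN/ANALYZE habit mentioned above refers to PostgreSQL; the same workflow can be illustrated portably with SQLite's `EXPLAIN QUERY PLAN`. This is a hedged sketch with an invented `claims` table, checking that a predicate actually hits an index rather than a full scan:

```python
# Hedged sketch of query-plan-driven tuning. PostgreSQL's EXPLAIN ANALYZE is
# richer; SQLite's EXPLAIN QUERY PLAN shown here illustrates the same habit:
# verify the planner uses an index instead of scanning the whole table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, member_id TEXT, amount REAL)")
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 'm42'"
).fetchall()

conn.execute("CREATE INDEX idx_claims_member ON claims(member_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 'm42'"
).fetchall()

# The last column of each plan row is the human-readable detail; its exact
# wording varies by SQLite version.
print(plan_before[0][-1])  # e.g. "SCAN claims"
print(plan_after[0][-1])   # e.g. "SEARCH claims USING INDEX idx_claims_member ..."
```

Reading the plan before and after an index (or query rewrite) is the minutes-off-a-query loop the posting describes, whatever the engine.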

Bonus Points for:

  • Privacy Pro: Experience with PII/PHI de-identification and privacy-by-design.
  • Cloud Native: Deep familiarity with Azure, AWS, or GCP security and data services.
  • Search Experts: Experience with near-real-time indexing via Elasticsearch.

To move forward to the next stage, please fill out the Google Form with your updated resume.

 

Pre-screen Question: https://forms.gle/q3CzfdSiWoXTCEZJ7

 

Details: https://forms.gle/FGgkmQvLnS8tJqo5A

Read more
Cambridge Wealth (Baker Street Fintech)
Sangeeta Bhagwat
Posted by Sangeeta Bhagwat
Pune
1 - 4 yrs
₹3L - ₹7L / yr
SQL
skill iconPython
skill iconAmazon Web Services (AWS)
Spotfire
Qlikview
+12 more

Who are we aka "About Us":

 

We are an early-stage Fintech Startup - working on exciting Fintech Products for some of the Top 5 Global Banks and building our own. If you are looking for a place where you can make a mark and not just be a cog in the wheel, Baker Street Fintech Pvt Ltd (Parent Company) might be the place for you. We have a flat, ownership-oriented culture, and deliver world-class quality. You will be working with a founding team that has delivered over 26 industry-leading product experiences and won Webby Awards for Digital Strategy. In short, a bleeding-edge team.

 

As Cambridge Wealth, we are well-established in the wealth and mutual fund distribution segment, having won awards from BSE Star as well as Mutual Fund houses. Our UHNI/HNI/NRI clients include renowned professionals from various industries. 

 

What are we looking for a.k.a “The JD” :

 

We are seeking a skilled and detail-oriented Data Analyst to join our product team. As a Data Analyst, you will play a crucial role in extracting, analysing, and interpreting complex financial data to drive strategic decision-making and optimize our data solutions. The ideal candidate should possess a strong foundation in SQL / NoSQL databases, Python programming, and proficiency in tools like PostgreSQL and Excel. A deep understanding of financial concepts is also a plus. Additionally, having an interest in business intelligence tools and machine learning will be valuable for this role.

 

Responsibilities:

  • Write complex SQL queries
  • Utilize Python for data manipulation, analysis, and visualisation, using libraries such as pandas, matplotlib, and psycopg
  • Perform database optimization, indexing, and query tuning to ensure high performance.
  • Monitor and maintain data quality, troubleshoot data-related issues, and implement solutions to optimize data integrity and performance.
  • Design, configure, and maintain PostgreSQL databases
  • Set up and manage database clusters, replication, and backups for disaster recovery
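The data-quality bullet above can be made concrete with a small sketch. This is an illustrative, stdlib-only example, not part of the role's actual stack; the `folio`/`amount` fields and the `quality_report` helper are invented for the example:

```python
# Minimal data-quality check sketch (illustrative names, stdlib only):
# flag missing required fields and duplicate keys before records reach reporting.
def quality_report(records, key, required_fields):
    """Return counts of missing required fields and duplicated keys."""
    missing = {f: 0 for f in required_fields}
    seen, duplicates = set(), 0
    for rec in records:
        for f in required_fields:
            if rec.get(f) in (None, ""):
                missing[f] += 1
        k = rec.get(key)
        if k in seen:
            duplicates += 1
        seen.add(k)
    return {"missing": missing, "duplicate_keys": duplicates}

rows = [
    {"folio": "F1", "amount": 1000.0},
    {"folio": "F1", "amount": 1000.0},   # duplicate folio
    {"folio": "F2", "amount": None},     # missing amount
]
report = quality_report(rows, key="folio", required_fields=["folio", "amount"])
print(report)  # → {'missing': {'folio': 0, 'amount': 1}, 'duplicate_keys': 1}
```

In practice the same checks would run as SQL against PostgreSQL (counting NULLs and duplicated keys per table), but the shape of the report is the same.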

 

Preferred Qualifications:

  • Intermediate-level Excel skills for data analysis and reporting.
  • Strong communication skills to present findings effectively and recommendations to both technical and non-technical stakeholders.
  • Detail-oriented mindset with a commitment to data accuracy and quality.

 

*(Only applicants who have completed their educational commitments are requested to apply)

 

Not sure whether you should apply? Here's a quick checklist to make things easier. You are someone who:

  • Has worked (1–3 years preferably) with, or is looking to work with, an early-stage startup.
  • Is ready to be part of a zero-to-one journey, which implies being involved in building fintech products and processes from the ground up.
  • Is comfortable working in an unstructured environment with a small team, deciding what your day looks like, taking the initiative to pick up the right piece of work, owning it, and working with the founding team on it.
  • Does not expect someone to check up on you every few hours: it is up to you to schedule check-ins whenever you find the need to; otherwise we assume you are progressing well with your tasks. You will be expected to find solutions to problems and suggest improvements.
  • Wants complete ownership of the role and the ability to drive it the way you think is right.
  • Is a self-starter who takes ownership of deliverables, develops consensus with the team on approach and methods, and delivers on them.
  • Is looking to stick around for the long term and grow with the company.

 

Read more
Hyderabad
6 - 10 yrs
₹10L - ₹18L / yr
SQL
TeamCity
octopus
Test Automation (QA)

We have an urgent opening for a skilled and detail-oriented professional for the below role:

Quick Role Overview:

  • Role: QA Automation Engineer
  • Location: Hyderabad
  • Job Type: Full-Time
  • Experience: 6 – 10 Years
  • Notice Period: Immediate to 30 Days Preferred

Job Description:

We are looking for a highly analytical QA Engineer with strong expertise in data validation and SQL-based testing. The ideal candidate will be responsible for ensuring the quality and integrity of data-driven applications by designing effective test strategies and executing comprehensive test plans.

This role requires hands-on experience with SQL, defect management, CI/CD pipelines, and automation tools, along with the ability to understand business requirements and translate them into test scenarios.

Key Responsibilities:

  • Analyze business requirements and translate them into detailed test plans, scenarios, and test cases
  • Perform data validation and ensure data accuracy using SQL queries
  • Design, develop, and execute manual and automated test cases
  • Manage defects by logging, tracking, and ensuring timely resolution
  • Work closely with development and business teams to ensure quality deliverables
  • Maintain and produce high-quality testing documentation
  • Ensure system data integrity for daily operations
  • Participate in CI/CD processes and support release cycles
  • Collaborate using tools like Jira, TeamCity, and Octopus

Desired Skills & Competencies:

Must-Have Skills:

  • Strong analytical and problem-solving skills
  • Good understanding of data and data analysis
  • Strong hands-on experience with Microsoft SQL Server
  • Solid understanding of SQL joins and complex queries
  • Experience with Jira for defect tracking
  • Hands-on experience with TeamCity and Octopus
  • Experience with code repository tools (preferably Bitbucket)
  • Mandatory experience with any Automation Testing tool
  • Knowledge of CI/CD pipelines
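A typical shape for the SQL-based data validation this role describes is reconciling a target table against its source with an anti-join. The sketch below is illustrative only: SQLite stands in for the posting's Microsoft SQL environment (so it can run anywhere), and the `src`/`tgt` tables are invented:

```python
# Hedged sketch of SQL-based data validation: reconcile a target table
# against its source. Rows missing from the target, or with differing
# values, surface as defect candidates to log and track.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, total REAL);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, total REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt VALUES (1, 10.0), (2, 99.0);   -- row 3 missing, row 2 wrong
""")

mismatches = conn.execute("""
SELECT s.id, s.total AS src_total, t.total AS tgt_total
FROM src s
LEFT JOIN tgt t ON t.id = s.id
WHERE t.id IS NULL OR t.total <> s.total
ORDER BY s.id;
""").fetchall()

print(mismatches)  # → [(2, 20.0, 99.0), (3, 30.0, None)]
```

Each returned row maps naturally to a defect entry in Jira: the key, the expected value, and the observed value.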

Good-to-Have:

  • Exposure to Agile methodologies
  • Experience in data-driven testing environments


Read more
Hashone Career
Madhavan I
Posted by Madhavan I
Bengaluru (Bangalore), Pune, Hyderabad, Chennai, Noida
7 - 10 yrs
₹20L - ₹35L / yr
skill iconPython
skill iconDjango
SQL

ROLE SUMMARY

The Senior Python Developer designs, builds, and improves Python and Django applications. The role includes developing end‑to‑end integrations using REST and SOAP services and delivering reliable, scalable solutions through hands‑on coding and data transformation work. The developer works closely with Business Analysts, architects, and other teams to ensure technical solutions support business needs. Key responsibilities also include improving SQL performance, taking part in code reviews, supporting DevOps workflows with Git and Azure DevOps, and helping integrate GenAI features—such as GPT models, embeddings, and agent‑based tools—into enterprise applications.

ROLE RESPONSIBILITIES

  • Design and develop Python and Django applications that are scalable, secure, and maintainable.
  • Implement UI components using CSS, Bootstrap, jQuery, or similar technologies as needed.
  • Develop integrations with internal and external systems using REST, SOAP, and WSDL‑based services.
  • Create and optimize SQL queries, database structures, and data access logic to support application features.
  • Work with Business Analysts and stakeholders to translate functional requirements into technical specifications and solutions.
  • Implement accurate data mappings and transformations in accordance with business and technical requirements.
  • Contribute to code reviews, follow established coding standards, and ensure high‑quality deliverables.
  • Support the implementation and maintenance of DevOps pipelines using Git and Azure DevOps.
  • Contribute to the integration of GenAI capabilities—including GPT models, embeddings, and agent‑based components—into enterprise applications.
  • Troubleshoot issues across the application stack and collaborate closely with peers to resolve technical challenges.

TECHNICAL QUALIFICATIONS

  • 7+ years of hands‑on experience with Python and Django, including complex application development.
  • 5+ years of experience with SQL development, optimization, and database design.
  • At least 1-2 years of applied experience with GenAI technologies (GPT models, embeddings, agents, etc.).
  • Deep expertise in application architecture, system integration, and service‑oriented design.
  • Strong experience with DevOps tools and practices, including Git, Azure DevOps, CI/CD pipelines, and automated deployments.
  • Advanced understanding of REST, SOAP, WSDL, and large‑scale service integrations.

GENERAL QUALIFICATIONS

  • Exceptional verbal and written communication skills.
  • Strong analytical, problem‑solving, and architectural reasoning abilities.
  • Demonstrated leadership experience with the ability to guide and mentor technical teams.
  • Proven ability to work effectively in fast‑paced, collaborative environments.

EDUCATION REQUIREMENTS

  • Bachelor’s degree in Computer Science, MIS, or a related field.
  • Advanced certifications in Python, cloud technologies, or GenAI are preferred but not required.

 

Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Pune, Bengaluru (Bangalore)
6 - 12 yrs
₹5L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Microservices
skill iconReact.js
skill iconJavascript
+2 more

Job Summary:

As a Java Full Stack Developer, you will design, develop, and maintain scalable backend services and frontend applications using Java (Spring Boot) and React. You will work closely with cross-functional teams to deliver high-performance and reliable systems.


Key Responsibilities:

• Develop and maintain applications using Java, Spring Boot, and React

• Design and build RESTful APIs for data-driven applications

• Work on frontend development using ReactJS

• Ensure scalability, performance, and reliability of applications

• Collaborate with QA, DevOps, and Product teams

• Participate in code reviews and technical discussions

• Troubleshoot and resolve production issues

• Mentor and guide junior developers


Required Skills & Qualifications:

• Strong experience in Java and Spring Boot

• Hands-on experience with React.js

• Experience with PostgreSQL or other relational databases

• Good understanding of data modeling and backend architecture

• Strong knowledge of OOP concepts

• Familiarity with Agile/Scrum and Git workflows

• Excellent problem-solving and communication skills


Good to Have:

• Experience with Snowflake / Databricks

• Exposure to data-driven or analytics platforms

Read more
Shopflo

at Shopflo

1 candid answer
Ariba Khan
Posted by Ariba Khan
Bengaluru (Bangalore)
2 - 4 yrs
Up to ₹30L / yr (varies)
skill iconJava
skill iconPython
skill iconNodeJS (Node.js)
skill iconGo Programming (Golang)
skill iconRedis
+2 more

About Shopflo

At Shopflo, we're trying to change the way consumers experience brands and businesses. Our first product was a cart and checkout platform for e-commerce that allowed marketers to personalise discounts, rewards, and payments. We are also working on a new product that takes it a notch higher by unlocking enterprise-grade personalization for all consumer tech businesses.


Team & Company

Shopflo was founded by three co-founders:

  • Ankit Bansal (ex-IIT Kharagpur, Oracle, Gupshup)
  • Ishan Rakshit (ex-IIT Bombay, Parthenon, Elevation Capital)
  • Priy Ranjan (ex-IIT Madras, McKinsey, Elevation Capital)


We’re a fast-growing team of ~50 people, based in HSR Layout, Bengaluru. We raised a $3.8M seed round from Tiger Global and TQ Ventures.


What you will do

  • Design and develop microservices that can work in a large-scale multi-tenant environment.
  • Explore design implications and work towards an appropriate balance between functionality, performance, and maintainability.
  • Work with a cross-discipline team spanning Design, Product, Data Science, and Analytics.
  • Deploy and maintain the application in a secured AWS environment.
  • Take ownership from the ideation phase through deployment and maintenance.
  • Participate actively in the hiring process to bring world-class programmers into the team.


You should apply if you have:

  • 2–4 years of experience in server-side development
  • Strong programming skills in Java, Python, Node, or Golang
  • Hands-on experience in API development and frameworks such as Spring, Node, or Django
  • Good understanding of SQL and NoSQL databases
  • Experience in test-driven development (writing unit tests and API tests)
  • Understanding of basic cloud computing concepts and experience using any of the major cloud service providers (AWS/GCP/Azure)
  • Ability to build and deploy the application in a containerized environment
  • Understanding of application logging and monitoring systems like Prometheus or Kibana
  • B.E./B.Tech/M.E./M.Tech/M.S. from a reputed university with a good academic record
  • Curiosity to explore cutting-edge technologies and bake them into the products.
  • Zeal and drive to take end-to-end ownership.
Read more
suntekai
Kushi A
Posted by Kushi A
Remote only
2 - 4 yrs
₹6L - ₹8L / yr
SQL
RCA

Job Description: Product Analyst


We are looking for a Product Analyst who can operate at the intersection of product thinking, analytics, and problem solving. This is a hybrid role for someone who is comfortable working with data, understands how ecommerce businesses function, and can help uncover the "why" behind business or website performance changes.


This role is ideal for someone who enjoys solving real business problems, performing root cause analysis on live websites or running businesses, identifying actionable insights, and supporting experiments that improve outcomes. You will work with both internal teams and clients, alongside Product Managers and other analysts, to drive better decision-making through structured analysis.


What you will do

• Conduct root cause analysis (RCA) for issues affecting live websites and ecommerce businesses

• Analyze business, product, and website performance to identify trends, issues, and opportunities

• Build and maintain dashboards that help teams monitor key business and product metrics

• Perform exploratory analysis to generate insights and support decision-making

• Work with product and client teams to frame problems clearly and convert ambiguous questions into analytical investigations

• Support experimentation by helping define what should be measured, analyzing results, and identifying learnings

• Understand and work with event tracking and analytics implementations

• Collaborate with Product Managers and other team members to improve visibility into business performance

• Communicate findings clearly to both internal stakeholders and clients in a way that drives action


What we are looking for

• 2–4 years of experience in product analytics, business analytics, web analytics, or similar roles

• Strong understanding of ecommerce and web analytics

• Strong SQL skills

• Ability to go beyond data retrieval and focus on analysis, interpretation, and problem solving

• Good understanding of how to investigate changes in metrics, funnels, conversion, and business performance

• Strong structured thinking and ability to break down messy problems logically

• Comfort working across multiple business contexts and learning quickly

• Strong communication skills, especially the ability to explain insights and recommendations to clients and non-technical stakeholders

• A product mindset — someone who can think beyond reporting and connect data to user behavior, business outcomes, and next steps
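Investigating changes in funnels and conversion, as described above, usually starts by computing step-to-step conversion rates to localize where a drop occurred. The sketch below is purely illustrative, with invented step names and numbers:

```python
# Illustrative funnel-conversion sketch (invented numbers): compute each
# step's conversion relative to the previous step to localize a drop.
def funnel_conversion(step_counts):
    """step_counts: ordered list of (step_name, users).
    Returns per-step conversion relative to the previous step."""
    rates = []
    for (prev_name, prev), (name, curr) in zip(step_counts, step_counts[1:]):
        rates.append((f"{prev_name} -> {name}", round(curr / prev, 3)))
    return rates

week_a = [("view", 10000), ("add_to_cart", 1200), ("checkout", 600), ("paid", 480)]
week_b = [("view", 10000), ("add_to_cart", 1150), ("checkout", 320), ("paid", 256)]

print(funnel_conversion(week_a))
print(funnel_conversion(week_b))  # the cart -> checkout step is where the drop is
```

Comparing the two weeks isolates the change to the add_to_cart → checkout step, which is exactly the kind of structured narrowing-down an RCA on a live website begins with.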


Good to have

• Familiarity with Shopify

• Experience with A/B testing or experimentation analysis

• Basic understanding of attribution and broader marketing analytics

• Exposure to event tracking setup and analytics instrumentation

• Familiarity with dashboarding and analytics tools, with the ability to quickly learn new tools as needed

• Python or R exposure is a plus, but not required


Who will do well in this role

• Someone who is a strong problem solver

• Someone who enjoys finding answers in imperfect or ambiguous situations

• Someone who is curious, structured, and business-minded

• Someone who can move between product questions, business questions, and analytical investigations without getting stuck in just reporting

• Someone who is comfortable working in a client-facing environment and can present insights with clarity and confidence


Role details

• Role: Product Analyst

• Location: Remote

• Hiring location: India only

• Type: Full-time


Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune
5 - 6 yrs
₹4L - ₹10L / yr
Windows Azure
skill iconPython
PySpark
ADF
databricks
+2 more

🚀 Hiring: Data Engineer (Azure) at Deqode

⭐ Experience: 5+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Delhi, Bengaluru

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


⭐ Hiring: Databricks Data Engineer – Lakeflow | Streaming | DBSQL | Data Intelligence

We are looking for a Databricks Data Engineer (Azure) to build reliable, scalable, and governed data pipelines powering analytics, operational reporting, and the Data Intelligence Layer.


🔹 Key Responsibilities

✅ Build optimized batch pipelines using Delta Lake (partitioning, OPTIMIZE, Z-ORDER, VACUUM)

✅ Implement incremental ingestion using Databricks Autoloader with schema evolution & checkpointing

✅ Develop Structured Streaming pipelines with watermarking, late data handling & restart safety

✅ Implement declarative pipelines using Lakeflow

✅ Design idempotent, replayable pipelines with safe backfills

✅ Optimize Spark workloads (AQE, skew handling, shuffle & join tuning)

✅ Build curated datasets for Databricks SQL (DBSQL), dashboards & downstream applications

✅ Package and deploy using Databricks Repos & Asset Bundles (CI/CD)

✅ Ensure governance using Unity Catalog and embedded data quality checks
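The "idempotent, replayable pipelines with safe backfills" requirement boils down to upserting by business key, so re-running a batch cannot duplicate or corrupt data. In Databricks this is typically a Delta Lake MERGE; the sketch below demonstrates only the concept, with plain dicts and invented field names:

```python
# Hedged sketch of an idempotent load: upsert by business key, so replaying
# the same batch (e.g. during a backfill) leaves the target unchanged.
# In Databricks this is typically expressed as a Delta Lake MERGE.
def upsert(target, batch, key):
    """Merge batch rows into target keyed on `key`; last write wins."""
    for row in batch:
        target[row[key]] = row
    return target

target = {}
batch = [
    {"order_id": "o1", "status": "created"},
    {"order_id": "o2", "status": "created"},
    {"order_id": "o1", "status": "paid"},   # later update to o1 in the same batch
]

upsert(target, batch, "order_id")
first_run = dict(target)
upsert(target, batch, "order_id")           # replay: no duplicates, same state

print(target == first_run)  # → True: the load is idempotent
```

Because replaying a batch is a no-op, backfills and failure recovery become safe by construction rather than by careful operator discipline.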


✅ Mandatory Skills (Must Have)

👉 Databricks & Delta Lake (Advanced Optimization & Performance Tuning)

👉 Structured Streaming & Autoloader Implementation

👉 Databricks SQL (DBSQL) & Data Modeling for Analytics

Read more
Public Listed - Product Based company


Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore)
4 - 8 yrs
₹25L - ₹70L / yr
skill iconData Science
data platforms
Data-flow analysis
Data pipelines
AI Infrastructure
+28 more

🤖 Data Scientist – Frontier AI for Data Platforms & Distributed Systems (4–8 Years)

Experience: 4–8 Years

Location: Bengaluru (On-site / Hybrid)

Company: Publicly Listed, Global Product Platform


🧠 About the Mission

We are building a Top 1% AI-Native Engineering & Data Organization — from first principles.

This is not incremental improvement.

This is a full-stack transformation of a large-scale enterprise into an AI-native data platform company.

We are re-architecting:

  • Legacy systems → AI-native architectures
  • Static pipelines → autonomous, self-healing systems
  • Data platforms → intelligent, learning systems
  • Software workflows → agentic execution layers

This is the kind of shift you would expect from companies like Google or Microsoft —

Except here, you will build it from day zero and scale it globally.


🧠 The Opportunity: This role sits at the intersection of three high-impact domains:

1. Frontier AI Systems: Large Language Models (LLMs), Small Language Models (SLMs), and Agentic AI

2. Data Platforms: Warehouses, Lakehouses, Streaming Systems, Query Engines

3. Distributed Systems: High-throughput, low-latency, multi-region infrastructure


We are building systems where:

  • Data platforms optimize themselves using ML/LLMs
  • Pipelines are autonomous, self-healing, and adaptive
  • Queries are generated, optimized, and executed intelligently
  • Infrastructure learns from usage and evolves continuously

This is: AI as the control plane for data infrastructure


🧩 What You’ll Work On

You will design and build AI-native systems deeply embedded inside data infrastructure.

1. AI-Native Data Platforms

  • Build LLM-powered interfaces: natural language → SQL / pipelines / transformations
  • Design semantic data layers: embeddings, vector search, knowledge graphs
  • Develop AI copilots for data engineers, analysts, and platform users

2. Autonomous Data Pipelines

  • Build self-healing ETL/ELT systems using AI agents
  • Create pipelines that detect anomalies in real time, automatically debug failures, and dynamically optimize transformations

3. Intelligent Query & Compute Optimization

  • Apply ML/LLMs to query planning and execution, cost-based optimization using learned models, and workload prediction and scheduling
  • Build systems that learn from query patterns and continuously improve performance and cost efficiency

4. Distributed Data + AI Infrastructure

  • Architect systems operating at billions of events per day and petabyte-scale data
  • Work with distributed compute engines (Spark / Flink / Ray class systems), streaming systems (Kafka-class infra), and vector databases with hybrid retrieval systems

5. Learning Systems & Feedback Loops

  • Build closed-loop AI systems: execution → feedback → model updates
  • Develop continual learning pipelines, online learning systems for infra optimization, and experimentation frameworks (A/B, bandits, eval pipelines)

6. LLM & Agentic Systems (Infra-Aware)

  • Build agents that understand data systems
  • Enable autonomous pipeline debugging, root cause analysis for infra failures, and intelligent orchestration of data workflows


🧠 What We’re Looking For

Core Foundations

  • Strong grounding in machine learning, deep learning, NLP, statistics, optimization, probabilistic systems, and distributed systems fundamentals
  • Deep understanding of transformer architectures and modern LLM ecosystems

Hands-On Expertise

  • Experience building LLM / GenAI systems (RAG, fine-tuning, embeddings), data platforms (warehouse, lake, lakehouse architectures), and distributed pipelines and compute systems
  • Strong programming skills: Python (ML/AI stack) and SQL (deep understanding, with a query-planning and optimization mindset)


Systems Thinking (Critical)

You think in systems, not components.

  • Built or worked on large-scale data pipelines, high-throughput distributed systems, and low-latency, high-concurrency architectures
  • Understand query optimization and execution; data partitioning, indexing, and caching; and trade-offs in distributed systems


🔥 What Sets You Apart (Top 1%)

  • Built AI-powered data platforms or infra systems in production
  • Designed or contributed to query engines / optimizers, data observability / lineage systems, or AI-driven infra / AIOps platforms
  • Experience with multi-modal AI (logs, metrics, traces, text), agentic AI systems, and autonomous infrastructure
  • Worked on systems at a scale comparable to Google (BigQuery-like systems), Meta (real-time analytics infra), or Snowflake / Databricks (lakehouse architectures)


🧬 Ideal Background (Not Mandatory)

We often see strong candidates from:

  • Data infrastructure or platform engineering teams
  • AI-first startups or research-driven environments
  • High-scale product companies

Experience building:

  • Internal platforms used by 1000s of engineers
  • Systems serving millions of users / high throughput workloads
  • Multi-region, distributed cloud systems


🧠 The Kind of Problems You’ll Solve

  • Can LLMs replace traditional query optimizers?
  • How do we build self-healing data pipelines at scale?
  • Can data systems learn from every query and improve automatically?
  • How do we embed reasoning and planning into infrastructure layers?
  • What does a fully autonomous data platform look like?


Background: We Commonly See (But Not Limited To)

Our team often includes engineers from top-tier institutions and strong research or product backgrounds, including:

  • Leading engineering schools in India and globally
  • Engineers with experience in top product companies, AI startups, or research-driven environments

That said, we care far more about demonstrated ability, depth, and impact than pedigree alone.


Read more
Public Listed - Product Based Company


Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore)
7 - 10 yrs
₹40L - ₹75L / yr
skill iconJava
skill iconPython
skill iconGo Programming (Golang)
skill iconNodeJS (Node.js)
Database Design
+36 more

🧭 Tech Lead (Backend / Fullstack | 7–10 Years)

Location: Bangalore (On-Site, Hybrid)

Company Type: Public-Listed Product Company


We’re Building a “Top 1% Engineering Org”

We’re building a high-talent-density, AI-first R&D organization from scratch — inside a publicly listed company undergoing a full-scale transformation.

Think:

→ Rewriting legacy systems into AI-native architectures

→ Embedding LLMs + Agentic AI into core workflows

→ Reimagining platforms, infra, and data systems for the next decade

This is the kind of shift you’d expect from Google, Microsoft, or Meta —

Except you get to build it from day 0 → scale it globally.


About the Role / Team

We are building a next-generation AI-first R&D organization in Bengaluru, focused on solving complex problems across LLMs, Agentic AI systems, distributed computing, and enterprise-scale architectures.

This initiative is part of a publicly listed global company investing heavily in AI-driven transformation, re-architecting its platforms into intelligent, autonomous systems powered by large language models, workflows, and decision engines.


You will be working on:

  • Agentic AI systems & LLM-powered workflows
  • Distributed, scalable backend systems
  • Enterprise-grade AI platforms
  • Automation-first engineering environments

🚀 The Mandate

Lead execution of mission-critical systems while staying hands-on — bridging architecture and delivery.


🧩 What You’ll Do

  • Own end-to-end delivery of complex engineering initiatives (0→1, 1→N)
  • Design systems across backend + frontend (if fullstack)
  • Translate ambiguous problems into structured technical solutions
  • Drive engineering best practices, code quality, and velocity
  • Mentor engineers and elevate team performance
  • Collaborate with stakeholders on roadmap and execution strategy


🧠 What We’re Looking For

  • Strong experience in backend systems + optional frontend frameworks
  • Proven ability to lead projects and deliver at scale
  • Solid understanding of system design and architecture patterns
  • Ability to balance speed vs quality vs scalability trade-offs
  • Strong communication and leadership without authority
  • Strong coding skills in Python / Java / Go / Node.js
  • Solid understanding of data structures, system design basics, and backend architecture
  • Experience building scalable APIs and services
  • Familiarity or curiosity around AI/LLMs, async systems, or event-driven design
  • Strong debugging, problem-solving, and ownership mindset


Nice to Have

  • Experience integrating LLMs, vector databases, or AI pipelines
  • Contributions to architecture at scale
  • Experience with Agentic AI / LLM orchestration frameworks
  • Background in product engineering or platform companies
  • Exposure to global-scale systems (millions of users / high throughput)


🔥 What Sets You Apart

  • Experience leading platform builds or major system rewrites
  • Exposure to AI systems, LLM integrations, or intelligent workflows
  • Built platforms used by millions of users / high-throughput systems
  • Experience with event-driven systems, stream processing, or infra platforms
  • Prior work on AI/ML platforms, model serving, or intelligent systems


Background: We Commonly See (But Not Limited To)

Our team often includes engineers from top-tier institutions and strong research, product, DeepTech, or AI product backgrounds, including:

  • Leading engineering schools in India and globally
  • Engineers with experience in top product companies, AI startups, or research-driven environments

That said, we care far more about demonstrated ability, depth, and impact than pedigree alone.


Read more
SAAS Industry


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹25L / yr
skill iconAmazon Web Services (AWS)
skill iconNodeJS (Node.js)
RESTful APIs
NOSQL Databases
Systems design
+39 more

Job Details

Job Title: Senior Backend Engineer

Industry: SAAS

Function: Information Technology

Experience Required: 5-8 years

Working Days: 6 days a week (5 days in office, Saturdays WFH)

Employment Type: Full Time

Job Location: Bangalore

CTC Range: Best in Industry

 

Preferred Skills: AWS, NodeJS, RESTful APIs, NoSQL

 

Criteria

· Minimum 5+ years in backend engineering with strong system design expertise

· Experience building scalable systems from scratch

· Expert-level proficiency in Node.js

· Deep understanding of distributed systems

· Strong NoSQL design skills

· Hands-on AWS cloud experience

· Proven leadership and mentoring capability

· Preferred candidates from SAAS/Software/IT Services based startups or scaleup companies

 

Job Description

The Role:

What You’ll Build:

1. System Architecture & Design

● Architect highly scalable backend systems from the ground up

● Define technology choices: frameworks, databases, queues, caching layers

● Evaluate microservices vs monoliths based on product stage

● Design REST, GraphQL, and real-time WebSocket APIs

● Build event-driven systems for asynchronous processing

● Architect multi-tenant systems with strict data isolation

● Maintain architectural documentation and technical specs
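
The multi-tenant isolation called out above is often enforced at the data-access layer, so no code path can read another tenant's rows. A minimal sketch, using an in-memory table and an illustrative `scopedQuery` helper (both hypothetical, not from any specific codebase):

```javascript
// Multi-tenant row isolation sketch: every read goes through a wrapper
// that injects the caller's tenant id before any other filtering.
// The in-memory "table" and field names are illustrative.
const rows = [
  { tenantId: 't1', id: 'a', name: 'chair-3d' },
  { tenantId: 't2', id: 'b', name: 'sofa-3d' },
];

function scopedQuery(tenantId, predicate = () => true) {
  // The tenant filter is applied first and cannot be overridden by callers.
  return rows.filter((r) => r.tenantId === tenantId && predicate(r));
}

console.log(scopedQuery('t1').map((r) => r.id)); // ['a'] -- t2's rows are invisible
```

In a real system the same shape maps onto a query builder or ORM middleware that appends a `WHERE tenant_id = ?` clause to every statement.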

2. Core Backend Services

● Build high-performance APIs for 3D content, XR experiences, analytics, and user interactions

● Create 3D asset processing pipelines for uploads, conversions, and optimization

● Develop distributed job workers for CPU/GPU-intensive tasks

● Build authentication/authorization systems (RBAC)

● Implement billing, subscription, and usage metering

● Build secure webhook systems and third-party integration APIs

● Create real-time collaboration features via WebSockets/SSE
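
The RBAC bullet above reduces, at its core, to a role-to-permission lookup. A minimal sketch; the role and permission names are illustrative assumptions, not from any specific product:

```javascript
// Role-based access control (RBAC) check: a user holds roles, each role
// maps to a set of permissions, and access is granted if any role allows it.
const ROLE_PERMISSIONS = {
  admin: new Set(['asset:read', 'asset:write', 'billing:manage']),
  editor: new Set(['asset:read', 'asset:write']),
  viewer: new Set(['asset:read']),
};

function can(user, permission) {
  return (user.roles || []).some(
    (role) => ROLE_PERMISSIONS[role]?.has(permission) ?? false
  );
}

// Example: an editor can write assets but cannot manage billing.
const editor = { id: 'u1', roles: ['editor'] };
console.log(can(editor, 'asset:write'));    // true
console.log(can(editor, 'billing:manage')); // false
```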

3. Data Architecture & Databases

● Design scalable schemas for 3D metadata, XR sessions, and analytics

● Model complex product catalogs with variants and hierarchies

● Implement Redis-based caching strategies

● Build search and indexing systems (Elasticsearch/Algolia)

● Architect ETL pipelines and data warehouses

● Implement sharding, partitioning, and replication strategies

● Design backup, restore, and disaster recovery workflows
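
The Redis-based caching bullet above usually means the cache-aside pattern. A sketch under stated assumptions: a `Map` with expiry timestamps stands in for Redis so the example is self-contained, and the function names are illustrative:

```javascript
// Cache-aside: try the cache first; on a miss, load from the source of
// truth and populate the cache with a TTL. In production the same get/set
// shape maps onto Redis GET and SET with an EX (expiry) option.
const store = new Map(); // key -> { value, expiresAt }

async function cachedFetch(key, ttlMs, loadFn) {
  const hit = store.get(key);
  if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit
  const value = await loadFn(key); // cache miss: hit the database
  store.set(key, { value, expiresAt: Date.now() + ttlMs });
  return value;
}

// Example: the loader runs once; the second call is served from cache.
let dbCalls = 0;
const load = async (id) => { dbCalls += 1; return { id, name: 'demo-asset' }; };

(async () => {
  await cachedFetch('asset:42', 60_000, load);
  await cachedFetch('asset:42', 60_000, load);
  console.log(dbCalls); // 1
})();
```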

4. Scalability & Performance

● Build systems designed for 10x–100x traffic growth

● Implement load balancing, autoscaling, and distributed processing

● Optimize API response times and database performance

● Implement global CDN delivery for heavy 3D assets

● Build rate limiting, throttling, and backpressure mechanisms

● Optimize storage and retrieval of large 3D files

● Profile and improve CPU, memory, and network performance
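
The rate limiting and throttling bullet above is commonly implemented as a token bucket: each client gets a bucket that refills at a fixed rate, and a request is allowed only if a token is available. A minimal sketch with illustrative parameters and an injectable clock for testability:

```javascript
// Token-bucket rate limiter: capacity caps bursts, refillPerSec sets the
// sustained rate. The clock is injectable so behavior is deterministic.
function createBucket(capacity, refillPerSec, now = Date.now) {
  let tokens = capacity;
  let last = now();
  return {
    tryRemove() {
      const t = now();
      tokens = Math.min(capacity, tokens + ((t - last) / 1000) * refillPerSec);
      last = t;
      if (tokens >= 1) { tokens -= 1; return true; }
      return false; // throttled: caller should return 429 or back off
    },
  };
}

// Example with a frozen clock: a bucket of 3 allows 3 requests, then throttles.
const bucket = createBucket(3, 1, () => 0);
console.log([1, 2, 3, 4].map(() => bucket.tryRemove())); // [ true, true, true, false ]
```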

5. Infrastructure & DevOps

● Architect AWS infrastructure (EC2, S3, Lambda, RDS, ElastiCache)

● Build CI/CD pipelines for automated deployments and rollbacks

● Use IaC tools (Terraform/CloudFormation) for infra provisioning

● Set up monitoring, logging, and alerting systems

● Use Docker + Kubernetes for container orchestration

● Implement security best practices for data, networks, and secrets

● Define disaster recovery and business continuity plans

6. Integration & APIs

● Build integrations with Shopify, WooCommerce, Magento

● Design webhook systems for real-time events

● Build SDKs, client libraries, and developer tools

● Integrate payment gateways (Stripe, Razorpay)

● Implement SSO and OAuth for enterprise customers

● Define API versioning and lifecycle/deprecation strategies

7. Data Processing & Analytics

● Build analytics pipelines for engagement, conversions, and XR performance

● Process high-volume event streams at scale

● Build data warehouses for BI and reporting

● Develop real-time dashboards and insights systems

● Implement analytics export pipelines and platform integrations

● Enable A/B testing and experimentation frameworks

● Build personalization and recommendation systems
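
Processing high-volume event streams, as described above, usually means folding events into aggregates as they arrive rather than storing every raw event. A minimal per-type counter sketch; the event names are illustrative:

```javascript
// Streaming aggregation: each incoming event increments a per-type counter,
// and snapshot() exposes the current totals for a dashboard or export job.
function makeAggregator() {
  const counts = new Map();
  return {
    ingest(event) {
      counts.set(event.type, (counts.get(event.type) || 0) + 1);
    },
    snapshot() {
      return Object.fromEntries(counts);
    },
  };
}

const agg = makeAggregator();
['xr.view', 'xr.view', 'xr.convert'].forEach((type) => agg.ingest({ type }));
console.log(agg.snapshot()); // { 'xr.view': 2, 'xr.convert': 1 }
```

At scale the same fold runs inside a stream processor (e.g. a Kafka consumer), with counters flushed periodically to a warehouse.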

 

Technical Stack:

1. Backend Languages & Frameworks

● Primary: Node.js (Express, NestJS), Python (FastAPI, Django)

● Secondary: Go, Java/Kotlin (Spring)

● APIs: REST, GraphQL, gRPC


2. Databases & Storage

● SQL: PostgreSQL, MySQL

● NoSQL: MongoDB, DynamoDB

● Caching: Redis, Memcached

● Search: Elasticsearch, Algolia

● Storage/CDN: AWS S3, CloudFront

● Queues: Kafka, RabbitMQ, AWS SQS

 

3. Cloud & Infrastructure

● Cloud: AWS (primary), GCP/Azure (nice to have)

● Compute: EC2, Lambda, ECS, EKS

● Infrastructure: Terraform, CloudFormation

● CI/CD: GitHub Actions, Jenkins, CircleCI

● Containers: Docker, Kubernetes

 

4. Monitoring & Operations 

● Monitoring: Datadog, New Relic, CloudWatch

● Logging: ELK Stack, CloudWatch Logs

● Error Tracking: Sentry, Rollbar

● APM tools

 

5. Security & Auth

● Auth: JWT, OAuth 2.0, SAML

● Secrets: AWS Secrets Manager, Vault

● Security: Encryption (at rest/in transit), TLS/SSL, IAM

 


What We’re Looking For:

1. Must-Haves

● 5+ years in backend engineering with strong system design expertise

● Experience building scalable systems from scratch

● Expert-level proficiency in at least one backend stack (Node, Python, Go, Java)

● Deep understanding of distributed systems and microservices

● Strong SQL/NoSQL design skills with performance optimization

● Hands-on AWS cloud experience

● Ability to write high-quality production code daily

● Experience building and scaling RESTful APIs

● Strong understanding of caching, sharding, horizontal scaling

● Solid security and best-practice implementation experience

● Proven leadership and mentoring capability


2. Highly Desirable

● Experience with large file processing (3D, video, images)

● Background in SaaS, multi-tenancy, or e-commerce

● Experience with real-time systems (WebSockets, streams)

● Knowledge of ML/AI infrastructure

● Experience with HA systems, DR planning

● Familiarity with GraphQL, gRPC, event-driven systems

● DevOps/infrastructure engineering background

● Experience with XR/AR/VR backend systems

● Open-source contributions or technical writing

● Prior senior technical leadership experience

 

Technical Challenges You’ll Solve:

● Designing large-scale 3D asset processing pipelines

● Serving XR content globally with ultra-low latency

● Scaling from thousands to millions of daily requests

● Efficiently handling CPU/GPU-heavy workloads

● Architecting multi-tenancy with complete data isolation

● Managing billions of analytics events at scale

● Building future-proof APIs with backward compatibility

 

Why company:

● Architectural Ownership: Build foundational systems from scratch

● Deep Technical Work: Solve distributed systems and scaling challenges

● Hands-On Impact: Design and code mission-critical infrastructure

● Diverse Problems: APIs, infra, data, ML, XR, asset processing

● Massive Scale Opportunity: Build systems for exponential growth

● Modern Stack and best practices

● Product Impact: Your architecture directly powers millions of users

● Leadership Opportunity: Shape engineering culture and direction

● Learning Environment: Stay at the forefront of backend engineering

● Backed by AWS, Microsoft, Google

 

Location & Work Culture:

● Location: Bengaluru

● Schedule: 6 days a week (5 days in-office, Saturdays WFH)

● Culture: Builder mindset, strong ownership, technical excellence

● Team: Small, highly skilled backend and infra team

● Resources: AWS credits, latest tooling, learning budget

 
