
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

TalentXO
Posted by tabbasum shaikh
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Backend Development
Python
Java
SQL


Role & Responsibilities

As a Founding Engineer, you'll join the engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.
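
For illustration only, here is a minimal sketch of the kind of usage-metered billing calculation such a platform automates; the tier boundaries, rates, and function name are hypothetical and not taken from the actual product.

    from decimal import Decimal

    # Hypothetical graduated pricing tiers: (units covered up to, price per unit).
    TIERS = [(10_000, Decimal("0.010")), (100_000, Decimal("0.008")), (None, Decimal("0.005"))]

    def metered_charge(units_used: int) -> Decimal:
        """Compute a graduated (tiered) charge for one billing period."""
        charge, lower = Decimal("0"), 0
        for upper, rate in TIERS:
            band = (units_used - lower) if upper is None else (min(units_used, upper) - lower)
            if band <= 0:
                break
            charge += band * rate  # only the units falling inside this band are charged at this rate
            lower = upper if upper is not None else units_used
        return charge.quantize(Decimal("0.01"))

    print(metered_charge(25_000))  # 10000 * 0.010 + 15000 * 0.008 = 220.00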

This role is ideal for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems require creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

Key Responsibilities

  • Build core platform features: Develop robust APIs, services, and integrations that power billing automation and revenue recognition capabilities.
  • Work across the full stack: Contribute to backend services and frontend interfaces to ensure seamless user experiences.
  • Implement critical integrations: Connect the platform with external systems including CRMs, data warehouses, ERPs, and payment processors.
  • Optimize for scale: Design systems that handle complex pricing models, high-volume usage data, and real-time financial calculations.
  • Drive quality and best practices: Write clean, maintainable code and participate in code reviews and architectural discussions.
  • Solve complex problems: Debug issues across the stack and collaborate with cross-functional teams to address evolving client needs.

The Impact You'll Make

  • Power business growth: Enable fast-growing B2B companies to scale billing and revenue operations efficiently.
  • Build critical financial infrastructure: Contribute to systems handling high-value transactions with accuracy and compliance.
  • Shape product direction: Join during a scaling phase where your contributions directly impact product evolution and customer success.
  • Accelerate your expertise: Gain deep exposure to financial systems, B2B SaaS operations, and enterprise-grade software development.
  • Drive the future of B2B commerce: Help build infrastructure supporting next-generation pricing models, from usage-based to value-based billing.

Ideal Candidate Profile

Experience

  • 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems.
  • Strong backend development experience using one or more frameworks: FastAPI / Django (Python), Spring (Java), or Express (Node.js).
  • Deep understanding of relevant libraries, tools, and best practices within the chosen backend framework.
  • Strong experience with databases (SQL & NoSQL), including efficient data modeling and performance optimization.
  • Proven experience designing, building, and maintaining APIs, services, and backend systems with solid system design and clean code practices.

Domain

  • Experience with financial systems, billing platforms, or fintech applications is highly preferred.

Company Background

  • Experience working in product companies or startups (preferably Series A to Series D).

Education

  • Candidates from Tier 1 engineering institutes (IITs, BITS, etc.) are highly preferred.



Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Thiruvananthapuram
9 - 12 yrs
₹21L - ₹27L / yr
Java
Spring
Apache Kafka
SQL
PostgreSQL
+16 more

JOB DETAILS:

Job Title: Java Lead-Java, MS, Kafka-TVM - Java (Core & Enterprise), Spring/Micronaut, Kafka

Industry: Global Digital Transformation Solutions Provider

Salary: Best in Industry

Experience: 9 to 12 years

Location: Trivandrum, Thiruvananthapuram

 

Job Description

Experience

  • 9+ years of experience in Java-based backend application development
  • Proven experience building and maintaining enterprise-grade, scalable applications
  • Hands-on experience working with microservices and event-driven architectures
  • Experience working in Agile and DevOps-driven development environments

 

Mandatory Skills

  • Advanced proficiency in core Java and enterprise Java concepts
  • Strong hands-on experience with Spring Framework and/or Micronaut for building scalable backend applications
  • Strong expertise in SQL, including database design, query optimization, and performance tuning
  • Hands-on experience with PostgreSQL or other relational database management systems
  • Strong experience with Kafka or similar event-driven messaging and streaming platforms
  • Practical knowledge of CI/CD pipelines using GitLab
  • Experience with Jenkins for build automation and deployment processes
  • Strong understanding of GitLab for source code management and DevOps workflows

 

Responsibilities

  • Design, develop, and maintain robust, scalable, and high-performance backend solutions
  • Develop and deploy microservices using Spring or Micronaut frameworks
  • Implement and integrate event-driven systems using Kafka (see the illustrative consumer sketch after this list)
  • Optimize SQL queries and manage PostgreSQL databases for performance and reliability
  • Build, implement, and maintain CI/CD pipelines using GitLab and Jenkins
  • Collaborate with cross-functional teams including product, QA, and DevOps to deliver high-quality software solutions
  • Ensure code quality through best practices, reviews, and automated testing
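
Purely as an illustration of the event-driven consumption pattern referenced in the responsibilities above (the stack here is Java with Spring/Micronaut; this sketch uses Python's kafka-python client instead, and the topic name and broker address are hypothetical):

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    # Hypothetical topic and broker; real values would come from deployment configuration.
    consumer = KafkaConsumer(
        "orders.events",
        bootstrap_servers="localhost:9092",
        group_id="billing-service",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:  # blocks and yields records as they arrive
        event = message.value
        print(f"partition={message.partition} offset={message.offset} type={event.get('type')}")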

 

Good-to-Have Skills

  • Strong problem-solving and analytical abilities
  • Experience working with Agile development methodologies such as Scrum or Kanban
  • Exposure to cloud platforms such as AWS, Azure, or GCP
  • Familiarity with containerization and orchestration tools such as Docker or Kubernetes

 

Skills: Java, Spring Boot, Kafka development, CI/CD, PostgreSQL, GitLab

 

Must-Haves

Java Backend (9+ years), Spring Framework/Micronaut, SQL/PostgreSQL, Kafka, CI/CD (GitLab/Jenkins)


 

 

*******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: only Trivandrum

F2F Interview on 21st Feb 2026

 

Remote only
3 - 6 yrs
₹4L - ₹7L / yr
NodeJS (Node.js)
PHP
React Native
SQL
Javascript
+6 more

Software Developer (Node.js / PHP / React Native)

Experience: 3+ Years

Employment Type: Full-Time


Role Summary


We are looking for a skilled software developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and online examination systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or education/examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent


CNV Labs India Pvt Ltd iCloudEMS
Posted by Shital ICloudEMS
Remote only
3 - 5 yrs
₹3L - ₹5L / yr
PHP
SQL
NodeJS (Node.js)
React Native
edtech

Role Summary


We are looking for a skilled Software Developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and Online Examination Systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or Education/Examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent

Apply: Share your resume with project details and current CTC.

CNV Labs India Pvt Ltd iCloudEMS
Posted by Shital ICloudEMS
Remote only
4 - 8 yrs
₹4L - ₹8L / yr
PHP
NodeJS (Node.js)
React Native
SQL

We are looking for a skilled Node.js Developer with React Native experience to build, enhance, and maintain ERP and EdTech platforms. The role involves developing scalable backend services, integrating ERP modules, and supporting education-focused systems such as LMS, student management, exams, and fee management.


Key Responsibilities


Develop and maintain backend services using Node.js, React Native, and PHP.


Build and integrate ERP modules for EdTech platforms (Admissions, Students, Exams, Attendance, Fees, Reports).


Design and consume RESTful APIs and third-party integrations (payment gateway, SMS, email).


Work with databases (MySQL / MongoDB / PostgreSQL) for high-volume education data.


Optimize application performance, scalability, and security.


Collaborate with frontend, QA, and product teams.


Debug, troubleshoot, and provide production support.


Required Skills


Strong experience in Node.js (Express.js / NestJS).


Working experience in PHP (Core PHP / Laravel / CodeIgniter).


Hands-on experience with ERP systems.


Domain experience in EdTech / Education ERP / LMS.


Strong knowledge of MySQL and database design.


Experience with authentication, role-based access, and reporting.


Familiarity with Git, APIs, and server environments.


Preferred Skills


Experience with online examination systems.


Knowledge of cloud platforms (AWS / Azure).


Understanding of security best practices (CSRF, XSS, SQL Injection).


Exposure to microservices or modular architecture.


Qualification


Bachelor’s degree in Computer Science or equivalent experience.


3–6 years of relevant experience in Node.js & PHP development


Skills:- NodeJS (Node.js), PHP, ERP management, EdTech, MySQL, API and Amazon Web Services (AWS)



Cansvolution
Posted by Chayan Bajaj
Indore
1 - 2 yrs
₹3L - ₹5L / yr
Go Programming (Golang)
RESTful APIs
SQL
MySQL
PostgreSQL
+4 more

 Hiring: Golang Developer (1–2 Years Experience)

Location: Indore (On-site)

Company: Cansvolution Pvt. Ltd.

Experience: 1–2 Years

Employment Type: Full-time

About the Role

We are looking for a passionate Golang Developer with 1–2 years of hands-on experience in backend development. The ideal candidate should have strong fundamentals in Go, APIs, and database handling, and should be comfortable working in a fast-paced startup environment.

Key Responsibilities

  • Develop, test, and maintain backend services using Golang
  • Design and build RESTful APIs
  • Work with databases like MySQL / PostgreSQL / MongoDB
  • Implement scalable and high-performance applications
  • Collaborate with frontend and product teams
  • Debug, optimize, and improve existing codebase
  • Write clean, maintainable, and efficient code

Required Skills

  • 1–2 years of hands-on experience in Golang
  • Strong understanding of REST APIs
  • Experience with Gin / Echo / Fiber frameworks (any one preferred)
  • Knowledge of SQL/NoSQL databases
  • Understanding of microservices architecture
  • Familiarity with Git
  • Basic understanding of Docker (preferred)

Good to Have

  • Experience with cloud platforms (AWS/GCP/Azure)
  • Understanding of CI/CD pipelines
  • Knowledge of message queues (Kafka/RabbitMQ)


CloudThat
Posted by shubhangi shrivastava
Bengaluru (Bangalore)
3 - 6 yrs
₹7L - ₹10L / yr
HTML/CSS
Python
Java
SQL
C++
+2 more

About CloudThat:-

At CloudThat, we are driven by our mission to empower professionals and businesses to harness the full potential of cloud technologies. As a leader in cloud training and consulting services in India, our core values guide every decision we make and every customer interaction we have.


Role Overview:-

We are looking for a passionate and experienced Technical Trainer to join our expert team and help drive knowledge adoption across our customers, partners, and internal teams.


Key Responsibilities:

• Deliver high-quality, engaging technical training sessions both in-person and virtually to customers, partners, and internal teams.

• Design and develop training content, labs, and assessments based on business and technology requirements.

• Collaborate with internal and external SMEs to draft course proposals aligned with customer needs and current market trends.

• Assist in training and onboarding of other trainers and subject matter experts to ensure quality delivery of training programs.

• Create immersive lab-based sessions using diagrams, real-world scenarios, videos, and interactive exercises.

• Develop instructor guides, certification frameworks, learner assessments, and delivery aids to support end-to-end training delivery.

• Integrate hands-on project-based learning into courses to simulate practical environments and deepen understanding.

• Support the interpersonal and facilitation aspects of training, fostering an inclusive, engaging, and productive learning environment.


Skills & Qualifications:

• Experience developing content for professional certifications or enterprise skilling programs.

• Familiarity with emerging technology areas such as cloud computing, AI/ML, DevOps, or data engineering.


Technical Competencies:

  • Expertise in languages like C, C++, Python, Java
  • Understanding of algorithms and data structures 
  • Expertise in SQL

Or apply directly: https://cloudthat.keka.com/careers/jobdetails/95441


Delhi, Gurugram, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹4L - ₹14L / yr
React.js
NodeJS (Node.js)
TypeScript
Javascript
MySQL
+3 more

 Job Overview:

We are looking for a skilled Full Stack Developer with strong experience in Next.js, Node.js, and React.js. The ideal candidate should be capable of building scalable web applications, leading modules, and contributing to both frontend and backend development.

Key Responsibilities:

  • Design, develop, and maintain full-stack applications using Next.js, Node.js and React.js
  • Write clean, maintainable, and scalable code
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Optimize applications for performance, scalability, and security
  • Mentor junior developers and conduct code reviews

Required Skills:

  • 4+ years of experience with Next.js, React.js, and Node.js
  • Strong knowledge of JavaScript, HTML, CSS
  • Experience with REST APIs, MongoDB, or SQL
  • Familiarity with version control (Git) and CI/CD tools


Why Join Us?

  • Career Advancement Opportunities and professional growth.
  • Supportive work environment with learning opportunities


Delhi, Gurugram, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹13L / yr
React.js
NodeJS (Node.js)
NextJs (Next.js)
TypeScript
RESTful APIs
+3 more

Job Overview:

We are looking for a skilled Full Stack Developer with strong experience in Next.js, Node.js, and React.js. The ideal candidate should be capable of building scalable web applications, leading modules, and contributing to both frontend and backend development.

Key Responsibilities:

  • Design, develop, and maintain full-stack applications using Next.js, Node.js and React.js
  • Write clean, maintainable, and scalable code
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Optimize applications for performance, scalability, and security
  • Mentor junior developers and conduct code reviews

Required Skills:

  • 4+ years of experience with Next.js, React.js, and Node.js
  • Strong knowledge of JavaScript, HTML, CSS
  • Experience with REST APIs, MongoDB, or SQL
  • Familiarity with version control (Git) and CI/CD tools


Why Join Us?

  • Career Advancement Opportunities and professional growth.
  • Supportive work environment with learning opportunities


AI Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio
Data engineering
Business Intelligence (BI)
Tableau
PowerBI
+51 more

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
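
As a rough, non-authoritative illustration of the object-storage and Parquet integration work described below, here is a small pyarrow sketch; the bucket, path, and column names are hypothetical and S3 credentials are assumed to come from the environment:

    import pyarrow.dataset as ds  # pip install pyarrow

    # Hypothetical S3 prefix holding curated Parquet files.
    orders = ds.dataset("s3://example-lake/curated/orders/", format="parquet")

    # Push a simple projection and filter down to the scan, then materialise a small result.
    table = orders.to_table(
        columns=["order_id", "amount", "region"],
        filter=ds.field("region") == "APAC",
    )
    print(table.num_rows, table.schema)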


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Mango Sciences
Remote only
5 - 7 yrs
₹10L - ₹15L / yr
Python
SQL
SQL queries

Database Programmer / Developer (SQL, Python, Healthcare)

Job Summary

We are seeking a skilled and experienced Database Programmer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our database systems, with a strong focus on data integrity, performance, and security. The role requires expertise in SQL, strong programming skills in Python, and prior experience working within the healthcare domain to handle sensitive data and complex regulatory requirements.
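
By way of illustration only, a minimal sketch of the SQL-plus-Python style of work described above, using the standard-library sqlite3 driver; the table, columns, and values are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE claims (claim_id TEXT PRIMARY KEY, patient_id TEXT, amount REAL, status TEXT)")
    conn.executemany(
        "INSERT INTO claims VALUES (?, ?, ?, ?)",
        [("C1", "P1", 120.0, "PAID"), ("C2", "P1", 80.0, "DENIED"), ("C3", "P2", 250.0, "PAID")],
    )

    # Parameterised aggregate query: total paid amount per patient.
    rows = conn.execute(
        "SELECT patient_id, SUM(amount) FROM claims WHERE status = ? GROUP BY patient_id",
        ("PAID",),
    ).fetchall()
    print(rows)  # [('P1', 120.0), ('P2', 250.0)]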

Key Responsibilities

  • Design, implement, and maintain scalable and efficient database schemas and systems.
  • Develop and optimize complex SQL queries, stored procedures, and triggers for data manipulation and reporting.
  • Write and maintain Python scripts to automate data pipelines, ETL processes, and database tasks.
  • Collaborate with data analysts, software developers, and other stakeholders to understand data requirements and deliver robust solutions.
  • Ensure data quality, integrity, and security, adhering to industry standards and regulations such as HIPAA.
  • Troubleshoot and resolve database performance issues, including query tuning and indexing.
  • Create and maintain technical documentation for database architecture, processes, and applications.

Required Qualifications

  • Experience:
  • Proven experience as a Database Programmer, SQL Developer, or a similar role.
  • Demonstrable experience working with database systems, including data modeling and design.
  • Strong background in developing and maintaining applications and scripts using Python.
  • Direct experience within the healthcare domain is mandatory, including familiarity with medical data (e.g., patient records, claims data) and related regulatory compliance (e.g., HIPAA).
  • Technical Skills:
  • Expert-level proficiency in Structured Query Language (SQL) and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
  • Solid programming skills in Python, including experience with relevant libraries for data handling (e.g., Pandas, SQLAlchemy).
  • Experience with data warehousing concepts and ETL (Extract, Transform, Load) processes.
  • Familiarity with version control systems, such as Git.

Preferred Qualifications

  • Experience with NoSQL databases (e.g., MongoDB, Cassandra).
  • Knowledge of cloud-based data platforms (e.g., AWS, GCP, Azure).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with other programming languages relevant to data science or application development.

Education

  • Bachelor’s degree in computer science, Information Technology, or a related field.

 

To be considered for the next stage of the process, please fill out the Google form below with your updated resume.


https://forms.gle/f7zgYAa632ww5Teb6

Remote only
2 - 4 yrs
₹3L - ₹4L / yr
.NET
SQL
PostgreSQL
RESTful APIs
Git
+4 more

We are looking for a highly skilled Full Stack Developer to design and scale our real-time vehicle tracking platform. You will be responsible for building high-performance web applications that process live GPS data and visualize it through interactive map interfaces.

Key Responsibilities

Real-Time Data Processing: Develop robust back-end services to ingest and process high-frequency GPS data from IoT devices.

Map Integration: Design and implement interactive map interfaces using tools like Google Maps API or Mapbox for real-time asset visualization.

Geofencing & Alerts: Build server-side logic for complex geospatial features, including geofencing, route optimization, and automated speed/entry alerts.

API Development: Create and maintain scalable RESTful or GraphQL APIs to bridge communication between vehicle hardware, the database, and the user dashboard.

Database Management: Architect and optimize databases (e.g., PostgreSQL with PostGIS) for efficient storage and querying of spatial-temporal data.

Performance Optimization: Ensure high availability and low-latency response times for tracking thousands of simultaneous vehicle connections.

Required Technical Skills

Front-End: Proficiency in React.js, Angular, or Vue.js, with experience in state management (Redux/MobX).

Back-End: Strong experience in Node.js (Express/NestJS), Python (Django/Flask), or Java (Spring Boot).

Mapping: Hands-on experience with Google Maps SDK, Leaflet, or OpenLayers.

Real-time Communication: Expertise in WebSockets or Socket.IO for live data streaming.

Databases: Proficiency in SQL (PostgreSQL/MySQL) and NoSQL (MongoDB/Redis) for caching.

Cloud & DevOps: Familiarity with AWS (EC2, Lambda), Docker, and Kubernetes for scalable deployment.

Qualifications

Education: Bachelor’s or Master’s degree in Computer Science or a related field.

Experience: 3–6+ years of professional full-stack development experience.

Niche Knowledge: Prior experience with telematics, IoT protocols (MQTT, HTTP), or GPS-based applications is highly preferred.

Performio

Agency job
via maple green services by Elvin Johnson
Remote only
4 - 6 yrs
₹15L - ₹20L / yr
ETL
SQL

The Opportunity:


As a Technical Support Consultant, you will play a significant role in Performio providing world class support to our customers. With our tried and tested onboarding process, you will soon become familiar with the Performio product and company.

You will draw on previous support experience to monitor for new support requests in Zendesk, provide initial triage with 1st and 2nd level support, ensuring the customer is kept up to date and the request is completed within a timely manner.

You will collaborate with other teams to ensure more complex requests are managed efficiently and will provide feedback to help improve product and solution knowledge as well as processes.

Answers to questions asked by customers that are not in the knowledge base will be reviewed and added to the knowledge base if appropriate. We're looking for someone who thinks ahead, recognising opportunities to help customers help themselves.

You will help out with configuration changes and testing, furthering your knowledge and experience of Performio. You may also be expected to help out with Managed Service, Implementation and Work Order related tasks from time to time.


About Performio:


Performio is the last ICM software you'll ever need. It allows you to manage incentive compensation complexity and change over the long run by combining a structured plan builder and flexible data management, with a partner who will make you a customer for life.

Our people are highly-motivated and engaged professionals with a clear set of values and behaviors. We prove these values matter to us by living them each day. This makes Performio both a great place to work and a great company to do business with.

But a great team alone is not sufficient to win. We have solved the fundamental issue widespread in our industry: overly-rigid applications that cannot adapt to your needs, or overly-flexible ones that become impossible to maintain over time. Only Performio allows you to manage incentive compensation complexity and change over the long run by combining a structured plan builder and flexible data management. The component-based plan builder makes it easier to understand, change, and self-manage than traditional formula or rules-based solutions. Our ability to import data from any source, in any format, and perform in-app data transformations eliminates the pain of external processing and provides end-to-end data visibility. The combination of these two functions allows us to deliver more powerful reporting and insights. And while every vendor says they are a partner, we truly are one. We not only get your implementation right the first time, we enable you and give you the autonomy and control to make changes year after year. And unlike most, we support every part of your unique configuration. Performio is a partner that will make you a customer for life.

We have a global customer base across Australia, Asia, Europe, and the US in 25+ industries that includes many well-known companies like Toll Brothers, Abbott Labs, News Corp, Johnson & Johnson, Nikon, and Uber Freight.


What will you be doing:


● Monitoring and triaging new Support requests submitted by customers using our Zendesk Support Portal
● Providing 1st and 2nd line support for Support requests
● Investigate, reproduce and resolve Customer issues within the required Service Level Agreements
● Maintain our evolving knowledge base
● Clear and concise documentation of root causes and resolution
● Assist with the implementation and testing of Change Requests and implementation projects
● As your knowledge of the product grows, make recommendations for solutions based on client's requests
● Assist in educating our client's compensation administrators applying best practices


What we’re looking for:


● Passion for customer service with a communication style that can be adapted to suit the audience
● A problem solver with a range of troubleshooting methodologies
● Experience in the Sales Compensation industry
● Familiar with basic database concepts, spreadsheets and experienced in working with large datasets (Excel, Relational Database Tables, SQL, ETL or other types of tools/languages)
● 4+ years of experience in a similar role (experience with ICM software preferred)
● Experience with implementation & support of ICM solutions like SAP Commissions, Varicent, Xactly will be a big plus
● Positive Attitude - optimistic, cares deeply about company and customers
● High Emotional IQ - shows empathy, listens when appropriate, creates healthy conversation dynamic
● Resourceful - has a "I'll figure it out" attitude if something they need doesn't exist

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
5 - 8 yrs
₹11L - ₹20L / yr
PySpark
Apache Kafka
Data architecture
Amazon Web Services (AWS)
EMR
+32 more

JOB DETAILS:

* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5-8 years

* Location: Hyderabad

 

Job Summary

We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.
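
For orientation only, a minimal PySpark batch-transformation sketch of the kind of pipeline step this role covers; the S3 paths and column names are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

    # Hypothetical raw landing zone and curated output location.
    raw = spark.read.parquet("s3://example-raw/orders/")

    daily = (
        raw.withColumn("order_date", F.to_date("order_ts"))
           .groupBy("order_date", "region")
           .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
    )

    daily.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-curated/orders_daily/")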


Key Responsibilities

ETL Pipeline Development & Optimization

  • Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
  • Optimize data pipelines for performance, scalability, fault tolerance, and reliability.

Big Data Processing

  • Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
  • Ensure fault-tolerant, scalable, and high-performance data processing systems.

Cloud Infrastructure Development

  • Build and manage scalable, cloud-native data infrastructure on AWS.
  • Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.

Real-Time & Batch Data Integration

  • Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
  • Ensure consistency, data quality, and a unified view across multiple data sources and formats.

Data Analysis & Insights

  • Partner with business teams and data scientists to understand data requirements.
  • Perform in-depth data analysis to identify trends, patterns, and anomalies.
  • Deliver high-quality datasets and present actionable insights to stakeholders.

CI/CD & Automation

  • Implement and maintain CI/CD pipelines using Jenkins or similar tools.
  • Automate testing, deployment, and monitoring to ensure smooth production releases.

Data Security & Compliance

  • Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
  • Implement data governance practices ensuring data integrity, security, and traceability.

Troubleshooting & Performance Tuning

  • Identify and resolve performance bottlenecks in data pipelines.
  • Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.

Collaboration & Cross-Functional Work

  • Work closely with engineers, data scientists, product managers, and business stakeholders.
  • Participate in agile ceremonies, sprint planning, and architectural discussions.


Skills & Qualifications

Mandatory (Must-Have) Skills

  1. AWS Expertise
  • Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
  • Strong understanding of cloud-native data architectures.
  2. Big Data Technologies
  • Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
  • Experience with Apache Spark and Apache Kafka in production environments.
  3. Data Frameworks
  • Strong knowledge of Spark DataFrames and Datasets.
  4. ETL Pipeline Development
  • Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
  5. Database Modeling & Data Warehousing
  • Expertise in designing scalable data models for OLAP and OLTP systems.
  6. Data Analysis & Insights
  • Ability to perform complex data analysis and extract actionable business insights.
  • Strong analytical and problem-solving skills with a data-driven mindset.
  7. CI/CD & Automation
  • Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
  • Familiarity with automated testing and deployment workflows.

 

Good-to-Have (Preferred) Skills

  • Knowledge of Java for data processing applications.
  • Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
  • Familiarity with data governance frameworks and compliance tooling.
  • Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
  • Exposure to cost optimization strategies for large-scale cloud data platforms.

 

Skills: big data, scala spark, apache spark, ETL pipeline development

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Hyderabad

Note: If a candidate can join on short notice, is based in Hyderabad, and fits within the approved budget, we will proceed with an offer.

F2F Interview: 14th Feb 2026

3 days in office, Hybrid model.

 


Bengaluru (Bangalore)
4 - 6 yrs
₹8L - ₹14L / yr
React.js
NodeJS (Node.js)
HTML/CSS
Javascript
SQL
+2 more


Key Responsibilities :


- Develop backend services using Node.js, including API orchestration and integration with AI/ML services.


- Implement frontend redaction features using Redact.js, integrated into React.js dashboards.


- Collaborate with AI/ML engineers to embed intelligent feedback and behavioral analysis.


- Build secure, multi-tenant systems with role-based access control (RLS); a minimal access-check sketch follows this list.


- Optimize performance for real-time audio analysis and transcript synchronization.


- Participate in agile grooming sessions and contribute to architectural decisions.
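
The multi-tenant access-control item above, sketched in Python only to show the shape of the check (the backend here is Node.js; the role names and decorator are hypothetical):

    from functools import wraps

    ROLE_PERMISSIONS = {"admin": {"read", "write", "delete"}, "analyst": {"read"}}  # hypothetical roles

    def require_permission(permission):
        def decorator(handler):
            @wraps(handler)
            def wrapper(user, *args, **kwargs):
                if permission not in ROLE_PERMISSIONS.get(user.get("role"), set()):
                    raise PermissionError(f"role {user.get('role')!r} lacks {permission!r}")
                return handler(user, *args, **kwargs)
            return wrapper
        return decorator

    @require_permission("write")
    def update_transcript(user, transcript_id, text):
        return f"{user['name']} updated {transcript_id}"

    print(update_transcript({"name": "asha", "role": "admin"}, "T-1", "corrected text"))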


Required Skills :


- Experience with React.js or similar annotation/redaction libraries.


- Strong understanding of RESTful APIs, React.js, and Material-UI.


- Familiarity with Azure services, SQL, and authentication protocols (SSO, JWT).


- Experience with secure session management and data protection standards.


Preferred Qualifications :


- Exposure to AI/ML workflows and Python-based services.


- Experience with Livekit or similar real-time communication platforms.


- Familiarity with Power BI and accessibility standards (WCAG).


Soft Skills :


- Problem-solving mindset and adaptability.


- Ability to work independently and meet tight deadlines.

Wissen Technology
Posted by Janane Mohanasankaran
Mumbai, Pune
3 - 6 yrs
Best in industry
Python
PySpark
pandas
SQL
ADF
+2 more

* Python (3 to 6 years): Strong expertise in data workflows and automation

* Spark (PySpark): Hands-on experience with large-scale data processing

* Pandas: For detailed data analysis and validation

* Delta Lake: Managing structured and semi-structured datasets at scale

* SQL: Querying and performing operations on Delta tables

* Azure Cloud: Compute and storage services

* Orchestrator: Good experience with either ADF or Airflow
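
A compact sketch of how the pieces listed above typically fit together (PySpark reading a Delta table, SQL over a temp view, pandas for validation); it assumes the delta-spark package is configured in the session, and the path and columns are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("delta-validation").getOrCreate()

    df = spark.read.format("delta").load("/mnt/example/events")  # structured Delta table
    df.createOrReplaceTempView("events")

    summary = spark.sql("SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type")

    # Pull the small aggregate into pandas for detailed validation checks.
    pdf = summary.toPandas()
    assert (pdf["n"] > 0).all(), "every event type should have at least one row"
    print(pdf.head())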

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹53L - ₹70L / yr
Java
Microservices
CI/CD
MySQL
Scripting
+5 more

JOB DETAILS:

* Job Title: Engineering Manager

* Industry: Technology

* Salary: Best in Industry

* Experience: 9-12 years

* Location: Bengaluru

* Education: B.Tech in computer science or related field from Tier 1, Tier 2 colleges


Role & Responsibilities

We are seeking a visionary and decisive Engineering Manager to join our dynamic team. In this role, you will lead and inspire a talented team of software engineers, driving innovation and excellence in product development efforts. This is an exciting opportunity to influence and shape the future of our engineering organization.

 

Key Responsibilities-

As an Engineering Manager, you will be responsible for managing the overall software development life cycle of one product. You will work and manage a cross-functional team consisting of Backend Engineers, Frontend Engineers, QA, SDET, Product Managers, Product Designers, Technical Project Managers, Data Scientists, etc.

  • Responsible for mapping business objectives to an optimum engineering structure, including correct estimation of resource allocation.
  • Responsible for key technical and product decisions. Provide direction and mentorship to the team. Set up best practices for engineering.
  • Work closely with the Product Manager and help them in getting relevant inputs from the engineering team.
  • Plan and track the development and release schedules, proactively assess and mitigate risks. Prepare for contingencies and provide visible leadership in crisis.
  • Conduct regular 1:1s for performance feedback and lead their appraisals.
  • Responsible for driving good coding practices in the team like good quality code, documentation, timely bug fixing, etc.
  • Report on the status of development, quality, operations, and system performance to management.
  • Create and maintain an open and transparent environment that values speed and innovation and motivates engineers to build innovative and effective systems rapidly.


Ideal Candidate

  • Strong Engineering Manager / Technical Leadership Profile
  • Must have 9+ years of experience in software engineering with experience building complex, large-scale products
  • Must have 2+ years of experience as an Engineering Manager / Tech Lead with people management responsibilities
  • Strong technical foundation with hands-on experience in Java (or equivalent compiled language), scripting languages, web technologies, and databases (SQL/NoSQL)
  • Proven ability to solve large-scale technical problems and guide teams on architecture, design, quality, and best practices
  • Experience in leading cross-functional teams, planning and tracking delivery, mentoring engineers, conducting performance reviews, and driving engineering excellence
  • Must have strong experience working with Product Managers, UX designers, QA, and other cross-functional partners
  • Excellent communication and interpersonal skills to influence technical direction and stakeholder decisions
  • (Company): Product companies
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in computer science or related field from Tier 1, Tier 2 colleges
Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Thiruvananthapuram, Trivandrum
5 - 9 yrs
₹13L - ₹25L / yr
.NET
Javascript
Angular (2+)
Windows Azure
SQL Azure
+13 more

Job Details

- Job Title: Specialist I - Software Engineering-.Net Fullstack Lead-TVM

Industry: Global digital transformation solutions provider

Domain - Information technology (IT)

Experience Required: 5-9 years

Employment Type: Full Time

Job Location: Trivandrum, Thiruvananthapuram

CTC Range: Best in Industry

 

Job Description

· Minimum 5+ years of experience as a senior/lead .NET developer, including experience of the full development lifecycle and post-live support.

· Significant experience delivering software using Agile iterative delivery methodologies.

· JIRA knowledge preferred.

· Excellent ability to understand requirement/story scope and visualise technical elements required for application solutions.

· Ability to clearly articulate complex problems and solutions in terms that others can understand.

· Extensive experience with .NET backend API development.

· Significant experience of pipeline design, build and enhancement to support release cadence targets, including Infrastructure as Code (preferably Terraform).

· Strong understanding of HTML and CSS including cross-browser, compatibility, and performance.

· Excellent knowledge of unit and integration testing techniques.

· Azure knowledge (Web/Container Apps, Azure Functions, SQL Server).

· Kubernetes / Docker knowledge.

· Knowledge of JavaScript UI frameworks, ideally Vue.

· Extensive experience with source control (preferably Git).

· Strong understanding of RESTful services (JSON) and API Design.

· Broad knowledge of Cloud infrastructure (PaaS, DBaaS).

· Experience of mentoring and coaching engineers operating within a co-located environment. 

 

Skills: .Net Fullstack, Azure Cloudformation, Javascript, Angular

 

Must-Haves:

.Net (5+ years), Agile methodologies, RESTful API design, Azure (Web/Container Apps, Functions, SQL Server), Git source control

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Trivandrum

F2F Weekend Interview on 14th Feb 2026

NonStop io Technologies Pvt Ltd
Posted by Kalyani Wadnere
Pune
6 - 10 yrs
Best in industry
Java
Javascript
Spring Boot
Microservices
Hibernate (Java)
+6 more

Company Description

NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.


Role Description

This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.


Responsibilities:

● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications 

● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions 

● Code Reviews: Participate in code reviews to maintain high-quality standards 

● Troubleshooting: Debug and resolve application issues in a timely manner 

● Testing: Develop and execute unit and integration tests to ensure software reliability

● Optimize: Identify and address performance bottlenecks to enhance application performance 


Qualifications & Skills:

● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA 

● Familiarity with RESTful APIs and web services 

● Proficiency in working with relational databases like MySQL or PostgreSQL 

● Practical experience with AWS cloud services and building scalable, microservices-based architectures

● Experience with build tools like Maven or Gradle 

● Understanding of version control systems, especially Git 

● Strong understanding of object-oriented programming principles and design patterns 

● Familiarity with automated testing frameworks and methodologies 

● Excellent problem-solving skills and attention to detail 

● Strong communication skills and ability to work effectively in a collaborative team environment 


Why Join Us? 

● Opportunity to work on cutting-edge technology products 

● A collaborative and learning-driven environment 

● Exposure to AI and software engineering innovations 

● Excellent work ethic and culture 


If you're passionate about technology and want to work on impactful projects, we'd love to hear from you

CAW.Tech
Posted by Ranjana Singh
Bengaluru (Bangalore)
2 - 3 yrs
Best in industry
Apache Airflow
Azkaban
Amazon Web Services (AWS)
Python
Pipeline management
+7 more

Responsibilities:

  • Design, develop, and maintain efficient and reliable data pipelines.
  • Identify and implement process improvements, automating manual tasks and optimizing data delivery.
  • Build and maintain the infrastructure for data extraction, transformation, and loading (ETL) from diverse sources using SQL and AWS cloud technologies (a minimal orchestration sketch follows this list).
  • Develop data tools and solutions to empower our analytics and data science teams, contributing to product innovation.
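
Shown only to indicate the shape of the orchestration work, a minimal Airflow DAG sketch with a single Python task; the DAG id, schedule, and extract_load function are hypothetical and Airflow 2.x is assumed:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_load():
        # Placeholder for pulling from a source system and loading into the warehouse.
        print("extract + load step would run here")

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_load", python_callable=extract_load)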


Qualifications:

Must Have:

  • 2-3 years of experience in a Data Engineering role.
  • Familiarity with data pipeline and workflow management tools (e.g., Airflow, Luigi, Azkaban).
  • Experience with AWS cloud services.
  • Working knowledge of object-oriented/functional scripting in Python
  • Experience building and optimizing data pipelines and datasets.
  • Strong analytical skills and experience working with structured and unstructured data.
  • Understanding of data transformation, data structures, dimensional modeling, metadata management, schema evolution, and workload management.
  • A passion for building high-quality, scalable data solutions.


Good to have:

  • Experience with stream-processing systems (e.g., Spark Streaming, Flink).
  • Working knowledge of message queuing, stream processing, and scalable data stores.
  • Proficiency in SQL and experience with NoSQL databases like Elasticsearch and Cassandra/MongoDB.


  • Experience with big data tools such as HDFS/S3, Spark/Flink, Hive, HBase, Kafka/Kinesis.

Remote only
0 - 1 yrs
₹1L - ₹1.8L / yr
.NET
SQL
SQL Server
jQuery
LINQ
+3 more

Position: .NET Core Intern (.NET Core knowledge is a must)

Education: BTech-Computer Science Only

Joining: Immediate Joiner

Work Mode: Remote

Working Days: Monday to Friday

Shift: Rotational (based on project need):

·      5:00 PM – 2:00 AM IST

·      6:00 PM – 3:00 AM IST

 

Job Summary

ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience in building scalable web applications while maintaining a healthy work-life balance through remote work opportunities.

 

Eligibility & Qualifications

  • Education:
  • B.Tech (Computer Science) / M.Tech (Computer Science)
  • Tier 1 colleges preferred
  • Experience Level: Fresher
  • Communication: Excellent English communication skills (verbal & written)

Skills Required

1. Technical Skills (Must Have)

  • Experience with .NET Core (.NET 6 / 7 / 8)
  • Strong knowledge of C#, including:
  • Object-Oriented Programming (OOP) concepts
  • async/await
  • LINQ
  • ASP.NET Core (Web API / MVC)

2. Database Skills

  • SQL Server (preferred)
  • Writing complex SQL queries, joins, and subqueries
  • Stored Procedures, Functions, and Indexes
  • Database design and performance tuning
  • Entity Framework Core
  • Migrations and transaction handling

3. Frontend Skills (Required)

  • JavaScript (ES5 / ES6+)
  • jQuery
  • DOM manipulation
  • AJAX calls
  • Event handling
  • HTML5 & CSS3
  • Client-side form validation

4. Security & Performance

  • Data validation and exception handling
  • Caching concepts (In-memory / Redis – good to have)

5. Tools & Environment

  • Visual Studio / VS Code
  • Git (GitHub / Azure DevOps)
  • Basic knowledge of server deployment

6. Good to Have (Optional)

  • Azure or AWS deployment experience
  • CI/CD pipelines
  • Docker
  • Experience with data handling

 

Work Environment & Tools

  • Comfortable working in a remote setup
  • Familiarity with collaboration and remote access tools

 

Additional Requirements (Work-from-Home Setup)

This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:

  • System: Laptop or Desktop (Windows-based)
  • Operating System: Windows
  • Screen Size: Minimum 14 inches
  • Screen Resolution: Full HD (1920 × 1080)
  • Processor: Intel i5 or higher
  • RAM: Minimum 8 GB (Mandatory)
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher

 

About ARDEM

 

ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.

 

Deqode
Posted by purvisha Bhavsar
Pune, Delhi, Kolkata, Bengaluru (Bangalore), Kochi (Cochin), Hosur, Trivandrum
7 - 9 yrs
₹5.5L - ₹20L / yr
.NET
Amazon Web Services (AWS)
C#
React.js
SQL

Job Description -

Profile: .Net Full Stack Lead

Experience Required: 7–12 Years

Location: Pune, Bangalore, Chennai, Coimbatore, Delhi, Hosur, Hyderabad, Kochi, Kolkata, Trivandrum

Work Mode: Hybrid

Shift: Normal Shift

Key Responsibilities:

  • Design, develop, and deploy scalable microservices using .NET Core and C#
  • Build and maintain serverless applications using AWS services (Lambda, SQS, SNS)
  • Develop RESTful APIs and integrate them with front-end applications
  • Work with both SQL and NoSQL databases to optimize data storage and retrieval
  • Implement Entity Framework for efficient database operations and ORM
  • Lead technical discussions and provide architectural guidance to the team
  • Write clean, maintainable, and testable code following best practices
  • Collaborate with cross-functional teams to deliver high-quality solutions
  • Participate in code reviews and mentor junior developers
  • Troubleshoot and resolve production issues in a timely manner

Required Skills & Qualifications:

  • 7–12 years of hands-on experience in .NET development
  • Strong proficiency in .NET Framework, .NET Core, and C#
  • Proven expertise with AWS services (Lambda, SQS, SNS)
  • Solid understanding of SQL and NoSQL databases (SQL Server, MongoDB, DynamoDB, etc.)
  • Experience building and deploying Microservices architecture
  • Proficiency in Entity Framework or EF Core
  • Strong knowledge of RESTful API design and development
  • Experience with React or Angular is good to have
  • Understanding of CI/CD pipelines and DevOps practices
  • Strong debugging, performance optimization, and problem-solving skills
  • Experience with design patterns, SOLID principles, and best coding practices
  • Excellent communication and team leadership skills


Truetech solutions

Agency job
via TrueTech Solutions by Meimozhi balu
Bengaluru (Bangalore), Kochi (Cochin)
4 - 15 yrs
₹10L - ₹25L / yr
.NET
ASP.NET
Amazon Web Services (AWS)
Amazon EC2
AWS Lambda
+2 more

• Minimum 4+ years of experience

• Experience in designing, developing, and maintaining backend services using C# 12 and .NET 8 or .NET 9

• Experience in building and operating cloud native and serverless applications on AWS

• Experience in developing and integrating services using AWS Lambda, API Gateway, DynamoDB, EventBridge, CloudWatch, SQS, SNS, Kinesis, Secrets Manager, S3 storage, serverless architectural models, etc.

• Experience in integrating services using AWS SDK

• Should be cognizant of the OMS paradigms including Inventory Management, Inventory publish, supply feed processing, control mechanisms, ATP publish, Order Orchestration, workflow set up and customizations, integrations with tax, AVS, payment engines, sourcing algorithms and managing reservations with back orders, schedule mechanisms, flash sales management etc.

• Should have a decent end-to-end knowledge of various Commerce subsystems, including Storefront, Core Commerce back end, Post Purchase processing, OMS, Store / Warehouse Management processes, Supply Chain and Logistics processes. This is to ascertain the candidate's know-how of the overall Retail landscape of any customer.

• Strong knowledge of querying in Oracle DB and SQL Server

• Able to read, write, and manage PL/SQL procedures in Oracle.

• Strong debugging, performance tuning, and problem-solving skills

• Experience with event-driven and microservices architectures

Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹50L - ₹70L / yr
skill iconJava
Microservices
CI/CD
MySQL
MySQL DBA
+9 more

Job Details

- Job Title: Staff Engineer

Industry: Technology

Domain - Information technology (IT)

Experience Required: 9-12 years

Employment Type: Full Time

Job Location: Bengaluru

CTC Range: Best in Industry

 

Role & Responsibilities

As a Staff Engineer at company, you will play a critical role in defining and driving our backend architecture as we scale globally. You’ll own key systems that handle high volumes of data and transactions, ensuring performance, reliability, and maintainability across distributed environments.

 

Key Responsibilities-

  • Own one or more core applications end-to-end, ensuring reliability, performance, and scalability.
  • Lead the design, architecture, and development of complex, distributed systems, frameworks, and libraries aligned with company’s technical strategy.
  • Drive engineering operational excellence by defining robust roadmaps for system reliability, observability, and performance improvements.
  • Analyze and optimize existing systems for latency, throughput, and efficiency, ensuring they perform at scale.
  • Collaborate cross-functionally with Product, Data, and Infrastructure teams to translate business requirements into technical deliverables.
  • Mentor and guide engineers, fostering a culture of technical excellence, ownership, and continuous learning.
  • Establish and uphold coding standards, conduct design and code reviews, and promote best practices across teams.
  • Stay ahead of the curve on emerging technologies, frameworks, and patterns to strengthen company’s technology foundation.
  • Contribute to hiring by identifying and attracting top-tier engineering talent.

 

Ideal Candidate

  • Strong staff engineer profile
  • Must have 9+ years in backend engineering with Java, Spring/Spring Boot, and microservices building large and scalable systems
  • Must have been SDE-3 / Tech Lead / Lead SE for at least 2.5 years
  • Strong in DSA, system design, design patterns, and problem-solving
  • Proven experience building scalable, reliable, high-performance distributed systems
  • Hands-on with SQL/NoSQL databases, REST/gRPC APIs, concurrency & async processing
  • Experience in AWS/GCP, CI/CD pipelines, and observability/monitoring
  • Excellent ability to explain complex technical concepts to varied stakeholders
  • Product companies (B2B SAAS preferred)
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in computer science from Tier 1, Tier 2 colleges


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Mumbai
2 - 6 yrs
₹2L - ₹8L / yr
Linux/Unix
Linux administration
Apache
Apache Tomcat
JBoss
+6 more

Job Title : System Support Engineer – L1

Experience : 2.5+ Years

Location : Mumbai (Powai)

Shift : Rotational


Role Summary :

Provide first-level technical and functional support for enterprise applications and infrastructure. Handle user issues, troubleshoot systems, and ensure timely resolution while following support processes.


Key Responsibilities :

  • Provide phone/email support and own user issues end-to-end.
  • Log, track, and update tickets in Jira/Freshdesk.
  • Troubleshoot Linux/UNIX systems, web servers, and databases.
  • Escalate unresolved issues and communicate during downtimes.
  • Create knowledge base articles and support documentation.


Mandatory Skills :

Linux/UNIX administration, Apache/Tomcat/JBoss, basic SQL databases (MySQL/SQL Server/Oracle), scripting knowledge, and ticketing tools experience.


Preferred :

  • Banking/Financial Services domain exposure and client-site support experience.
  • Strong communication skills, customer-focused mindset, and willingness to work in rotational shifts are essential.
Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Kochi (Cochin), Chennai, Thiruvananthapuram
5 - 7 yrs
₹19L - ₹28L / yr
skill iconJava
skill iconSpring Boot
Microservices
Architecture
Google Cloud Platform (GCP)
+22 more

Job Details

- Job Title: Lead I - Software Engineering - Java, Spring Boot, Microservices

- Industry: Global digital transformation solutions provider

- Domain - Information technology (IT)

- Experience Required: 5-7 years

- Employment Type: Full Time

- Job Location: Trivandrum, Chennai, Kochi, Thiruvananthapuram

- CTC Range: Best in Industry

 

Job Description

Job Title: Senior Java Developer

Experience: 5+ years

Job Summary:

We are looking for a Senior Java Developer with strong experience in Spring Boot and Microservices to work on high-performance applications for a leading financial services client. The ideal candidate will have deep expertise in Java backend development, cloud (preferably GCP), and strong problem-solving abilities.

 

Key Responsibilities:

• Develop and maintain Java-based microservices using Spring Boot

• Collaborate with Product Owners and teams to gather and review requirements

• Participate in design reviews, code reviews, and unit testing

• Ensure application performance, scalability, and security

• Contribute to solution architecture and design documentation

• Support Agile development processes including daily stand-ups and sprint planning

• Mentor junior developers and lead small modules or features

 

Required Skills:

• Java, Spring Boot, Microservices architecture

• GCP (or other cloud platforms like AWS)

• REST/SOAP APIs, Hibernate, SQL, Tomcat

• CI/CD tools: Jenkins, Bitbucket

• Agile methodologies (Scrum/Kanban)

• Unit testing (JUnit), debugging and troubleshooting

• Good communication and team leadership skills

 

Preferred Skills:

• Frontend familiarity (Angular, AJAX)

• Experience with API documentation tools (Swagger)

• Understanding of design patterns and UML

• Exposure to Confluence, Jira

 

Mandatory Skills Required:

Strong proficiency in Java, Spring Boot, Microservices, GCP/AWS.

Experience Required: Minimum 5+ years of relevant experience

Java/J2EE (5+ years), Spring/Spring Boot (5+ years), Microservices (5+ years), AWS/GCP/Azure (mandatory), CI/CD (Jenkins, SonarQube, Git)


 

******

Notice period - 0 to 15 days only (immediate joiners or those who can join by Feb)

Job stability is mandatory

Location: Trivandrum, Kochi, Chennai

Virtual Interview - 14th Feb 2026

Read more
CAW.Tech

at CAW.Tech

5 recruiters
Ranjana Singh
Posted by Ranjana Singh
Hyderabad
4 - 6 yrs
Best in industry
skill iconPHP
skill iconLaravel
Object Oriented Programming (OOPs)
MVC Framework
Design patterns
+4 more

We are looking for a Staff Engineer - PHP to join one of our engineering teams at our office in Hyderabad.


What would you do?

  • Design, build, and maintain backend systems and APIs from requirements to production.
  • Own feature development, bug fixes, and performance optimizations.
  • Ensure code quality, security, testing, and production readiness.
  • Collaborate with frontend, product, and QA teams for smooth delivery.
  • Diagnose and resolve production issues and drive long-term fixes.
  • Contribute to technical discussions and continuously improve engineering practices.


Who Should Apply?

  • 4–6 years of hands-on experience in backend development using PHP.
  • Strong proficiency with Laravel or similar PHP frameworks, following OOP, MVC, and design patterns.
  • Solid experience in RESTful API development and third-party integrations.
  • Strong understanding of SQL databases (MySQL/PostgreSQL); NoSQL exposure is a plus.
  • Comfortable with Git-based workflows and collaborative development.
  • Working knowledge of HTML, CSS, and JavaScript fundamentals.
  • Experience with performance optimization, security best practices, and debugging.
  • Nice to have: exposure to Docker, CI/CD pipelines, cloud platforms, and automated testing.


Read more
suntekai
Kushi A
Posted by Kushi A
Remote only
0 - 1 yrs
₹10000 - ₹12000 / mo
skill iconPython
skill iconPostgreSQL
Data Visualization
Business Intelligence (BI)
SQL
+2 more

Job Description: Data Analyst


About the Role

We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.

Key Responsibilities

  • Data Extraction & Management

  • Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets (see the sketch after this list).

  • Ensure accuracy, reliability, and consistency of data across different platforms.

  • Data Analysis & Insights

  • Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.

  • Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.

  • Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.

  • Business Intelligence & Visualization

  • Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).

  • Create visualizations that simplify complex datasets for stakeholders and management.

  • Python (Pandas)

  • Use Python (Pandas, NumPy) for advanced analytics.

  • Collaboration & Stakeholder Management

  • Work closely with product, operations, and leadership teams to provide insights that drive decision-making.

  • Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.
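To make the SQL expectations above concrete, here is a minimal sketch of the kind of PostgreSQL query implied (CTEs, a window function, and aggregations), run from Python via pandas. The orders table, its columns, and the connection string are hypothetical placeholders, not part of this posting.

```python
# Hypothetical schema: orders(customer_id, order_date, amount); connection string is a placeholder.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://analyst:password@localhost:5432/analytics")

COHORT_SQL = """
WITH first_orders AS (
    SELECT customer_id,
           date_trunc('month', MIN(order_date)) AS cohort_month
    FROM orders
    GROUP BY customer_id
),
orders_ranked AS (
    SELECT o.customer_id,
           f.cohort_month,
           date_trunc('month', o.order_date) AS activity_month,
           o.amount,
           ROW_NUMBER() OVER (PARTITION BY o.customer_id
                              ORDER BY o.order_date) AS order_seq
    FROM orders o
    JOIN first_orders f USING (customer_id)
)
SELECT cohort_month,
       activity_month,
       COUNT(DISTINCT customer_id)           AS active_customers,
       SUM(amount)                           AS revenue,
       COUNT(*) FILTER (WHERE order_seq > 1) AS repeat_orders
FROM orders_ranked
GROUP BY cohort_month, activity_month
ORDER BY cohort_month, activity_month;
"""

cohorts = pd.read_sql(COHORT_SQL, engine)
print(cohorts.head())
```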

Required Skills

  • SQL/PostgreSQL

  • Complex joins, window functions, CTEs, aggregations, query optimization.

  • Python (Pandas & Analytics)

  • Data wrangling, cleaning, transformations, exploratory data analysis (EDA); see the pandas sketch after this list.

  • Libraries: Pandas, NumPy, Matplotlib, Seaborn

  • Data Visualization & BI Tools

  • Expertise in creating dashboards and reports using Metabase or Looker.

  • Ability to translate raw data into meaningful visual insights.

  • Business Intelligence

  • Strong analytical reasoning to connect data insights with e-commerce KPIs.

  • Experience in funnel analysis, customer journey mapping, and retention analysis.

  • Analytics & E-commerce Knowledge

  • Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.

  • General Skills

  • Strong communication and presentation skills.

  • Ability to work cross-functionally in fast-paced environments.

  • Problem-solving mindset with attention to detail.
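As a companion to the SQL sketch earlier in this listing, a minimal pandas example of the cohort and retention analysis mentioned above; the tiny inline DataFrame is a stand-in for real order data.

```python
# Minimal cohort-retention sketch; customer_id / order_date columns are hypothetical.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2024-03-02", "2024-02-14"]),
})

orders["order_month"] = orders["order_date"].dt.to_period("M")
orders["cohort_month"] = orders.groupby("customer_id")["order_month"].transform("min")
# Period subtraction yields month offsets; .n extracts the integer month count
orders["months_since_first"] = (orders["order_month"] - orders["cohort_month"]).apply(lambda d: d.n)

# Distinct active customers per cohort and month offset
cohort_counts = (orders.groupby(["cohort_month", "months_since_first"])["customer_id"]
                        .nunique()
                        .unstack(fill_value=0))

# Retention = share of each cohort still active N months after the first purchase
retention = cohort_counts.div(cohort_counts[0], axis=0)
print(retention)
```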



Education: Bachelor’s degree in Data Science, Computer Science, or data processing.




Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
2 - 5 yrs
₹4L - ₹5L / yr
DevOps
Windows Azure
CI/CD
MySQL
skill iconPython
+12 more

JOB DETAILS:

* Job Title: DevOps Engineer (Azure)

* Industry: Technology

* Salary: Best in Industry

* Experience: 2-5 years

* Location: Bengaluru, Koramangala

Review Criteria

  • Strong Azure DevOps Engineer Profiles.
  • Must have minimum 2+ years of hands-on experience as an Azure DevOps Engineer with strong exposure to Azure DevOps Services (Repos, Pipelines, Boards, Artifacts).
  • Must have strong experience in designing and maintaining YAML-based CI/CD pipelines, including end-to-end automation of build, test, and deployment workflows.
  • Must have hands-on scripting and automation experience using Bash, Python, and/or PowerShell
  • Must have working knowledge of databases such as Microsoft SQL Server, PostgreSQL, or Oracle Database
  • Must have experience with monitoring, alerting, and incident management using tools like Grafana, Prometheus, Datadog, or CloudWatch, including troubleshooting and root cause analysis

 

Preferred

  • Knowledge of containerisation and orchestration tools such as Docker and Kubernetes.
  • Knowledge of Infrastructure as Code and configuration management tools such as Terraform and Ansible.
  • Preferred (Education) – BE/BTech / ME/MTech in Computer Science or related discipline

 

Role & Responsibilities

  • Build and maintain Azure DevOps YAML-based CI/CD pipelines for build, test, and deployments.
  • Manage Azure DevOps Repos, Pipelines, Boards, and Artifacts.
  • Implement Git branching strategies and automate release workflows.
  • Develop scripts using Bash, Python, or PowerShell for DevOps automation.
  • Monitor systems using Grafana, Prometheus, Datadog, or CloudWatch and handle incidents (a Python monitoring sketch follows this list).
  • Collaborate with dev and QA teams in an Agile/Scrum environment.
  • Maintain documentation, runbooks, and participate in root cause analysis.
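For illustration, a minimal Python monitoring sketch in the spirit of the responsibilities above. It assumes a reachable Prometheus server; the URL and the "targets down" check are placeholders, not part of this posting.

```python
# Hypothetical Prometheus endpoint; the check fails if any scrape target is down.
import sys
import requests

PROM_URL = "http://prometheus.internal:9090"

def instant_query(expr: str) -> list:
    """Run a PromQL instant query and return the result vector."""
    resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": expr}, timeout=10)
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(f"Prometheus query failed: {payload}")
    return payload["data"]["result"]

def main() -> int:
    down_targets = instant_query("up == 0")
    for sample in down_targets:
        labels = sample["metric"]
        print(f"DOWN: job={labels.get('job')} instance={labels.get('instance')}")
    return 1 if down_targets else 0

if __name__ == "__main__":
    sys.exit(main())
```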

 

Ideal Candidate

  • 2–5 years of experience as an Azure DevOps Engineer.
  • Strong hands-on experience with Azure DevOps CI/CD (YAML) and Git.
  • Experience with Microsoft Azure (OCI/AWS exposure is a plus).
  • Working knowledge of SQL Server, PostgreSQL, or Oracle.
  • Good scripting, troubleshooting, and communication skills.
  • Bonus: Docker, Kubernetes, Terraform, Ansible experience.
  • Comfortable with WFO (Koramangala, Bangalore).


Read more
Reliable Group

at Reliable Group

2 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
10yrs+
Upto ₹42L / yr (Varies)
skill icon.NET
.NET Compact Framework
SQL
Windows Azure
CI/CD
+5 more

Application Architect – .NET

Role Overview

We are looking for a senior, hands-on Application Architect with deep .NET experience who can fix and modernize our current systems and build a strong engineering team over time.

Important – This role is hands-on with an architectural mindset. This person should be comfortable working with legacy systems and able to make and explain tradeoffs.


Key Responsibilities

Application Architecture & Modernization

  • Own application architecture across legacy .NET Framework and modern .NET systems
  • Review the existing application and drive an incremental modernization approach alongside new feature development as the business grows.
  • Own the gradual move away from outdated patterns (Web Forms, tightly coupled MVC, legacy UI constructs)
  • Define clean API contracts between front-end and backend services
  • Identify and resolve performance bottlenecks across code and database layers
  • Improve data access patterns, caching strategies, and system responsiveness
  • Strong proponent of AI and has extensively used AI tools such as Github Copilot, Cursor, Windsurf, Codex, etc.


Backend, APIs & Integrations

  • Design scalable backend services and APIs
  • Improve how newer .NET services interact with legacy systems
  • Lead integrations with external systems, including Zoho
  • Prior experience integrating with Zoho (CRM, Finance, or other modules) is a strong value add
  • Experience designing and implementing integrations using EDI standards


Data & Schema Design

  • Review existing database schemas and core data structures
  • Redesign data models to support growth, and reporting/analytics requirements
  • Optimize SQL queries to reduce load on query execution and the DB engine


Cloud Awareness

  • Design applications with cloud deployment in mind (primarily Azure)
  • Understand how to use Azure services to improve security, scalability, and availability
  • Work with Cloud and DevOps teams to ensure application architecture aligns with cloud best practices
  • Push for CI/CD automation so that the team pushes code regularly and makes progress.


Team Leadership & Best Practices

  • Act as a technical leader and mentor for the engineering team
  • Help hire, onboard, and grow a team under this role over time.
  • Define KPIs and engineering best practices (including focus on documentation)
  • Set coding standards, architectural guidelines, and review practices
  • Improve testability and long-term health of the codebase
  • Raise the overall engineering bar through reviews, coaching, and clear standards
  • Create a culture of ownership and quality


Cross-Platform Thinking

  • Strong communicator who can convert complex tech topics into business-friendly lingo. Understands the business needs and importance of user experience
  • While .NET is the core stack, contribute to architecture decisions across platforms
  • Leverages AI tools to accelerate design, coding, reviews, and troubleshooting while maintaining high quality


Skills and Experience

  • 12+ years of hands-on experience in application development (preferably on .NET stack)
  • Experience leading technical direction while remaining hands-on
  • Deep expertise in .NET Framework (4.x) and modern .NET (.NET Core / .NET 6+)
  • Must have led a project to modernize a legacy system – preferably moving from .NET Framework to .NET Core.
  • Experience with MVC, Web Forms, and legacy UI patterns
  • Solid backend and API design experience
  • Strong understanding of database design and schema evolution
  • Understanding of Analytical systems – OLAP, Data warehousing, data lakes.
  • Strong proponent of AI and has extensively used AI tools such as Github Copilot, Cursor, Windsurf, Codex, etc.
  • Integration with Zoho would be a plus.
Read more
Cansvolution
Pooja Rawat
Posted by Pooja Rawat
Indore
2 - 5 yrs
₹5L - ₹12L / yr
skill icon.NET
skill iconAngular (2+)
skill iconReact.js
ASP.NET
SQL
+4 more

About Cansvolution

Cansvolution is a growing IT services and product-based company based in Indore, M.P. We work with clients across industries, delivering scalable web and digital solutions. Our team focuses on innovation, practical problem-solving, and building technology that creates real business impact. We offer a collaborative work culture, hands-on learning, and strong growth opportunities for our employees.


Position: .NET Developer

Experience Required: Minimum 2+ Years

Location: Indore (Work From Office)

Joining: Immediate joiners preferred


Key Responsibilities

Design, develop, and maintain web applications using .NET technologies

Work on front-end development using React JS or Angular

Build and consume RESTful APIs

Collaborate with cross-functional teams including designers and backend developers

Debug, troubleshoot, and improve application performance

Participate in code reviews and follow best development practices


Required Skills

Strong experience in ASP.NET / .NET Core

Hands-on expertise in React JS or Angular

Good understanding of HTML, CSS, JavaScript

Experience with SQL databases

Knowledge of API integration

Understanding of software development lifecycle.


Preferred Skills

Experience working in Agile environments

Knowledge of version control tools like Git

Strong analytical and problem-solving abilities

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Pune
3 - 8 yrs
₹12L - ₹25L / yr
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
Software Testing (QA)
+9 more

Job Title : QA Lead (AI/ML Products)

Employment Type : Full Time

Experience : 4 to 8 Years

Location : On-site

Mandatory Skills : Strong hands-on experience in testing AI/ML (LLM, RAG) applications with deep expertise in API testing, SQL/NoSQL database validation, and advanced backend functional testing.


Role Overview :

We are looking for an experienced QA Lead who can own end-to-end quality for AI-influenced products and backend-heavy systems. This role requires strong expertise in advanced functional testing, API validation, database verification, and AI model behavior testing in non-deterministic environments.


Key Responsibilities :

  • Define and implement comprehensive test strategies aligned with business and regulatory goals.
  • Validate AI/ML and LLM-driven applications, including RAG pipelines, hallucination checks, prompt injection scenarios, and model response validation.
  • Perform deep API testing using Postman/cURL and validate JSON/XML payloads (a Python sketch of this kind of check follows this list).
  • Execute complex SQL queries (MySQL/PostgreSQL) and work with MongoDB for backend and data integrity validation.
  • Analyze server logs and transactional flows to debug issues and ensure system reliability.
  • Conduct risk analysis and report key QA metrics such as defect leakage and release readiness.
  • Establish and refine QA processes, templates, standards, and agile testing practices.
  • Identify performance bottlenecks and basic security vulnerabilities (e.g., IDOR, data exposure).
  • Collaborate closely with developers, product managers, and domain experts to translate business requirements into testable scenarios.
  • Own feature quality independently from conception to release.
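To illustrate the kind of backend functional check described above, here is a minimal Python sketch pairing an API call with a direct database validation. The endpoint, payload shape, and table are hypothetical stand-ins, not the actual system under test.

```python
# Hypothetical service URL, DSN, and orders table; illustrative only.
import requests
import psycopg2

BASE_URL = "https://api.example.internal"
DB_DSN = "dbname=app user=qa password=secret host=db.example.internal"

def test_create_order_persists_to_db():
    # 1. Exercise the API
    resp = requests.post(f"{BASE_URL}/v1/orders",
                         json={"customer_id": 42, "amount": 199.0},
                         timeout=10)
    assert resp.status_code == 201, resp.text
    body = resp.json()
    assert body["status"] == "PENDING"

    # 2. Validate the same record directly in the database
    with psycopg2.connect(DB_DSN) as conn, conn.cursor() as cur:
        cur.execute("SELECT customer_id, amount, status FROM orders WHERE id = %s",
                    (body["id"],))
        row = cur.fetchone()
    assert row is not None, "order was not persisted"
    assert row == (42, 199.0, "PENDING")
```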

Required Skills & Experience :

  • 4+ years of hands-on experience in software testing and QA.
  • Strong understanding of testing AI/ML products, LLM validation, and non-deterministic behavior testing.
  • Expertise in API Testing, server log analysis, and backend validation.
  • Proficiency in SQL (MySQL/PostgreSQL) and MongoDB.
  • Deep knowledge of SDLC and Bug Life Cycle.
  • Strong problem-solving ability and structured approach to ambiguous scenarios.
  • Awareness of performance testing and basic security testing practices.
  • Excellent communication skills to articulate defects and QA strategies.

What We’re Looking For :

A proactive QA professional who can go beyond UI testing, understands backend systems deeply, and can confidently test modern AI-driven applications while driving quality standards across the team.

Read more
Auxo AI
Kritika Dhingra
Posted by Kritika Dhingra
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
2 - 8 yrs
₹10L - ₹30L / yr
skill iconAmazon Web Services (AWS)
Data Transformation Tool (DBT)
SQL
skill iconPython
Spark
+1 more

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.


Location : Bangalore, Hyderabad, Mumbai, and Gurgaon


Responsibilities:

· Designing, building, and operating scalable on-premises or cloud data architecture

· Analyzing business requirements and translating them into technical specifications

· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)

· Design, develop, and maintain scalable data pipelines and ETL processes (see the PySpark sketch after this list)

· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.

· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness

· Implement data governance and security best practices to ensure compliance and data integrity

· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring

· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
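For illustration, a minimal PySpark ETL sketch of the pipeline work described above; the storage paths and column names are hypothetical placeholders.

```python
# Hypothetical S3 paths and columns; read raw CSV, clean, aggregate, write parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

raw = spark.read.csv("s3a://raw-zone/orders/*.csv", header=True, inferSchema=True)

clean = (raw
         .dropDuplicates(["order_id"])
         .filter(F.col("amount") > 0)
         .withColumn("order_date", F.to_date("order_ts")))

daily_revenue = (clean
                 .groupBy("order_date")
                 .agg(F.sum("amount").alias("revenue"),
                      F.countDistinct("customer_id").alias("customers")))

daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-zone/daily_revenue/")

spark.stop()
```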


Requirements


· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines

· Experience of working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks

· Comprehensive understanding of the Snowflake and Databricks ecosystem

· Strong programming skills in languages like SQL and Python or PySpark.

· Experience with data modeling, ETL processes, and data warehousing concepts.

· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.


Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 8 yrs
₹26L - ₹35L / yr
skill iconPython
skill iconJava
SQL
FastAPI
skill iconDjango
+5 more

Review Criteria

  • Strong Senior Backend Engineer profiles
  • Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
  • Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js)); a minimal FastAPI sketch follows this list.
  • Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
  • Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
  • Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
  • Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
  • (Company) – Must have worked in product companies / startups, preferably Series A to Series D
  • (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred
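As one concrete example of the framework experience asked for above, a minimal FastAPI sketch; the billing-flavored models and the in-memory store are hypothetical and stand in for a real SQL/NoSQL layer.

```python
# Minimal FastAPI sketch; run with: uvicorn app:app --reload (assumes uvicorn is installed)
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="billing-service")

class Invoice(BaseModel):
    id: int
    customer_id: int
    amount_cents: int
    status: str = "draft"

_INVOICES: dict[int, Invoice] = {}  # stand-in for a real database

@app.post("/invoices", response_model=Invoice, status_code=201)
def create_invoice(invoice: Invoice) -> Invoice:
    if invoice.id in _INVOICES:
        raise HTTPException(status_code=409, detail="invoice already exists")
    _INVOICES[invoice.id] = invoice
    return invoice

@app.get("/invoices/{invoice_id}", response_model=Invoice)
def get_invoice(invoice_id: int) -> Invoice:
    invoice = _INVOICES.get(invoice_id)
    if invoice is None:
        raise HTTPException(status_code=404, detail="invoice not found")
    return invoice
```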

 

Role & Responsibilities

As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.

This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

 

Key Responsibilities-

  • Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
  • Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
  • Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
  • Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
  • Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
  • Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs

 

The Impact You'll Make-

  • Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
  • Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
  • Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
  • Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
  • Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.

 

 

Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Thiruvananthapuram
5 - 7 yrs
₹18L - ₹26L / yr
skill iconKotlin
skill iconJava
skill iconAmazon Web Services (AWS)
skill iconSpring Boot
Microservices
+24 more

JOB DETAILS:

* Job Title: Lead I - Software Engineering-Kotlin, Java, Spring Boot, Aws

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5 -7 years

* Location: Trivandrum, Thiruvananthapuram

Role Proficiency:

Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

 

Skill Examples:

  1. Explain and communicate the design / development to the customer
  2. Perform and evaluate test results against product specifications
  3. Break down complex problems into logical components
  4. Develop user interfaces business software components
  5. Use data models
  6. Estimate time and effort required for developing / debugging features / components
  7. Perform and evaluate test in the customer or target environment
  8. Make quick decisions on technical/project related challenges
  9. Manage a Team mentor and handle people related issues in team
  10.  Maintain high motivation levels and positive dynamics in the team.
  11.  Interface with other teams’ designers and other parallel practices
  12.  Set goals for self and team. Provide feedback to team members
  13.  Create and articulate impactful technical presentations
  14.  Follow high level of business etiquette in emails and other business communication
  15.  Drive conference calls with customers addressing customer questions
  16.   Proactively ask for and offer help
  17.  Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
  18.  Build confidence with customers by meeting the deliverables on time with quality.
  19.  Estimate the time, effort, and resources required for developing / debugging features / components
  20.  Make appropriate utilization of software / hardware.
  21.  Strong analytical and problem-solving abilities

 

Knowledge Examples:

  •     Appropriate software programs / modules
  1. Functional and technical designing
  2. Programming languages – proficient in multiple skill clusters
  3. DBMS
  4. Operating Systems and software platforms
  5. Software Development Life Cycle
  6. Agile – Scrum or Kanban Methods
  7. Integrated development environment (IDE)
  8. Rapid application development (RAD)
  9. Modelling technology and languages
  10. Interface definition languages (IDL)
  11. Knowledge of customer domain and deep understanding of sub domain where problem is solved

 

Additional Comments:

We are seeking an experienced Senior Backend Engineer with strong expertise in Kotlin and Java to join our dynamic engineering team.

The ideal candidate will have a deep understanding of backend frameworks, cloud technologies, and scalable microservices architectures, with a passion for clean code, resilience, and system observability.

You will play a critical role in designing, developing, and maintaining core backend services that power our high-availability e-commerce and promotion platforms.

 

Key Responsibilities

Design, develop, and maintain backend services using Kotlin (JVM, Coroutines, Serialization) and Java.

Build robust microservices with Spring Boot and related Spring ecosystem components (Spring Cloud, Spring Security, Spring Kafka, Spring Data).

Implement efficient serialization/deserialization using Jackson and Kotlin Serialization.

Develop, maintain, and execute automated tests using JUnit 5, Mockk, and ArchUnit to ensure code quality.

Work with Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB, and Redis for data storage and caching needs.

Deploy and manage services in an AWS environment leveraging DynamoDB, Lambdas, and IAM.

Implement CI/CD pipelines with GitLab CI to automate build, test, and deployment processes.

Containerize applications using Docker and integrate monitoring using Datadog for tracing, metrics, and dashboards.

Define and maintain infrastructure as code using Terraform for services including GitLab, Datadog, Kafka, and Optimizely.

Develop and maintain RESTful APIs with OpenAPI (Swagger) and JSON API standards.

Apply resilience patterns using Resilience4j to build fault-tolerant systems.

Adhere to architectural and design principles such as Domain-Driven Design (DDD), Object-Oriented Programming (OOP), and Contract Testing (Pact).

Collaborate with cross-functional teams in an Agile Scrum environment to deliver high-quality features.

Utilize feature flagging tools like Optimizely to enable controlled rollouts.

 

Mandatory Skills & Technologies

Languages: Kotlin (JVM, Coroutines, Serialization), Java

Frameworks: Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data)

Serialization: Jackson, Kotlin Serialization

Testing: JUnit 5, Mockk, ArchUnit

Data: Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB (NoSQL), Redis (Caching)

Cloud: AWS (DynamoDB, Lambda, IAM)

CI/CD: GitLab CI

Containers: Docker

Monitoring & Observability: Datadog (Tracing, Metrics, Dashboards, Monitors)

Infrastructure as Code: Terraform (GitLab, Datadog, Kafka, Optimizely)

API: OpenAPI (Swagger), REST API, JSON API

Resilience: Resilience4j

Architecture & Practices: Domain-Driven Design (DDD) Object-Oriented Programming (OOP) Contract Testing (Pact) Feature Flags (Optimizely)

Platforms: E-Commerce Platform (CommerceTools), Promotion Engine (Talon.One)

Methodologies: Scrum, Agile

 

Skills: Kotlin, Java, Spring Boot, Aws

 

Must-Haves

Kotlin (JVM, Coroutines, Serialization), Java, Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data), AWS (DynamoDB, Lambda, IAM), Microservices Architecture

 

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Trivandrum

Virtual Weekend Interview on 7th Feb 2026 - Saturday

Read more
Remote only
9 - 12 yrs
₹2L - ₹2.5L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
Terraform
Data Transformation Tool (DBT)
SQL
+1 more

🚀 Hiring: Associate Tech Architect / Senior Tech Specialist

🌍 Remote | Contract Opportunity

We’re looking for a seasoned tech professional who can lead the design and implementation of cloud-native data and platform solutions. This is a remote, contract-based role for someone with strong ownership and architecture experience.

🔴 Mandatory & Most Important Skill Set

Hands-on expertise in the following technologies is essential:

AWS – Cloud architecture & services

Python – Backend & data engineering

Terraform – Infrastructure as Code

Airflow – Workflow orchestration

SQL – Data processing & querying

DBT – Data transformation & modeling

💼 Key Responsibilities

  • Architect and build scalable AWS-based data platforms
  • Design and manage ETL/ELT pipelines
  • Orchestrate workflows using Airflow (see the DAG sketch after this list)
  • Implement cloud infrastructure using Terraform
  • Lead best practices in data architecture, performance, and scalability
  • Collaborate with engineering teams and provide technical leadership
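For illustration, a minimal Airflow DAG sketch of the orchestration described above; the schedule, dbt project path, and task bodies are hypothetical placeholders.

```python
# Airflow 2.x style DAG (older versions use schedule_interval instead of schedule).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_to_s3(**context):
    # Placeholder: pull from a source system and land files in S3 (e.g., via boto3)
    print("extracting source data for", context["ds"])

with DAG(
    dag_id="daily_platform_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # daily at 02:00
    catchup=False,
    tags=["data-platform"],
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)

    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/platform --target prod",
    )

    extract >> transform
```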

🎯 Ideal Profile

✔ Strong experience in cloud and data platform architecture

✔ Ability to take end-to-end technical ownership

✔ Comfortable working in a remote, distributed team environment

📄 Role Type: Contract

🌍 Work Mode: 100% Remote

If you have deep expertise in these core technologies and are ready to take on a high-impact architecture role, we’d love to hear from you.


Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Kochi (Cochin)
4 - 6 yrs
₹11L - ₹17L / yr
Windows Azure
skill iconPython
SQL Azure
databricks
PySpark
+15 more

JOB DETAILS:

* Job Title: Associate III - Azure Data Engineer 

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 4 -6 years

* Location: Trivandrum, Kochi

Job Description: Azure Data Engineer (4–6 Years Experience)

Job Type: Full-time 

Locations: Kochi, Trivandrum

 

Must-Have Skills

Azure & Data Engineering

  • Azure Data Factory (ADF)
  • Azure Databricks (PySpark)
  • Azure Synapse Analytics
  • Azure Data Lake Storage Gen2
  • Azure SQL Database

 

Programming & Querying

  • Python (PySpark)
  • SQL / Spark SQL (illustrated in the sketch below)
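For illustration, a small PySpark sketch of loading a star-schema fact table from the data lake, as it might look on Databricks; the storage paths, table names, and columns are hypothetical examples.

```python
# Hypothetical ADLS paths and table names; joins a raw extract to a dimension and writes Delta.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

orders = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/orders/")
dim_customer = spark.table("gold.dim_customer")

fact_sales = (orders
              .join(dim_customer, on="customer_id", how="left")
              .select("order_id",
                      "customer_sk",  # surrogate key from the dimension
                      F.to_date("order_ts").alias("order_date"),
                      F.col("amount").cast("decimal(18,2)").alias("amount")))

# Managed Delta table for downstream Synapse / Power BI consumption
(fact_sales.write
           .format("delta")
           .mode("overwrite")
           .saveAsTable("gold.fact_sales"))
```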

 

Data Modelling

  • Star & Snowflake schema
  • Dimensional modelling

 

Source Systems

  • SQL Server
  • Oracle
  • SAP
  • REST APIs
  • Flat files (CSV, JSON, XML)

 

CI/CD & Version Control

  • Git
  • Azure DevOps / GitHub Actions

 

Monitoring & Scheduling

  • ADF triggers
  • Databricks jobs
  • Log Analytics

 

Security

  • Managed Identity
  • Azure Key Vault
  • Azure RBAC / Access Control

 

Soft Skills

  • Strong analytical & problem-solving skills
  • Good communication and collaboration
  • Ability to work in Agile/Scrum environments
  • Self-driven and proactive

 

Good-to-Have Skills

  • Power BI basics
  • Delta Live Tables
  • Synapse Pipelines
  • Real-time processing (Event Hub / Stream Analytics)
  • Infrastructure as Code (Terraform / ARM templates)
  • Data governance tools like Azure Purview
  • Azure Data Engineer Associate (DP-203) certification

 

Educational Qualifications

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.

 

Skills: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake Storage

 

Must-Haves

Azure Data Factory (4-6 years), Azure Databricks/PySpark (4-6 years), Azure Synapse Analytics (4-6 years), SQL/Spark SQL (4-6 years), Git/Azure DevOps (4-6 years)

Skills: Azure, Azure Data Factory, Python, PySpark, SQL, REST API, Azure DevOps

Relevant 4 - 6 Years

Python is mandatory.

 

******

Notice period - 0 to 15 days only (Feb joiners’ profiles only)

Location: Kochi

F2F Interview 7th Feb

Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Kochi (Cochin), Trivandrum
4 - 6 yrs
₹11L - ₹17L / yr
skill iconAmazon Web Services (AWS)
skill iconPython
Data engineering
SQL
ETL
+22 more

JOB DETAILS:

* Job Title: Associate III - Data Engineering

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 4-6 years

* Location: Trivandrum, Kochi

Job Description

Job Title:

Data Services Engineer – AWS & Snowflake

 

Job Summary:

As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.

You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.

Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.

 

Responsibilities:

• Design and implement scalable and secure data pipelines on AWS and Snowflake (Star/Snowflake schema)

• Optimize query performance using clustering keys, materialized views, and caching

• Develop and maintain Snowflake data warehouses and data marts.

• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks).

• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica)

• Utilize Snowpark and Python/Java for complex transformations (see the Snowpark sketch after this list)

• Implement RBAC, data masking, and row-level security.

• Optimize data storage and retrieval for performance and cost-efficiency.

• Collaborate with stakeholders to gather data requirements and deliver solutions.

• Ensure data quality, governance, and compliance with industry standards.

• Monitor, troubleshoot, and resolve data pipeline and performance issues.

• Document data architecture, processes, and best practices.

• Support data migration and integration from various sources.
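To illustrate the Snowpark transformation step mentioned above, a minimal Python sketch; the connection parameters, database, and table names are placeholders, not real credentials or schemas.

```python
# Hypothetical Snowflake account/tables; aggregates raw orders into a customer mart table.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "my_account",
    "user": "etl_user",
    "password": "********",
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}

session = Session.builder.configs(connection_parameters).create()

orders = session.table("RAW.SALES.ORDERS")

customer_totals = (orders
                   .filter(col("AMOUNT") > 0)
                   .group_by("CUSTOMER_ID")
                   .agg(sum_("AMOUNT").alias("LIFETIME_VALUE")))

# Materialize the result for downstream marts / BI tools
customer_totals.write.mode("overwrite").save_as_table("ANALYTICS.PUBLIC.CUSTOMER_TOTALS")
session.close()
```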

 

Qualifications:

• Bachelor’s degree in Computer Science, Information Technology, or a related field.

• 3 to 4 years of hands-on experience in data engineering or data services.

• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).

• Strong expertise in Snowflake architecture, development, and optimization.

• Proficiency in SQL and Python for data manipulation and scripting.

• Solid understanding of ETL/ELT processes and data modeling.

• Experience with data integration tools and orchestration frameworks.

• Excellent analytical, problem-solving, and communication skills.

 

Preferred Skills:

• AWS Glue, AWS Lambda, Amazon Redshift

• Snowflake Data Warehouse

• SQL & Python

 

Skills: Aws Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse

 

Must-Haves

AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)

Skills: AWS, AWS lambda, Snowflake, Data engineering, Snowpipe, Data integration tools, orchestration framework

Relevant 4 - 6 Years

Python is mandatory.

 

******

Notice period - 0 to 15 days only (Feb joiners’ profiles only)

Location: Kochi

F2F Interview 7th Feb

 

 

Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
6 - 9 yrs
₹13L - ₹22L / yr
Web API
skill iconC#
skill icon.NET
skill iconAmazon Web Services (AWS)
Agile/Scrum
+19 more

JOB DETAILS:

* Job Title: Lead I - Web API, C# .NET, .NET Core, AWS (Mandatory)

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 6 -9 years

* Location: Hyderabad

Job Description

Role Overview

We are looking for a highly skilled Senior .NET Developer who has strong experience in building scalable, high‑performance backend services using .NET Core and C#, with hands‑on expertise in AWS cloud services. The ideal candidate should be capable of working in an Agile environment, collaborating with cross‑functional teams, and contributing to both design and development. Experience with React and Datadog monitoring tools will be an added advantage.

 

Key Responsibilities

  • Design, develop, and maintain backend services and APIs using .NET Core and C#.
  • Work with AWS services (Lambda, S3, ECS/EKS, API Gateway, RDS, etc.) to build cloud‑native applications.
  • Collaborate with architects and senior engineers on solution design and implementation.
  • Write clean, scalable, and well‑documented code.
  • Use Postman to build and test RESTful APIs.
  • Participate in code reviews and provide technical guidance to junior developers.
  • Troubleshoot and optimize application performance.
  • Work closely with QA, DevOps, and Product teams in an Agile setup.
  • (Optional) Contribute to frontend development using React.
  • (Optional) Use Datadog for monitoring, logging, and performance metrics.

 

Required Skills & Experience

  • 6+ years of experience in backend development.
  • Strong proficiency in C# and .NET Core.
  • Experience building RESTful services and microservices.
  • Hands‑on experience with AWS cloud platform.
  • Solid understanding of API testing using Postman.
  • Knowledge of relational databases (SQL Server, PostgreSQL, etc.).
  • Strong problem‑solving and debugging skills.
  • Experience working in Agile/Scrum teams.

 

Good to Have

  • Experience with React for frontend development.
  • Exposure to Datadog for monitoring and logging.
  • Knowledge of CI/CD tools (GitHub Actions, Jenkins, AWS CodePipeline, etc.).
  • Containerization experience (Docker, Kubernetes).

 

Soft Skills

  • Strong communication and collaboration abilities.
  • Ability to work in a fast‑paced environment.
  • Ownership mindset with a focus on delivering high‑quality solutions.

 

Skills

.NET Core, C#, AWS, Postman

 

Notice period - 0 to 15 days only

Location: Hyderabad

Virtual Interview: 7th Feb 2026

First round will be Virtual

2nd round will be F2F

Read more
Dhwani Rural Information Systems
Swati Yadav
Posted by Swati Yadav
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
0 - 3 yrs
₹2L - ₹6L / yr
skill iconPython
Data Structures
SQL
Object Oriented Programming (OOPs)
skill iconGit
+5 more

Position Overview

We are seeking an experienced ERPNext/Frappe Developer to join our dynamic team at Dhwani. The ideal candidate will have strong expertise in developing, customizing, and maintaining ERPNext applications built on the Frappe Framework. This role involves working on complex business solutions, custom module development, and ensuring seamless integration with various business processes.


Key Responsibilities

Development & Customization

  1. Design, develop, and implement custom applications and modules on the Frappe Framework and ERPNext.
  2. Customize existing ERPNext modules (Accounting, CRM, HR, Inventory, Manufacturing, etc.) to meet specific business requirements.
  3. Build custom DocTypes, forms, reports, dashboards, and print formats.
  4. Develop and maintain REST APIs for system integrations.
  5. Write clean, efficient, and well-documented code in Python and JavaScript.

Technical Implementation

  1. Understand client requirements for ERPNext and suggest optimal technical solutions
  2. Handle all aspects of development including server-side, API, and client-side logic
  3. Implement business logic using Frappe's document lifecycle hooks and controllers (see the sketch after this list)
  4. Develop custom web portals, web pages, and web forms
  5. Ensure smooth transitions for customizations during Frappe/ERPNext upgrades
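For illustration, a minimal Frappe sketch showing a DocType controller hook and a whitelisted API method; the "Support Ticket" DocType and its fields are hypothetical examples, not an existing module.

```python
# Hypothetical custom DocType "Support Ticket" with fields customer, priority, assigned_to.
import frappe
from frappe.model.document import Document


class SupportTicket(Document):
    def validate(self):
        # Document lifecycle hook: runs on every save, before writing to the database
        if not self.customer:
            frappe.throw("Customer is mandatory")
        if self.priority == "High" and not self.assigned_to:
            frappe.throw("High-priority tickets must be assigned")


@frappe.whitelist()
def open_tickets(customer: str):
    """REST-callable helper, e.g. /api/method/<app>.api.open_tickets?customer=..."""
    return frappe.get_all(
        "Support Ticket",
        filters={"customer": customer, "status": "Open"},
        fields=["name", "subject", "priority"],
    )
```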

System Management

  1. Manage ERPNext installations, configurations, and deployments
  2. Perform system updates, upgrades, and maintenance
  3. Debug and troubleshoot technical issues, providing timely solutions
  4. Work with MariaDB/MySQL databases and write complex queries
  5. Implement and manage version control using Git

Collaboration & Documentation

  1. Collaborate with business analysts and stakeholders to gather and refine requirements
  2. Write functional and development specifications
  3. Participate in code reviews and contribute to development best practices
  4. Provide technical guidance and support to junior developers


Required Qualifications

Experience

  1. Minimum 2-4 years of hands-on experience with Frappe Framework and ERPNext development and customizations
  2. Proven track record of delivering live ERPNext projects that can be showcased
  3. Experience in customizing ERPNext modules across different business domains
  4. We are also open to hiring Interns (with PPO opportunity) who demonstrate strong DSA and coding fundamentals, a good understanding of Python programming, knowledge of and exposure to MySQL databases, strong logical thinking and problem-solving skills, along with an interest in working on the Frappe framework and enthusiasm to build challenging technology solutions for social impact. High-performing interns will receive a Pre-Placement Offer (PPO) based on performance. The internship will be for 3 months with a monthly stipend between 15k-20k based on interview performance.


Technical Skills

Core Technologies:  

  1. Strong proficiency in Python programming
  2. Solid experience with JavaScript, HTML, CSS
  3. Working knowledge of Jinja templating.
  4. Experience with jQuery and Bootstrap framework

Frappe/ERPNext Expertise:  

  1. Deep understanding of Frappe Framework architecture.
  2. Experience with DocType creation, customization, and management.
  3. Knowledge of Frappe's ORM, REST API capabilities, and hooks system.
  4. Understanding of ERPNext modules and business workflows

Database & Infrastructure:  

  1. Proficient in MariaDB/MySQL database management.
  2. Experience with Linux operating systems.
  3. Knowledge of Git version control.
  4. Understanding of web server configurations and deployment.

Professional Skills

  1. Strong analytical and problem-solving abilities
  2. Excellent communication and collaboration skills
  3. Ability to work effectively in team environments
  4. Self-starter with ability to take ownership of projects
  5. Attention to detail and commitment to quality code


This is a work-from-office role in Gurgaon, Haryana

Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹27L - ₹40L / yr
skill iconPython
skill iconJava
SQL

Strong Senior Backend Engineer profiles

Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems

Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js)).

Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework

Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization

Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices

Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)

Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D

Mandatory (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred

Read more
Service Co

Service Co

Agency job
via Vikash Technologies by Rishika Teja
Pune
4 - 8 yrs
₹15L - ₹25L / yr
skill iconJava
SQL
skill iconReact.js
skill iconAmazon Web Services (AWS)

Proficiency in Java 8+.


Solid understanding of REST APIs (Spring Boot), microservices, databases (SQL/NoSQL), and caching systems like Redis/Aerospike.


Familiarity with cloud platforms (AWS, GCP, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).


Good understanding of data structures, algorithms, and software design principles.

Read more
Estuate Software

at Estuate Software

1 candid answer
Ariba Khan
Posted by Ariba Khan
Remote, Bengaluru (Bangalore)
8 - 12 yrs
Upto ₹30L / yr (Varies)
SQL
confluence
Business Analysis
User Research

About the company:

At Estuate, more than 400 uniquely talented people work together, to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.

Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customer’s needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.


Our key to success:

We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, as well as having over 2,000 projects in our portfolio.

Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.


Our culture of oneness

At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.


Requirements:

Technical skills

  • 8+ years of experience in a Business, System, or Functional Analyst role;
  • Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
  • Experience of working with Restful APIs (writing requirements, API usage);
  • Experience in Microservices architecture;
  • Experience of working with Agile methodologies (Scrum, Kanban);
  • Knowledge of SQL;
  • Knowledge of UML, BPMN;
  • Understanding of key UX/UI practices and processes;
  • Understanding of software development lifecycle;
  • Understanding of architecture of WEB-based application;
  • English Upper-Intermediate or higher.

 

Soft Skills

  • Excellent communication and presentation skills;
  • Proactiveness;
  • Organized, detail-oriented with ability to keep overall solution in mind;
  • Comfort working in a fast-paced environment, running concurrent projects and managing BA work with multiple stakeholders;
  • Good time-management skills, ability to handle multitasking activities.


Good to haves

  • Experience in enterprise software development or finance domain;
  • Experience in delivery of desktop and web-applications;
  • Experience of successful system integration project.

 

Responsibilities:

  • Participation in discovery phases and workshops with Customer, covering key business and product requirements;
  • Manage project scope, requirements management and their impact on existing requirements, defining dependencies on other teams;
  • Creating business requirements, user stories, mockups, functional specifications and technical requirements (incl. flow diagrams, data mappings, examples);
  • Close collaboration with development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with Tech Lead, etc.);
  • Regular communication with internal (Product, Account management, Business teams) and external stakeholders (Partners, Customers);
  • Preparing UAT scenarios, validation cases;
  • User Acceptance Testing;
  • Demo for internal stakeholders;
  • Creating documentation (user guides, technical guides, presentations).

Project Description:

Wireless Standard POS (Point-Of-Sales) is our retail management solution for the Telecom Market.

It provides thousands of retailers with the features and functionalities they need to run their businesses effectively with full visibility and control into every aspect of sales and operations. It is simple to learn, easy to use, and as operations grow, more features can be added on.


Our system can optimize and simplify all processes related to retail in this business area.

Few things that our product can do:

  • Robust Online Reporting
  • Repair Management Software
  • 3rd Party Integrations
  • Customer Connect Marketing
  • Time and Attendance
  • Carrier Commission Reconciliation

 

 As a Business Analyst/ System Analyst, you will be the liaison between the lines of business and the Development team, have the opportunity to work on a very complex product with microservice architecture (50+ for now) and communicate with Product, QA, Developers, Architecture and Customer Support teams to help improve product quality.


Read more
A real time Customer Data Platform and cross channel marketing automation delivers superior experiences that result in an increased revenue for some of the largest enterprises in the world.

A real time Customer Data Platform and cross channel marketing automation delivers superior experiences that result in an increased revenue for some of the largest enterprises in the world.

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
4 - 6 yrs
₹25L - ₹40L / yr
Java
Apache Kafka
Spring Boot
SQL
NoSQL Databases
+4 more

Key Responsibilities:

  • Design, build, and own large-scale, distributed backend and platform systems.
  • Drive architectural decisions for high-throughput, low-latency services with strong scalability and reliability guarantees.
  • Build and evolve core components of a real-time Customer Data Platform, especially around data ingestion, streaming, and processing.
  • Evaluate and adopt open-source and emerging platform technologies; build prototypes where needed.
  • Own critical subsystems end-to-end, ensuring performance, maintainability, and operational excellence.
  • Mentor junior engineers and uphold high standards through code and design reviews.
  • Effectively use modern AI-assisted coding tools to accelerate development while maintaining engineering rigor.

Qualifications:

  • 4–6 years of strong backend/platform engineering experience with solid fundamentals in algorithms, data structures, and optimizations.
  • Proven experience designing and operating production-grade distributed systems.
  • B.E / B.Tech / M.Tech / M.S / MCA in Computer Science or equivalent from premier institutes.

Technical Skills:

  • Strong system and object-oriented design skills.
  • Hands-on experience with SQL and NoSQL databases.
  • Strong working knowledge of Kafka and streaming systems (see the sketch after this list).
  • Proficiency in Java, concurrency, and unit/integration testing.
  • Familiarity with cloud-native environments (Kubernetes, CI/CD, observability).
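
For orientation only, here is a minimal sketch of the kind of streaming ingestion the Kafka item above points at. The role itself centres on Java and Spring Boot; this Python version uses the confluent-kafka client, and the broker address, topic name, and event fields are hypothetical.

```python
# Minimal streaming-ingestion sketch (illustrative only; broker, topic, and fields are hypothetical).
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed local broker
    "group.id": "cdp-ingestion",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customer-events"])      # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)             # wait up to 1 s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # In a real CDP pipeline this is where identity resolution, enrichment,
        # and publishing to downstream processors would happen.
        print(event.get("user_id"), event.get("event_type"))
finally:
    consumer.close()
```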

AI-Assisted Engineering:

  • Hands-on experience using modern AI coding platforms such as Opencode, Claude Code, Codex, Cursor, Windsurf, or similar.
  • Ability to use AI tools for code generation, refactoring, testing, debugging, and design exploration responsibly.

Soft Skills & Nice to Have:

  • Strong ownership mindset and ability to deliver in fast-paced environments.
  • Clear written and verbal communication skills.
  • Startup experience is a plus.
Read more
LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Remote only
8 - 12 yrs
₹10L - ₹18L / yr
Windows Azure
SQL
TSQL
Powershell

Key Responsibilities

  • Architectural Leadership: Design end-to-end agentic frameworks using UiPath Agent Builder and Studio Web, moving processes away from rigid "if-then" logic to goal-oriented AI agents.
  • Agentic UI Integration: Lead the implementation of UiPath Autopilot and Agentic Orchestration to handle dynamic UI changes, unstructured data, and complex human-in-the-loop escalations.
  • Advanced AI Implementation: Deploy and fine-tune models within the UiPath AI Trust Layer, ensuring secure and governed use of LLMs (GPT-4, Claude, etc.) for real-time UI reasoning.
  • Infrastructure & Governance: Define the "Agentic Operating Model," establishing guardrails for autonomous agent behavior, security protocols, and scalability within UiPath Orchestrator.
  • Stakeholder Strategy: Partner with C-suite stakeholders to identify "Agent-First" opportunities that provide 10x ROI over traditional RPA.
  • Mentorship: Lead a CoE (Center of Excellence), upskilling senior developers in prompt engineering, context grounding, and semantic automation.

Required Technical Skills

  • Core Platform: Expert-level mastery of UiPath Studio, Orchestrator, and Cloud.
  • Agentic Specialization: Proven experience with UiPath Agent Builder, Integration Service, and Document Understanding.
  • AI/ML Integration: Deep understanding of AI Center, Vector Databases, and RAG (Retrieval-Augmented Generation) to provide agents with business context.
  • Programming: Proficiency in .NET (C#) and Python for custom activity development and AI model interfacing.
  • UI Automation: Expert knowledge of modern UI descriptors, Computer Vision, and handling "tricky" environments (Citrix, legacy SAP, mainframe).

Qualifications

  • Experience: 10+ years in Software Development/Automation, with at least 5 years specifically in UiPath Architecture.
  • Education: Bachelor’s or Master’s in Computer Science, AI, or a related field.
  • Certifications: UiPath Advanced Developer (ARD) and UiPath Solution Architect certifications are mandatory. Certifications in AI/ML (Azure AI, AWS Machine Learning) are a significant plus.
  • Mindset: A "fail-forward" approach to innovation, with the ability to prototype agentic solutions in fast-paced environments.


Read more
Blockify
Dhanur Sehgal
Posted by Dhanur Sehgal
Remote only
3 - 8 yrs
₹6L - ₹12L / yr
Go Programming (Golang)
Python
Scalability
Infrastructure architecture
SQL
+6 more

We’re hiring a remote, contract-based Backend & Infrastructure Engineer who can build and run production systems end-to-end.

You will build and scale high-throughput backend services in Golang and Python, operate ClickHouse-powered analytics at scale, manage Linux servers for maximum uptime, scalability, and reliability, and drive cost efficiency as a core engineering discipline across the entire stack.



What You Will Do:


Backend Development (Golang & Python)

  • Design and maintain high-throughput RESTful/gRPC APIs — primarily Golang, Python for tooling and supporting services
  • Architect for horizontal scalability, fault tolerance, and low-latency at scale
  • Implement caching (Redis/Memcached), rate limiting, efficient serialization, and CI/CD pipelines
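
Since the role explicitly calls for caching and rate limiting, here is a minimal sketch of one common approach: a cache-aside read plus a fixed-window limiter on Redis. The production services are primarily Golang; this Python version (using redis-py) is illustrative only, and the key names, limits, and placeholder profile lookup are assumptions.

```python
# Illustrative caching + fixed-window rate limiting with Redis (key names and limits are hypothetical).
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def rate_limited(client_id: str, limit: int = 100, window_s: int = 60) -> bool:
    """Return True if client_id has exceeded `limit` requests in the current window."""
    key = f"rl:{client_id}"
    count = r.incr(key)                 # atomic counter per client
    if count == 1:
        r.expire(key, window_s)         # start the window on the first hit
    return count > limit

def get_profile(user_id: str) -> dict:
    """Cache-aside read: try Redis first, fall back to a (placeholder) source of truth."""
    key = f"profile:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    profile = {"user_id": user_id, "plan": "pro"}   # stand-in for a real DB call
    r.setex(key, 300, json.dumps(profile))          # cache for 5 minutes
    return profile

if __name__ == "__main__":
    if not rate_limited("client-42"):
        print(get_profile("user-1"))
```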

Scalable Architecture & System Design

  • Design and evolve distributed, resilient backend architecture that scales without proportional cost increase
  • Make deliberate trade-offs (CAP, cost vs. performance) and design multi-region HA with automated failover

ClickHouse & Analytical Data Infrastructure

  • Deploy, tune, and operate ClickHouse clusters for real-time analytics and high-cardinality OLAP workloads
  • Design optimal table engines, partition strategies, materialized views, and query patterns (see the sketch after this list)
  • Manage cluster scaling, replication, schema migrations, and upstream/downstream integrations
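
A minimal sketch, assuming a ClickHouse server reachable through the clickhouse-driver Python client, of the table-engine, partitioning, and materialized-view decisions mentioned above; the table names, columns, and partition keys are hypothetical rather than a prescribed schema.

```python
# Illustrative ClickHouse DDL: a raw events table plus a daily rollup (schema is hypothetical).
from clickhouse_driver import Client

client = Client(host="localhost")  # assumes a local ClickHouse server

client.execute("""
CREATE TABLE IF NOT EXISTS events (
    event_time DateTime,
    user_id    UInt64,
    event_type LowCardinality(String),
    value      Float64
)
ENGINE = MergeTree
PARTITION BY toYYYYMM(event_time)      -- monthly partitions keep merges and drops cheap
ORDER BY (event_type, user_id, event_time)
""")

client.execute("""
CREATE MATERIALIZED VIEW IF NOT EXISTS events_daily
ENGINE = SummingMergeTree
PARTITION BY toYYYYMM(day)
ORDER BY (event_type, day)
AS
SELECT toDate(event_time) AS day, event_type, count() AS events, sum(value) AS total_value
FROM events
GROUP BY day, event_type
""")
```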

Cost Efficiency & Cost Optimization

  • Own cost optimization end-to-end: right-sizing, reserved/spot capacity, storage tiering, query optimization, compression, batching
  • Build cost dashboards, budgets, and alerts; drive a culture of cost-aware engineering

Linux Server Management & Infrastructure

  • Administer and harden Linux servers (Ubuntu, Debian, CentOS/RHEL) — patching, security, SSH, firewalls
  • Manage VPS/bare-metal provisioning, capacity planning, and containerized workloads (Docker, Kubernetes/Nomad)
  • Implement Infrastructure-as-Code (Terraform/Pulumi); optionally manage AWS/GCP as needed

Data, Storage & Scheduling

  • Optimize SQL schemas and queries (PostgreSQL, MySQL); manage data archival, cold storage, and lifecycle policies (an archival sketch follows this list)
  • Build and maintain cron jobs, scheduled tasks, and batch processing systems
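
As an illustration of the archival and batch-processing work above (not a prescribed design), here is a sketch of a chunked Postgres archival job that a cron entry could run nightly; the connection string, table names, and retention window are hypothetical, and psycopg2 is assumed.

```python
# Illustrative nightly archival batch for Postgres: move rows older than the retention
# window into an archive table in bounded chunks (names and settings are hypothetical).
import psycopg2

RETENTION_DAYS = 90
BATCH_SIZE = 10_000

conn = psycopg2.connect("dbname=app user=app")  # assumed connection string
cur = conn.cursor()

while True:
    cur.execute(
        """
        WITH batch AS (
            SELECT id FROM events
            WHERE created_at < now() - %s * interval '1 day'
            LIMIT %s
        ),
        moved AS (
            DELETE FROM events
            WHERE id IN (SELECT id FROM batch)
            RETURNING *
        )
        INSERT INTO events_archive SELECT * FROM moved
        """,
        (RETENTION_DAYS, BATCH_SIZE),
    )
    conn.commit()                 # commit each chunk to keep transactions short
    if cur.rowcount == 0:         # nothing left to archive
        break

cur.close()
conn.close()
```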

Uptime, Reliability & Observability

  • Own system uptime: zero-downtime deployments, health checks, self-healing infra, SLOs/SLIs
  • Build observability stacks (Prometheus, Grafana, Datadog, OpenTelemetry); structured logging, distributed tracing, alerting
  • Drive incident response, root cause analysis, and post-mortems


Required Qualifications:


Must-Have (Critical)

  • Deep proficiency in Golang (primary) and Python
  • Proven ability to design and build scalable, distributed architectures
  • Production experience deploying and operating ClickHouse at scale
  • Track record of driving measurable cost efficiency and cost optimization
  • 5+ years in backend engineering and infrastructure roles

Also Required

  • Strong Linux server administration (Ubuntu, Debian, CentOS/RHEL) — comfortable living in the terminal
  • Proven uptime and reliability track record across production infrastructure
  • Strong SQL (PostgreSQL, MySQL); experience with high-throughput APIs (10K+ RPS)
  • VPS/bare-metal provisioning, Docker, Kubernetes/Nomad, IaC (Terraform/Pulumi)
  • Observability tooling (Prometheus, Grafana, Datadog, OpenTelemetry)
  • Cron jobs, batch processing, data archival, cold storage management
  • Networking fundamentals (DNS, TCP/IP, load balancing, TLS)


Nice to Have

  • AWS, GCP, or other major cloud provider experience
  • Message queues / event streaming (Kafka, RabbitMQ, SQS/SNS)
  • Data pipelines (Airflow, dbt); FinOps practices
  • Open-source contributions; compliance background (SOC 2, HIPAA, GDPR)


What We Offer

  • Remote, contractual role
  • Flexible time zones (overlap for standups + incident coverage)
  • Competitive contract compensation + equity
  • Long-term engagement opportunity based on performance
Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Remote only
2 - 4 yrs
₹3L - ₹5L / yr
PowerBI
Data modeling
ETL
Spark
SQL
+1 more

Microsoft Fabric, Power BI, Data modelling, ETL, Spark SQL

Remote work: 5–7 hours

Rate: ₹450 per hour

Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
8 - 10 yrs
Best in industry
Technical support
SQL
Apache
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
+2 more

What You’ll Do:

We are looking for a Staff Operations Engineer based in Pune, India, who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how our clients analyze health media. This role requires an engineer who understands not only DBA functions but also how they affect research objectives, and who can work with researchers and data scientists to achieve impactful results.

This role sits within the Engineering Operations team and requires close integration and partnership with the Engineering organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges, constantly seeks to improve the facets of the business they manage, and can collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams.
  • Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
  • Optimize queries and data access efficiency, and serve as an expert in how to most efficiently attain desired data points.
  • Build “mastered” versions of the data for Analytics-specific querying use cases (see the sketch after this list).
  • Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent.
  • Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
  • Implement DataOps practices.
  • Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation.
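
By way of illustration only (this is not DeepIntent’s actual data model), here is a sketch of materialising a “mastered”, analytics-friendly rollup with a SQL job on BigQuery, which the posting lists alongside GCP; the dataset, table, and column names are hypothetical, and the google-cloud-bigquery client is assumed.

```python
# Illustrative sketch: materialize a pre-aggregated table for analyst queries (names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are configured

sql = """
CREATE OR REPLACE TABLE analytics.campaign_daily AS
SELECT
  campaign_id,
  DATE(impression_ts)      AS day,
  COUNT(*)                 AS impressions,
  COUNT(DISTINCT user_id)  AS reach
FROM raw.impressions
GROUP BY campaign_id, day
"""

job = client.query(sql)   # runs as a standard SQL job
job.result()              # block until the table is (re)built
print("analytics.campaign_daily refreshed")
```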

Who You Are:

  • 8+ years of experience in technical support, specialised in monitoring and maintaining data pipelines.
  • Adept in market research methodologies and using data to deliver representative insights.
  • Inquisitive and curious; understands how to query complicated data sets and how to move and combine data between databases.
  • Deep SQL experience is a must.
  • Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
  • English Language Fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
  • Experience working with public clouds like GCP/AWS.
  • Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies.
  • Proficient with SQL, Python or JVM-based language, Bash.
  • Experience with Apache open-source projects such as Spark, Druid, Beam, or Airflow, and with big data databases like BigQuery, ClickHouse, etc.
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
  • Experience debugging UI and backend issues is an added advantage.


Read more
Ekloud INC
ashwini rathod
Posted by ashwini rathod
India
6 - 20 yrs
₹5L - ₹30L / yr
ADF
databricks
PySpark
SQL
Python
+2 more

Hiring: Azure Data Engineer


Experience level: 5–12 yrs

Location: Bangalore

Work arrangement: On-site

Budget Range: Flexible


Mandatory skills (self-rating of 7+ is a must):

ADF, Databricks, PySpark, SQL

Good to have:

Delta Live Tables, Python, team handling (manager with 7+ years of experience), Azure Functions, Unity Catalog, real-time streaming, data pipelines

Read more
The Blue Owls Solutions

at The Blue Owls Solutions

2 candid answers
Apoorvo Chakraborty
Posted by Apoorvo Chakraborty
Pune
6 - 10 yrs
₹20L - ₹30L / yr
Data governance
Data engineering
Team leadership
Data modeling
Synapse
+3 more

The Role


We are looking for a Senior/Lead Azure Data Engineer to join our team in Pune. You will be responsible for the end-to-end lifecycle of data solutions, from initial client requirement gathering and solution architecture design to leading the data engineering team through implementation. You will be the technical anchor for the project, ensuring that our data estates are scalable, governed, and high-performing.


Key Responsibilities

  • Architecture & Design: Design robust data architectures using Microsoft Fabric and Azure Synapse, focusing on Medallion architecture and metadata-driven frameworks (see the sketch after this list).
  • End-to-End Delivery: Translate complex client business requirements into technical roadmaps and lead the team to deliver them on time.
  • Data Governance: Implement and manage enterprise-grade governance, data discovery, and lineage using Microsoft Purview.
  • Team Leadership: Act as the technical lead for the team, performing code reviews, mentoring junior engineers, and ensuring best practices in PySpark and SQL.
  • Client Management: Interface directly with stakeholders to define project scope and provide technical consultancy.
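
For orientation only, here is a minimal PySpark sketch of a bronze-to-silver hop in a Medallion layout like the one described above; the lakehouse paths, column names, and deduplication key are hypothetical, and Delta-format tables are assumed.

```python
# Illustrative bronze -> silver promotion in a Medallion lakehouse (paths and columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.format("delta").load("Tables/bronze_orders")    # raw, append-only landing data

silver = (
    bronze
    .dropDuplicates(["order_id"])                         # enforce one row per business key
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # standardize types
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())                # basic quality gate
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .save("Tables/silver_orders"))
```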


What We’re Looking For

  • 6+ years in Data Engineering, with at least 3 years leading technical teams or designing architectures.
  • Expertise in Microsoft Fabric/Synapse: Deep experience with Lakehouses, Warehouses, and Spark-based processing.
  • Governance Specialist: Proven experience implementing Microsoft Purview for data cataloging, sensitivity labeling, and lineage.
  • Technical Breadth: Strong proficiency in PySpark, SQL, and Data Factory. Familiarity with Infrastructure as Code (Bicep/Terraform) is a major plus.

Why Work with Us?

  • Competitive Pay
  • Flexible Hours
  • Work on Microsoft’s latest (Fabric, Purview, Foundry) as a Designated Solutions Partner.
  • High-Stakes Impact: Solve complex, client-facing problems for enterprise leaders
  • Structured learning paths to help you master AI automation and Agentic AI.


Read more