
50+ Remote SQL Jobs in India

Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Mayura Consultancy Services
Remote only
1 - 3 yrs
₹3L - ₹5L / yr
PHP
CodeIgniter
SQL
Bootstrap
JavaScript
+1 more

Position: Full Stack Developer (PHP CodeIgniter)

Company : Mayura Consultancy Services

Experience: 2 yrs

Location : Bangalore

Skill: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)

Work Location: Work From Home (WFH)

Apply: Please apply for the job opening using the URL below, based on your skill set. Once you complete the application form, we will review your profile.

Website:

https://www.mayuraconsultancy.com/careers/mcs-full-stack-web-developer-opening?r=jlp

Automate Accounts

Namrata Das
Posted by Namrata Das
Remote only
2 - 4 yrs
₹5L - ₹10L / yr
Zoho
Python
NodeJS (Node.js)
SQL
Docker
+1 more

Responsibilities

  • Develop and maintain web and backend components using Python, Node.js, and Zoho tools
  • Design and implement custom workflows and automations in Zoho
  • Perform code reviews to maintain quality standards and best practices
  • Debug and resolve technical issues promptly
  • Collaborate with teams to gather and analyze requirements for effective solutions
  • Write clean, maintainable, and well-documented code
  • Manage and optimize databases to support changing business needs
  • Contribute individually while mentoring and supporting team members
  • Adapt quickly to a fast-paced environment and meet expectations within the first month



Selection Process

1. HR Screening: Review of qualifications and experience
2. Online Technical Assessment: Test coding and problem-solving skills
3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
5. Management Interview: Discuss cultural fit and career opportunities
6. Offer Discussion: Finalize compensation and role specifics



Experience Required

  • 2-4 years of relevant experience as a Zoho Developer
  • Proven ability to work as a self-starter and contribute individually
  • Strong technical and interpersonal skills to support team members effectively



Remote only
0 - 1 yrs
₹1L - ₹1.5L / yr
PHP
SQL
Databases
Amazon Web Services (AWS)
Relational Database (RDBMS)
+4 more

Qualification: B.Tech (CS), 2025 graduates only

Joining: Immediate Joiner

Job Type: Trainee

Work Mode: Remote

Working Days: Monday to Friday

Shift (Rotational – based on project need):

  • 5:00 PM – 2:00 AM IST
  • 6:00 PM – 3:00 AM IST

 

Job Summary

ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience building scalable web applications while maintaining a healthy work-life balance through remote work.

 

Eligibility & Qualifications

  • Education: B.Tech (Computer Science) / M.Tech (Computer Science); Tier 1 colleges preferred
  • Experience Level: Fresher
  • Communication: Excellent English communication skills (verbal & written)

Skills Required

Technical & Development Skills:

  • Basic understanding of AI / Machine Learning concepts
  • Exposure to AWS (deployment or cloud fundamentals)
  • PHP development
  • WordPress development and customization
  • JavaScript (ES5 / ES6+)
  • jQuery
  • AJAX calls and asynchronous handling
  • Event handling
  • HTML5 & CSS3
  • Client-side form validation

 

Work Environment & Tools

  • Comfortable working in a remote setup
  • Familiarity with collaboration and remote access tools

 

Additional Requirements (Work-from-Home Setup)

This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:

  • System: Laptop or Desktop (Windows-based)
  • Operating System: Windows
  • Screen Size: Minimum 14 inches
  • Screen Resolution: Full HD (1920 × 1080)
  • Processor: Intel i5 or higher
  • RAM: Minimum 8 GB (Mandatory)
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher

 

About ARDEM

 

ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.

 



Remote only
3 - 6 yrs
₹4L - ₹7L / yr
NodeJS (Node.js)
PHP
React Native
SQL
JavaScript
+6 more

Software Developer (Node.js / PHP / React Native)

Experience: 3+ Years

Employment Type: Full-Time


Role Summary


We are looking for a skilled software developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and online examination systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or education/examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent


CNV Labs India Pvt Ltd iCloudEMS
Shital ICloudEMS
Posted by Shital ICloudEMS
Remote only
3 - 5 yrs
₹3L - ₹5L / yr
PHP
SQL
NodeJS (Node.js)
React Native
EdTech

Role Summary


We are looking for a skilled Software Developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and Online Examination Systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or Education/Examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent

Apply: Share your resume with project details and current CTC.

CNV Labs India Pvt Ltd iCloudEMS
Shital ICloudEMS
Posted by Shital ICloudEMS
Remote only
4 - 8 yrs
₹4L - ₹8L / yr
PHP
NodeJS (Node.js)
React Native
SQL

We are looking for a skilled Node.js Developer with React Native experience to build, enhance, and maintain ERP and EdTech platforms. The role involves developing scalable backend services, integrating ERP modules, and supporting education-focused systems such as LMS, student management, exams, and fee management.


Key Responsibilities

  • Develop and maintain backend services using Node.js, React Native, and PHP.
  • Build and integrate ERP modules for EdTech platforms (Admissions, Students, Exams, Attendance, Fees, Reports).
  • Design and consume RESTful APIs and third-party integrations (payment gateway, SMS, email).
  • Work with databases (MySQL / MongoDB / PostgreSQL) for high-volume education data.
  • Optimize application performance, scalability, and security.
  • Collaborate with frontend, QA, and product teams.
  • Debug, troubleshoot, and provide production support.


Required Skills

  • Strong experience in Node.js (Express.js / NestJS).
  • Working experience in PHP (Core PHP / Laravel / CodeIgniter).
  • Hands-on experience with ERP systems.
  • Domain experience in EdTech / Education ERP / LMS.
  • Strong knowledge of MySQL and database design.
  • Experience with authentication, role-based access, and reporting.
  • Familiarity with Git, APIs, and server environments.


Preferred Skills

  • Experience with online examination systems.
  • Knowledge of cloud platforms (AWS / Azure).
  • Understanding of security best practices (CSRF, XSS, SQL Injection).
  • Exposure to microservices or modular architecture.


Qualification

  • Bachelor’s degree in Computer Science or equivalent experience.
  • 3–6 years of relevant experience in Node.js & PHP development.





Mango Sciences
Remote only
5 - 7 yrs
₹10L - ₹15L / yr
Python
SQL
SQL queries

Database Programmer / Developer (SQL, Python, Healthcare)

Job Summary

We are seeking a skilled and experienced Database Programmer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our database systems, with a strong focus on data integrity, performance, and security. The role requires expertise in SQL, strong programming skills in Python, and prior experience working within the healthcare domain to handle sensitive data and complex regulatory requirements.

Key Responsibilities

  • Design, implement, and maintain scalable and efficient database schemas and systems.
  • Develop and optimize complex SQL queries, stored procedures, and triggers for data manipulation and reporting.
  • Write and maintain Python scripts to automate data pipelines, ETL processes, and database tasks.
  • Collaborate with data analysts, software developers, and other stakeholders to understand data requirements and deliver robust solutions.
  • Ensure data quality, integrity, and security, adhering to industry standards and regulations such as HIPAA.
  • Troubleshoot and resolve database performance issues, including query tuning and indexing.
  • Create and maintain technical documentation for database architecture, processes, and applications.
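
The Python-scripted ETL responsibilities above can be illustrated with a minimal sketch using the standard library's sqlite3. The table names and the name-cleaning rule below are invented for the sketch, not taken from the posting.

```python
import sqlite3

# Hypothetical source table; schema and data are illustrative only.
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE raw_patients (id INTEGER, name TEXT);
INSERT INTO raw_patients VALUES (1, '  Jane Doe '), (2, 'JOHN ROE');
""")

# Extract
rows = src.execute("SELECT id, name FROM raw_patients").fetchall()

# Transform: trim whitespace and normalise casing
clean = [(pid, name.strip().title()) for pid, name in rows]

# Load into the target table
src.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")
src.executemany("INSERT INTO patients VALUES (?, ?)", clean)

print(src.execute("SELECT name FROM patients ORDER BY id").fetchall())
# [('Jane Doe',), ('John Roe',)]
```

In a real pipeline the transform step is where healthcare-specific validation (and de-identification where required) would live.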

Required Qualifications

  • Experience:
  • Proven experience as a Database Programmer, SQL Developer, or a similar role.
  • Demonstrable experience working with database systems, including data modeling and design.
  • Strong background in developing and maintaining applications and scripts using Python.
  • Direct experience within the healthcare domain is mandatory, including familiarity with medical data (e.g., patient records, claims data) and related regulatory compliance (e.g., HIPAA).
  • Technical Skills:
  • Expert-level proficiency in Structured Query Language (SQL) and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
  • Solid programming skills in Python, including experience with relevant libraries for data handling (e.g., Pandas, SQLAlchemy).
  • Experience with data warehousing concepts and ETL (Extract, Transform, Load) processes.
  • Familiarity with version control systems, such as Git.

Preferred Qualifications

  • Experience with NoSQL databases (e.g., MongoDB, Cassandra).
  • Knowledge of cloud-based data platforms (e.g., AWS, GCP, Azure).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with other programming languages relevant to data science or application development.

Education

  • Bachelor’s degree in computer science, Information Technology, or a related field.

 

To process your resume for the next process, please fill out the Google form with your updated resume.


https://forms.gle/f7zgYAa632ww5Teb6

Remote only
2 - 4 yrs
₹3L - ₹4L / yr
.NET
SQL
PostgreSQL
RESTful APIs
Git
+4 more

We are looking for a highly skilled Full Stack Developer to design and scale our real-time vehicle tracking platform. You will be responsible for building high-performance web applications that process live GPS data and visualize it through interactive map interfaces.

Key Responsibilities

Real-Time Data Processing: Develop robust back-end services to ingest and process high-frequency GPS data from IoT devices.

Map Integration: Design and implement interactive map interfaces using tools like Google Maps API or Mapbox for real-time asset visualization.

Geofencing & Alerts: Build server-side logic for complex geospatial features, including geofencing, route optimization, and automated speed/entry alerts.

API Development: Create and maintain scalable RESTful or GraphQL APIs to bridge communication between vehicle hardware, the database, and the user dashboard.

Database Management: Architect and optimize databases (e.g., PostgreSQL with PostGIS) for efficient storage and querying of spatial-temporal data.

Performance Optimization: Ensure high availability and low-latency response times for tracking thousands of simultaneous vehicle connections.
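
The geofencing logic mentioned above reduces to a point-in-region test. A minimal sketch for a circular geofence using the haversine great-circle distance (the coordinates are made-up examples; a production system would more likely use PostGIS or a geospatial library):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, center, radius_km):
    """True when a GPS fix falls within a circular fence."""
    return haversine_km(lat, lon, center[0], center[1]) <= radius_km

depot = (12.9716, 77.5946)  # hypothetical depot location
print(inside_geofence(12.9720, 77.5950, depot, 1.0))   # True: tens of metres away
print(inside_geofence(13.0827, 80.2707, depot, 1.0))   # False: hundreds of km away
```

For polygonal fences, a point-in-polygon test (ray casting, or a library such as Shapely) replaces the radius check.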

Required Technical Skills

Front-End: Proficiency in React.js, Angular, or Vue.js, with experience in state management (Redux/MobX).

Back-End: Strong experience in Node.js (Express/NestJS), Python (Django/Flask), or Java (Spring Boot).

Mapping: Hands-on experience with Google Maps SDK, Leaflet, or OpenLayers.

Real-time Communication: Expertise in WebSockets or Socket.IO for live data streaming.

Databases: Proficiency in SQL (PostgreSQL/MySQL) and NoSQL (MongoDB/Redis) for caching.

Cloud & DevOps: Familiarity with AWS (EC2, Lambda), Docker, and Kubernetes for scalable deployment.

Qualifications

Education: Bachelor’s or Master’s degree in Computer Science or a related field.

Experience: 3–6+ years of professional full-stack development experience.

Niche Knowledge: Prior experience with telematics, IoT protocols (MQTT, HTTP), or GPS-based applications is highly preferred.

Remote only
0 - 1 yrs
₹1L - ₹1.8L / yr
.NET
SQL
SQL Server
jQuery
LINQ
+3 more

Position: .NET Core Intern (.NET Core knowledge is a must)

Education: BTech-Computer Science Only

Joining: Immediate Joiner

Work Mode: Remote

Working Days: Monday to Friday

Shift (Rotational – based on project need):

  • 5:00 PM – 2:00 AM IST
  • 6:00 PM – 3:00 AM IST

 

Job Summary

ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience building scalable web applications while maintaining a healthy work-life balance through remote work.

 

Eligibility & Qualifications

  • Education: B.Tech (Computer Science) / M.Tech (Computer Science); Tier 1 colleges preferred
  • Experience Level: Fresher
  • Communication: Excellent English communication skills (verbal & written)

Skills Required

1. Technical Skills (Must Have)

  • Experience with .NET Core (.NET 6 / 7 / 8)
  • Strong knowledge of C#, including:
  • Object-Oriented Programming (OOP) concepts
  • async/await
  • LINQ
  • ASP.NET Core (Web API / MVC)

2. Database Skills

  • SQL Server (preferred)
  • Writing complex SQL queries, joins, and subqueries
  • Stored Procedures, Functions, and Indexes
  • Database design and performance tuning
  • Entity Framework Core
  • Migrations and transaction handling
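
The joins and subqueries listed above can be exercised end-to-end with Python's built-in sqlite3. The schema and the above-average-salary rule are invented for this sketch:

```python
import sqlite3

# Illustrative two-table schema; names and figures are made up.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE depts (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE emps  (id INTEGER PRIMARY KEY, dept_id INTEGER, salary REAL);
INSERT INTO depts VALUES (1, 'Eng'), (2, 'Ops');
INSERT INTO emps VALUES (1, 1, 90.0), (2, 1, 70.0), (3, 2, 60.0);
""")

# Join each employee to their department; the scalar subquery keeps
# only employees paid above the overall average salary.
result = db.execute("""
SELECT d.name, e.salary
FROM emps e JOIN depts d ON d.id = e.dept_id
WHERE e.salary > (SELECT AVG(salary) FROM emps)
ORDER BY e.salary DESC
""").fetchall()
print(result)  # [('Eng', 90.0)]
```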

3. Frontend Skills (Required)

  • JavaScript (ES5 / ES6+)
  • jQuery
  • DOM manipulation
  • AJAX calls
  • Event handling
  • HTML5 & CSS3
  • Client-side form validation

4. Security & Performance

  • Data validation and exception handling
  • Caching concepts (In-memory / Redis – good to have)

5. Tools & Environment

  • Visual Studio / VS Code
  • Git (GitHub / Azure DevOps)
  • Basic knowledge of server deployment

6. Good to Have (Optional)

  • Azure or AWS deployment experience
  • CI/CD pipelines
  • Docker
  • Experience with data handling

 

Work Environment & Tools

  • Comfortable working in a remote setup
  • Familiarity with collaboration and remote access tools

 

Additional Requirements (Work-from-Home Setup)

This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:

  • System: Laptop or Desktop (Windows-based)
  • Operating System: Windows
  • Screen Size: Minimum 14 inches
  • Screen Resolution: Full HD (1920 × 1080)
  • Processor: Intel i5 or higher
  • RAM: Minimum 8 GB (Mandatory)
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher

 

About ARDEM

 

ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.

 

suntekai
Kushi A
Posted by Kushi A
Remote only
0 - 1 yrs
₹10000 - ₹12000 / mo
Python
PostgreSQL
Data Visualization
Business Intelligence (BI)
SQL
+2 more

Job Description: Data Analyst


About the Role

We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.

Key Responsibilities

  • Data Extraction & Management

  • Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.

  • Ensure accuracy, reliability, and consistency of data across different platforms.

  • Data Analysis & Insights

  • Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.

  • Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.

  • Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.

  • Business Intelligence & Visualization

  • Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).

  • Create visualizations that simplify complex datasets for stakeholders and management.

  • Python (Pandas)

  • Use Python (Pandas, NumPy) for advanced analytics.

  • Collaboration & Stakeholder Management

  • Work closely with product, operations, and leadership teams to provide insights that drive decision-making.

  • Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.
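
As a rough illustration of the retention analysis mentioned above, the sketch below computes month-over-month retention from a toy purchase log; all data is invented:

```python
from collections import defaultdict

# Toy purchase log: (customer, month) pairs, illustrative only.
events = [
    ("a", "2024-01"), ("b", "2024-01"), ("c", "2024-01"),
    ("a", "2024-02"), ("c", "2024-02"),
    ("a", "2024-03"),
]

# Group active customers by month.
by_month = defaultdict(set)
for cust, month in events:
    by_month[month].add(cust)

# Retention = fraction of a month's customers who return the next month.
months = sorted(by_month)
retention = {}
for prev, cur in zip(months, months[1:]):
    returned = by_month[prev] & by_month[cur]
    retention[cur] = round(len(returned) / len(by_month[prev]), 2)

print(retention)  # {'2024-02': 0.67, '2024-03': 0.5}
```

The same grouping, done per signup cohort rather than per adjacent month, yields the cohort retention matrix; in practice this would be a pandas pivot over real order data.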

Required Skills

  • SQL/PostgreSQL

  • Complex joins, window functions, CTEs, aggregations, query optimization.

  • Python (Pandas & Analytics)

  • Data wrangling, cleaning, transformations, exploratory data analysis (EDA).

  • Libraries: Pandas, NumPy, Matplotlib, Seaborn

  • Data Visualization & BI Tools

  • Expertise in creating dashboards and reports using Metabase or Looker.

  • Ability to translate raw data into meaningful visual insights.

  • Business Intelligence

  • Strong analytical reasoning to connect data insights with e-commerce KPIs.

  • Experience in funnel analysis, customer journey mapping, and retention analysis.

  • Analytics & E-commerce Knowledge

  • Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.

  • General Skills

  • Strong communication and presentation skills.

  • Ability to work cross-functionally in fast-paced environments.

  • Problem-solving mindset with attention to detail.
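
A small, self-contained example of the CTE and window-function skills listed above, run through Python's sqlite3 (the table and data are invented; window functions require SQLite 3.25+):

```python
import sqlite3

# Hypothetical orders table for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', 120.0), ('alice', 80.0), ('bob', 50.0), ('carol', 150.0);
""")

# The CTE aggregates spend per customer; the window function ranks them.
query = """
WITH totals AS (
  SELECT customer, SUM(amount) AS total
  FROM orders
  GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('alice', 200.0, 1), ('carol', 150.0, 2), ('bob', 50.0, 3)]
```

The same pattern (CTE feeding a window function) transfers directly to PostgreSQL.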



Education: Bachelor’s degree in Data Science, Computer Science, or a related data field




Reliable Group

Bisman Gill
Posted by Bisman Gill
Remote only
10yrs+
Up to ₹42L / yr (varies)
.NET
.NET Compact Framework
SQL
Windows Azure
CI/CD
+5 more

Application Architect – .NET

Role Overview

We are looking for a senior, hands-on Application Architect with deep .NET experience who can fix and modernize our current systems and build a strong engineering team over time.

Important – this is a hands-on role with an architectural mindset. The person in it should be comfortable working with legacy systems and able to make and explain tradeoffs.


Key Responsibilities

Application Architecture & Modernization

  • Own application architecture across legacy .NET Framework and modern .NET systems
  • Review the existing application, and drive an incremental modernization approach along with new feature development as per business growth of the company.
  • Own the gradual move away from outdated patterns (Web Forms, tightly coupled MVC, legacy UI constructs)
  • Define clean API contracts between front-end and backend services
  • Identify and resolve performance bottlenecks across code and database layers
  • Improve data access patterns, caching strategies, and system responsiveness


Backend, APIs & Integrations

  • Design scalable backend services and APIs
  • Improve how newer .NET services interact with legacy systems
  • Lead integrations with external systems, including Zoho
  • Prior experience integrating with Zoho (CRM, Finance, or other modules) is a strong value add
  • Experience designing and implementing integrations using EDI standards


Data & Schema Design

  • Review existing database schemas and core data structures
  • Redesign data models to support growth, and reporting/analytics requirements
  • Optimize SQL queries to reduce the load on the execution and DB engine


Cloud Awareness

  • Design applications with cloud deployment in mind (primarily Azure)
  • Understand how to use Azure services to improve security, scalability, and availability
  • Work with Cloud and DevOps teams to ensure application architecture aligns with cloud best practices
  • Push for CI/CD automation so that team pushes code regularly and makes progress.


Team Leadership & Best Practices

  • Act as a technical leader and mentor for the engineering team
  • Help hire, onboard, and grow a team under this role over time.
  • Define KPIs and engineering best practices (including focus on documentation)
  • Set coding standards, architectural guidelines, and review practices
  • Improve testability and long-term health of the codebase
  • Raise the overall engineering bar through reviews, coaching, and clear standards
  • Create a culture of ownership and quality


Cross-Platform Thinking

  • Strong communicator who can convert complex tech topics into business-friendly lingo. Understands the business needs and importance of user experience
  • While .NET is the core stack, contribute to architecture decisions across platforms
  • Leverages AI tools to accelerate design, coding, reviews, and troubleshooting while maintaining high quality


Skills and Experience

  • 12+ years of hands-on experience in application development (preferably on .NET stack)
  • Experience leading technical direction while remaining hands-on
  • Deep expertise in .NET Framework (4.x) and modern .NET (.NET Core / .NET 6+)
  • Must have led a project to modernize a legacy system – preferably moving from .NET Framework to .NET Core.
  • Experience with MVC, Web Forms, and legacy UI patterns
  • Solid backend and API design experience
  • Strong understanding of database design and schema evolution
  • Understanding of Analytical systems – OLAP, Data warehousing, data lakes.
  • Strong proponent of AI who has extensively used AI tools such as GitHub Copilot, Cursor, Windsurf, Codex, etc.
  • Integration with Zoho would be a plus.
Remote only
9 - 12 yrs
₹2L - ₹2.5L / yr
Amazon Web Services (AWS)
Python
Terraform
Data Transformation Tool (DBT)
SQL
+1 more

🚀 Hiring: Associate Tech Architect / Senior Tech Specialist

🌍 Remote | Contract Opportunity

We’re looking for a seasoned tech professional who can lead the design and implementation of cloud-native data and platform solutions. This is a remote, contract-based role for someone with strong ownership and architecture experience.

🔴 Mandatory & Most Important Skill Set

Hands-on expertise in the following technologies is essential:

AWS – Cloud architecture & services

Python – Backend & data engineering

Terraform – Infrastructure as Code

Airflow – Workflow orchestration

SQL – Data processing & querying

DBT – Data transformation & modeling

💼 Key Responsibilities

  • Architect and build scalable AWS-based data platforms
  • Design and manage ETL/ELT pipelines
  • Orchestrate workflows using Airflow
  • Implement cloud infrastructure using Terraform
  • Lead best practices in data architecture, performance, and scalability
  • Collaborate with engineering teams and provide technical leadership

🎯 Ideal Profile

✔ Strong experience in cloud and data platform architecture

✔ Ability to take end-to-end technical ownership

✔ Comfortable working in a remote, distributed team environment

📄 Role Type: Contract

🌍 Work Mode: 100% Remote

If you have deep expertise in these core technologies and are ready to take on a high-impact architecture role, we’d love to hear from you.


Estuate Software

Ariba Khan
Posted by Ariba Khan
Remote, Bengaluru (Bangalore)
8 - 12 yrs
Up to ₹30L / yr (varies)
SQL
confluence
Business Analysis
User Research

About the company:

At Estuate, more than 400 uniquely talented people work together, to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.

Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customers’ needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.


Our key to success:

We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, and have over 2,000 projects in our portfolio.

Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.


Our culture of oneness

At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.


Requirements:

Technical skills

  • 8+ years of experience in a Business, System, or Functional Analyst role;
  • Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
  • Experience of working with Restful APIs (writing requirements, API usage);
  • Experience in Microservices architecture;
  • Experience of working with Agile methodologies (Scrum, Kanban);
  • Knowledge of SQL;
  • Knowledge of UML, BPMN;
  • Understanding of key UX/UI practices and processes;
  • Understanding of software development lifecycle;
  • Understanding of the architecture of web-based applications;
  • English Upper-Intermediate or higher.

 

Soft Skills

  • Excellent communication and presentation skills;
  • Proactiveness;
  • Organized, detail-oriented with ability to keep overall solution in mind;
  • Comfort working in a fast-paced environment, running concurrent projects, and managing BA work with multiple stakeholders;
  • Good time-management skills, ability to handle multitasking activities.


Good to haves

  • Experience in enterprise software development or finance domain;
  • Experience in delivery of desktop and web-applications;
  • Experience with successful system integration projects.

 

Responsibilities:

  • Participation in discovery phases and workshops with Customer, covering key business and product requirements;
  • Manage project scope, requirements management and their impact on existing requirements, defining dependencies on other teams;
  • Creating business requirements, user stories, mockups, functional specifications and technical requirements (incl. flow diagrams, data mappings, examples);
  • Close collaboration with development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with Tech Lead, etc.);
  • Regular communication with internal (Product, Account management, Business teams) and external stakeholders (Partners, Customers);
  • Preparing UAT scenarios, validation cases;
  • User Acceptance Testing;
  • Demo for internal stakeholders;
  • Creating documentation (user guides, technical guides, presentations).

Project Description:

Wireless Standard POS (Point of Sale) is our retail management solution for the telecom market.

It provides thousands of retailers with the features and functionality they need to run their businesses effectively, with full visibility and control over every aspect of sales and operations. It is simple to learn, easy to use, and as the operation grows, more features can be added.


Our system can optimize and simplify all processes related to retail in this business area.

A few of the things our product can do:

  • Robust Online Reporting
  • Repair Management Software
  • 3rd Party Integrations
  • Customer Connect Marketing
  • Time and Attendance
  • Carrier Commission Reconciliation

 

As a Business Analyst / System Analyst, you will be the liaison between the lines of business and the development team. You will have the opportunity to work on a very complex product with a microservices architecture (50+ services at present) and communicate with the Product, QA, Development, Architecture, and Customer Support teams to help improve product quality.


Read more
Blockify
Dhanur Sehgal
Posted by Dhanur Sehgal
Remote only
3 - 8 yrs
₹6L - ₹12L / yr
skill iconGo Programming (Golang)
skill iconPython
Scalability
Infrastructure architecture
SQL
+6 more

We’re hiring a remote, contract-based Backend & Infrastructure Engineer who can build and run production systems end-to-end.

You will build and scale high-throughput backend services in Golang and Python, operate ClickHouse-powered analytics at scale, manage Linux servers for maximum uptime, scalability, and reliability, and drive cost efficiency as a core engineering discipline across the entire stack.



What You Will Do:


Backend Development (Golang & Python)

  • Design and maintain high-throughput RESTful/gRPC APIs — primarily Golang, Python for tooling and supporting services
  • Architect for horizontal scalability, fault tolerance, and low latency at scale
  • Implement caching (Redis/Memcached), rate limiting, efficient serialization, and CI/CD pipelines
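Rate limiting of the kind listed above usually reduces to a token bucket; a minimal in-process sketch (Redis-backed variants follow the same refill logic, and the parameters here are illustrative):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: refills at `rate` tokens/sec, bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int, clock=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.clock = clock
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

Injecting the clock keeps the limiter testable without sleeping; a production version would hold the bucket state in Redis so all API instances share it.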

Scalable Architecture & System Design

  • Design and evolve distributed, resilient backend architecture that scales without proportional cost increase
  • Make deliberate trade-offs (CAP, cost vs. performance) and design multi-region HA with automated failover

ClickHouse & Analytical Data Infrastructure

  • Deploy, tune, and operate ClickHouse clusters for real-time analytics and high-cardinality OLAP workloads
  • Design optimal table engines, partition strategies, materialized views, and query patterns
  • Manage cluster scaling, replication, schema migrations, and upstream/downstream integrations
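As a rough illustration of the pre-aggregation idea behind materialized views (the schema and names here are hypothetical), raw events are rolled up once so that queries hit a small summary instead of scanning the raw table:

```python
from collections import defaultdict
from datetime import datetime, timezone

def rollup_daily(events):
    """Roll raw (unix_ts, user_id) events up to daily unique-user counts,
    a plain-Python analogue of what a materialized view maintains incrementally."""
    daily_users = defaultdict(set)
    for ts, user_id in events:
        day = datetime.fromtimestamp(ts, tz=timezone.utc).date().isoformat()
        daily_users[day].add(user_id)
    # Dashboards then query this small rollup instead of scanning raw events.
    return {day: len(users) for day, users in daily_users.items()}
```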

Cost Efficiency & Cost Optimization

  • Own cost optimization end-to-end: right-sizing, reserved/spot capacity, storage tiering, query optimization, compression, batching
  • Build cost dashboards, budgets, and alerts; drive a culture of cost-aware engineering

Linux Server Management & Infrastructure

  • Administer and harden Linux servers (Ubuntu, Debian, CentOS/RHEL) — patching, security, SSH, firewalls
  • Manage VPS/bare-metal provisioning, capacity planning, and containerized workloads (Docker, Kubernetes/Nomad)
  • Implement Infrastructure-as-Code (Terraform/Pulumi); optionally manage AWS/GCP as needed

Data, Storage & Scheduling

  • Optimize SQL schemas and queries (PostgreSQL, MySQL); manage data archival, cold storage, and lifecycle policies
  • Build and maintain cron jobs, scheduled tasks, and batch processing systems
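The batch-processing style described above usually comes down to chunked iteration, so a scheduled job can work through a large table in bounded pieces; a minimal sketch:

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items, the core of any
    batch job that processes a large dataset in bounded chunks."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk
```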

Uptime, Reliability & Observability

  • Own system uptime: zero-downtime deployments, health checks, self-healing infra, SLOs/SLIs
  • Build observability stacks (Prometheus, Grafana, Datadog, OpenTelemetry); structured logging, distributed tracing, alerting
  • Drive incident response, root cause analysis, and post-mortems
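SLO work like the above typically starts from an error-budget calculation; a small sketch (the SLO value and counts are illustrative):

```python
def error_budget(slo: float, total_requests: int, failed_requests: int):
    """Given an availability SLO (e.g. 0.999), return the allowed failure count,
    observed failures, and the fraction of the error budget consumed."""
    allowed = total_requests * (1.0 - slo)
    consumed = failed_requests / allowed if allowed else float("inf")
    return {"allowed": allowed, "failed": failed_requests, "budget_consumed": consumed}
```

Alerting on budget burn rate (rather than raw error rate) is what turns an SLO into an actionable SLI.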


Required Qualifications:


Must-Have (Critical)

  • Deep proficiency in Golang (primary) and Python
  • Proven ability to design and build scalable, distributed architectures
  • Production experience deploying and operating ClickHouse at scale
  • Track record of driving measurable cost efficiency and cost optimization
  • 5+ years in backend engineering and infrastructure roles

Also Required

  • Strong Linux server administration (Ubuntu, Debian, CentOS/RHEL) — comfortable living in the terminal
  • Proven uptime and reliability track record across production infrastructure
  • Strong SQL (PostgreSQL, MySQL); experience with high-throughput APIs (10K+ RPS)
  • VPS/bare-metal provisioning, Docker, Kubernetes/Nomad, IaC (Terraform/Pulumi)
  • Observability tooling (Prometheus, Grafana, Datadog, OpenTelemetry)
  • Cron jobs, batch processing, data archival, cold storage management
  • Networking fundamentals (DNS, TCP/IP, load balancing, TLS)


Nice to Have

  • AWS, GCP, or other major cloud provider experience
  • Message queues / event streaming (Kafka, RabbitMQ, SQS/SNS)
  • Data pipelines (Airflow, dbt); FinOps practices
  • Open-source contributions; compliance background (SOC 2, HIPAA, GDPR)


What We Offer

  • Remote, contractual role
  • Flexible time zones (overlap for standups + incident coverage)
  • Competitive contract compensation + equity
  • Long-term engagement opportunity based on performance
Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Remote only
2 - 4 yrs
₹3L - ₹5L / yr
PowerBI
Data modeling
ETL
Spark
SQL
+1 more

Microsoft Fabric, Power BI, Data modelling, ETL, Spark SQL

Remote work: 5–7 hours

Hourly rate: ₹450

Read more
Euphoric Thought Technologies
Remote, Bengaluru (Bangalore)
3 - 4 yrs
₹11L - ₹13L / yr
skill iconPython
SQL

We are seeking a Data Engineer with 3–4 years of relevant experience to join our team. The ideal candidate should have strong expertise in Python and SQL and be available to join immediately.

Location: Bangalore

Experience: 3–4 Years

Joining: Immediate Joiner preferred

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and data models
  • Extract, transform, and load (ETL) data from multiple sources
  • Write efficient and optimized SQL queries for data analysis and reporting
  • Develop data processing scripts and automation using Python
  • Ensure data quality, integrity, and performance across systems
  • Collaborate with cross-functional teams to support business and analytics needs
  • Troubleshoot data-related issues and optimize existing processes
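A toy version of the ETL step described above, using only the standard library (the table and column names are invented for illustration; a real pipeline would target a warehouse, not SQLite):

```python
import csv
import io
import sqlite3

def load_orders(csv_text: str, conn: sqlite3.Connection) -> int:
    """Minimal extract-transform-load: parse CSV, validate and coerce types,
    skip malformed rows, and load into a target table. Returns rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        try:
            rows.append((int(rec["id"]), float(rec["amount"])))  # transform / validate
        except (KeyError, ValueError):
            continue  # skip bad rows rather than failing the whole load
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)
```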

Required Skills & Qualifications:

  • 3–4 years of hands-on experience as a Data Engineer or similar role
  • Strong proficiency in Python and SQL
  • Experience working with relational databases and large datasets
  • Good understanding of data warehousing and ETL concepts
  • Strong analytical and problem-solving skills
  • Ability to work independently and in a team-oriented environment

Preferred:

  • Experience with cloud platforms or data tools (added advantage)
  • Exposure to performance tuning and data optimization





Read more
Deqode

at Deqode

1 recruiter
Apoorva Jain
Posted by Apoorva Jain
Remote only
9 - 18 yrs
₹5L - ₹29L / yr
skill iconPython
SQL
NOSQL Databases
DBA

Job Summary


We are looking for an experienced Python DBA with strong expertise in Python scripting and SQL/NoSQL databases. The candidate will be responsible for database administration, automation, performance optimization, and ensuring availability and reliability of database systems.


Key Responsibilities

  • Administer and maintain SQL and NoSQL databases
  • Develop Python scripts for database automation and monitoring
  • Perform database performance tuning and query optimization
  • Manage backups, recovery, replication, and high availability
  • Ensure data security, integrity, and compliance
  • Troubleshoot and resolve database-related issues
  • Collaborate with development and infrastructure teams
  • Monitor database health and performance
  • Maintain documentation and best practices


Required Skills

  • 10+ years of experience in Database Administration
  • Strong proficiency in Python
  • Experience with SQL databases (PostgreSQL, MySQL, Oracle, SQL Server)
  • Experience with NoSQL databases (MongoDB, Cassandra, etc.)
  • Strong understanding of indexing, schema design, and performance tuning
  • Good analytical and problem-solving skills


Read more
Forbes Advisor

at Forbes Advisor

3 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
4yrs+
Up to ₹27L / yr (varies)
Google Cloud Platform (GCP)
Data Transformation Tool (DBT)
skill iconPython
SQL
skill iconAmazon Web Services (AWS)
+6 more

Forbes Advisor is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.

We do this by combining data-driven content, rigorous product comparisons, and user-first design, all built on top of a modern, scalable platform. Our teams operate globally and bring deep expertise across journalism, product, performance marketing, and analytics.

The Role

We are hiring a Senior Data Engineer to help design and scale the infrastructure behind our analytics, performance marketing, and experimentation platforms.

This role is ideal for someone who thrives on solving complex data problems, enjoys owning systems end-to-end, and wants to work closely with stakeholders across product, marketing, and analytics.

You’ll build reliable, scalable pipelines and models that support decision-making and automation at every level of the business.


What you’ll do

● Build, maintain, and optimize data pipelines using Spark, Kafka, Airflow, and Python

● Orchestrate workflows across GCP (GCS, BigQuery, Composer) and AWS-based systems

● Model data using dbt, with an emphasis on quality, reuse, and documentation

● Ingest, clean, and normalize data from third-party sources such as Google Ads, Meta, Taboola, Outbrain, and Google Analytics

● Write high-performance SQL and support analytics and reporting teams in self-serve data access

● Monitor and improve data quality, lineage, and governance across critical workflows

● Collaborate with engineers, analysts, and business partners across the US, UK, and India
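Normalizing third-party sources, as described above, often means mapping heterogeneous field names onto one schema and deduplicating retried deliveries; a sketch with hypothetical field names:

```python
def normalize_spend(records):
    """Normalize ad-spend rows from heterogeneous sources (field names here are
    hypothetical) into one schema, deduplicating on (source, date, campaign)."""
    seen, out = set(), []
    for r in records:
        row = {
            "source": r.get("source", "unknown"),
            "date": r.get("date") or r.get("day"),
            "campaign": r.get("campaign") or r.get("campaign_name"),
            "spend": float(r.get("spend") or r.get("cost_usd") or 0.0),
        }
        key = (row["source"], row["date"], row["campaign"])
        if key in seen:
            continue  # drop duplicates delivered by retried API pulls
        seen.add(key)
        out.append(row)
    return out
```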


What You Bring

● 4+ years of data engineering experience, ideally in a global, distributed team

● Strong Python development skills and experience

● Expert in SQL for data transformation, analysis, and debugging

● Deep knowledge of Airflow and orchestration best practices

● Proficient in DBT (data modeling, testing, release workflows)

● Experience with GCP (BigQuery, GCS, Composer); AWS familiarity is a plus

● Strong grasp of data governance, observability, and privacy standards

● Excellent written and verbal communication skills


Nice to have

● Experience working with digital marketing and performance data, including:

Google Ads, Meta (Facebook), TikTok, Taboola, Outbrain, Google Analytics (GA4)

● Familiarity with BI tools like Tableau or Looker

● Exposure to attribution models, media mix modeling, or A/B testing infrastructure

● Collaboration experience with data scientists or machine learning workflows


Why Join Us

● Monthly long weekends — every third Friday off

● Wellness reimbursement to support your health and balance

● Paid parental leave

● Remote-first with flexibility and trust

● Work with a world-class data and marketing team inside a globally recognized brand

Read more
Hudson Data

at Hudson Data

1 recruiter
MadanLal Gupta
Posted by MadanLal Gupta
Remote only
6 - 10 yrs
₹9L - ₹12L / yr
skill iconPython
SQL
skill iconGoogle Analytics
Linux/Unix
Google Cloud Platform (GCP)
+1 more

About Hudson Data


At Hudson Data, we view AI as both an art and a science. Our cross-functional teams — spanning business leaders, data scientists, and engineers — blend AI/ML and Big Data technologies to solve real-world business challenges. We harness predictive analytics to uncover new revenue opportunities, optimize operational efficiency, and enable data-driven transformation for our clients.


Beyond traditional AI/ML consulting, we actively collaborate with academic and industry partners to stay at the forefront of innovation. Alongside delivering projects for Fortune 500 clients, we also develop proprietary AI/ML products addressing diverse industry challenges.


Headquartered in New Delhi, India, with an office in New York, USA, Hudson Data operates globally, driving excellence in data science, analytics, and artificial intelligence.



About the Role


We are seeking a Data Analyst & Modeling Specialist with a passion for leveraging AI, machine learning, and cloud analytics to improve business processes, enhance decision-making, and drive innovation. You’ll play a key role in transforming raw data into insights, building predictive models, and delivering data-driven strategies that have real business impact.



Key Responsibilities


1. Data Collection & Management

• Gather and integrate data from multiple sources including databases, APIs, spreadsheets, and cloud warehouses.

• Design and maintain ETL pipelines ensuring data accuracy, scalability, and availability.

• Utilize any major cloud platform (Google Cloud, AWS, or Azure) for data storage, processing, and analytics workflows.

• Collaborate with engineering teams to define data governance, lineage, and security standards.


2. Data Cleaning & Preprocessing

• Clean, transform, and organize large datasets using Python (pandas, NumPy) and SQL.

• Handle missing data, duplicates, and outliers while ensuring consistency and quality.

• Automate data preparation using Linux scripting, Airflow, or cloud-native schedulers.
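A minimal sketch of the cleaning steps above (missing values, duplicates, outliers) in plain Python; the |z| > 3 cutoff is an illustrative convention, not a rule, and pandas would do the same on DataFrames:

```python
from statistics import mean, stdev

def clean(values):
    """Drop missing entries and z-score outliers (|z| > 3), and deduplicate
    while preserving order: typical preprocessing before analysis."""
    present = [v for v in values if v is not None]
    deduped = list(dict.fromkeys(present))  # order-preserving dedup
    if len(deduped) < 2:
        return deduped
    mu, sigma = mean(deduped), stdev(deduped)
    if sigma == 0:
        return deduped
    return [v for v in deduped if abs(v - mu) / sigma <= 3]
```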


3. Data Analysis & Insights

• Perform exploratory data analysis (EDA) to identify key trends, correlations, and drivers.

• Apply statistical techniques such as regression, time-series analysis, and hypothesis testing.

• Use Excel (including pivot tables) and BI tools (Tableau, Power BI, Looker, or Google Data Studio) to develop insightful reports and dashboards.

• Present findings and recommendations to cross-functional stakeholders in a clear and actionable manner.


4. Predictive Modeling & Machine Learning

• Build and optimize predictive and classification models using scikit-learn, XGBoost, LightGBM, TensorFlow, Keras, and H2O.ai.

• Perform feature engineering, model tuning, and cross-validation for performance optimization.

• Deploy and manage ML models using Vertex AI (GCP), AWS SageMaker, or Azure ML Studio.

• Continuously monitor, evaluate, and retrain models to ensure business relevance.
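The cross-validation mentioned above rests on a simple index partition; a sketch of contiguous (non-shuffled) k-fold splitting, which libraries like scikit-learn implement in `KFold`:

```python
def kfold_indices(n: int, k: int):
    """Split indices 0..n-1 into k contiguous (train, test) folds,
    the partitioning behind cross-validated model evaluation."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        folds.append((train, test))
        start += size
    return folds
```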


5. Reporting & Visualization

• Develop interactive dashboards and automated reports for performance tracking.

• Use pivot tables, KPIs, and data visualizations to simplify complex analytical findings.

• Communicate insights effectively through clear data storytelling.


6. Collaboration & Communication

• Partner with business, engineering, and product teams to define analytical goals and success metrics.

• Translate complex data and model results into actionable insights for decision-makers.

• Advocate for data-driven culture and support data literacy across teams.


7. Continuous Improvement & Innovation

• Stay current with emerging trends in AI, ML, data visualization, and cloud technologies.

• Identify opportunities for process optimization, automation, and innovation.

• Contribute to internal R&D and AI product development initiatives.



Required Skills & Qualifications


Technical Skills

• Programming: Proficient in Python (pandas, NumPy, scikit-learn, XGBoost, LightGBM, TensorFlow, Keras, H2O.ai).

• Databases & Querying: Advanced SQL skills; experience with BigQuery, Redshift, or Azure Synapse is a plus.

• Cloud Expertise: Hands-on experience with one or more major platforms — Google Cloud, AWS, or Azure.

• Visualization & Reporting: Skilled in Tableau, Power BI, Looker, or Excel (pivot tables, data modeling).

• Data Engineering: Familiarity with ETL tools (Airflow, dbt, or similar).

• Operating Systems: Strong proficiency with Linux/Unix for scripting and automation.


Soft Skills

• Strong analytical, problem-solving, and critical-thinking abilities.

• Excellent communication and presentation skills, including data storytelling.

• Curiosity and creativity in exploring and interpreting data.

• Collaborative mindset, capable of working in cross-functional and fast-paced environments.



Education & Certifications

• Bachelor’s degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.

• Master’s degree in Data Analytics, Machine Learning, or Business Intelligence preferred.

• Relevant certifications are highly valued:

• Google Cloud Professional Data Engineer

• AWS Certified Data Analytics – Specialty

• Microsoft Certified: Azure Data Scientist Associate

• TensorFlow Developer Certificate



Why Join Hudson Data


At Hudson Data, you’ll be part of a dynamic, innovative, and globally connected team that uses cutting-edge tools — from AI and ML frameworks to cloud-based analytics platforms — to solve meaningful problems. You’ll have the opportunity to grow, experiment, and make a tangible impact in a culture that values creativity, precision, and collaboration.


Read more
Remote only
1 - 3 yrs
₹2L - ₹5L / yr
skill iconPHP
skill iconCodeIgniter
skill iconLaravel
SQL
skill iconBootstrap
+1 more

Position: Full Stack Developer (PHP CodeIgniter)

Company : Mayura Consultancy Services

Experience: 2 yrs

Location : Bangalore

Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)

Work Location: Work From Home(WFH)

Apply: Please apply for the job opening using the URL below, based on your skill set. Once you complete the application form, we will review your profile.

Website:

https://www.mayuraconsultancy.com/careers/mcs-full-stack-web-developer-opening?r=jlp


Requirements :

  • Prior experience in Full Stack Development using PHP Codeigniter


Perks of Working with MCS :

  • Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
  • Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
  • Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
  • Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
  • Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.


Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.

Read more
Remote only
5 - 18 yrs
₹8L - ₹25L / yr
Dynamics 365
Data migration
SSIS
Azure fabric
SQL
+1 more

Role - Dynamics 365 Data Migration Engineer/Developer

Experience level: 5+ years

Location: Remote


  • Prior experience in Dynamics 365 data migration projects
  • Knowledge of SSIS, Azure Fabric, and Azure Data Factory
  • Good understanding of Dataverse data structures and integration patterns
  • Proficiency in SQL for data extraction and transformation
  • Experience in preparing data mapping and migration documentation
  • Collaboration with functional teams for data validation and reconciliation
  • Preparation of data mapping documents to ensure accurate transformation

Read more
Hudson Data

at Hudson Data

1 recruiter
MadanLal Gupta
Posted by MadanLal Gupta
Remote only
5 - 8 yrs
₹10L - ₹15L / yr
Google Cloud Platform (GCP)
skill iconData Analytics
BigQuery
Pub
SQL
+3 more

About the Role


Hudson Data is looking for a Senior / Mid-Level SQL Engineer to design, build, optimize, and manage our data platforms. This role requires strong hands-on expertise in SQL, Google Cloud Platform (GCP), and Linux to support high-performance, scalable data solutions.


We are also hiring Python Programmers, Software Developers, and Frontend and Backend Engineers


Key Responsibilities:


  • Develop and optimize complex SQL queries, views, and stored procedures
  • Build and maintain data pipelines and ETL workflows on GCP (e.g., BigQuery, Cloud SQL)
  • Manage database performance, monitoring, and troubleshooting
  • Work extensively in Linux environments for deployments and automation
  • Partner with data, product, and engineering teams on data initiatives


Required Skills & Qualifications

Must-Have Skills (Essential)


  • Expert-level GCP experience (mandatory)
  • Strong Linux / shell scripting (mandatory)

Nice to Have

  • Experience with data warehousing and ETL frameworks

  • Python / scripting for automation

  • Performance tuning and query optimization experience


Soft Skills

  • Strong analytical, problem-solving, and critical-thinking abilities.
  • Excellent communication and presentation skills, including data storytelling.
  • Curiosity and creativity in exploring and interpreting data.
  • Collaborative mindset, capable of working in cross-functional and fast-paced environments.


Education & Certifications

  • Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
  • Master's degree in Data Analytics, Machine Learning, or Business Intelligence preferred.


Why Join Hudson Data

At Hudson Data, you'll be part of a dynamic, innovative, and globally connected team that uses cutting-edge tools, from AI and ML frameworks to cloud-based analytics platforms, to solve meaningful problems. You'll have the opportunity to grow, experiment, and make a tangible impact in a culture that values creativity, precision, and collaboration.

Read more
Springer Capital
Andrew Rose
Posted by Andrew Rose
Remote only
0 - 0 yrs
₹5000 - ₹7000 / mo
Data Visualization
skill iconData Analytics
Microsoft BI
SQL
Microsoft Excel
+5 more

The Power BI Intern will assist the analytics team in using Microsoft Power BI to create interactive dashboards and reports. This position provides practical exposure to data analysis, visualization, and business intelligence techniques, working with real datasets to support informed business decision-making.

Read more
https://www.icloudems.com/vlog
AMISHA SRIVASTAVA
Posted by AMISHA SRIVASTAVA
Remote only
3 - 6 yrs
₹4L - ₹10L / yr
skill iconPHP
SQL
skill iconNodeJS (Node.js)
skill iconMongoDB
skill iconPostgreSQL
+6 more


We are seeking a highly skilled software developer with proven experience in developing and scaling education ERP solutions. The ideal candidate should have strong expertise in Node.js or PHP (Laravel), MySQL, and MongoDB, along with hands-on experience in implementing ERP modules such as HR, Exams, Inventory, Learning Management System (LMS), Admissions, Fee Management, and Finance.


Key Responsibilities

Design, develop, and maintain scalable Education ERP modules.

Work on end-to-end ERP features, including HR, exams, inventory, LMS, admissions, fees, and finance.

Build and optimize REST APIs/GraphQL services and ensure seamless integrations.

Optimize system performance, scalability, and security for high-volume ERP usage.

Conduct code reviews, enforce coding standards, and mentor junior developers.

Stay updated with emerging technologies and recommend improvements for ERP solutions.


Required Skills & Qualifications

Strong expertise in Node.js and PHP (Laravel, Core PHP).

Proficiency with MySQL, MongoDB, and PostgreSQL (database design & optimization).

Frontend knowledge: JavaScript, jQuery, HTML, CSS (React/Vue preferred).

Experience with REST APIs, GraphQL, and third-party integrations (payment gateways, SMS, and email).

Hands-on with Git/GitHub, Docker, and CI/CD pipelines.


Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.

4+ years of professional development experience, with a minimum of 2 years in ERP systems.

Preferred Experience


Prior work in the education ERP domain.

Deep knowledge of HR, Exam, Inventory, LMS, Admissions, Fees & Finance modules.

Exposure to high-traffic enterprise applications.

Strong leadership, mentoring, and problem-solving abilities


Benefit:

Permanent Work From Home

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Remote only
4 - 6 yrs
₹4.5L - ₹15L / yr
skill icon.NET
ASP.NET
skill iconC#
SQL
Microservices
+3 more

Hi Connections! 👋 Welcome to 2026! 🎉

Starting the new year with an exciting opportunity!

Deqode IS HIRING! 💻


Hiring: .Net Developer

⭐ Experience: 4+ Years

⭐ Work Mode: Remote

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


🔧 Role Overview

We are looking for passionate .NET Developers to design, develop, and maintain scalable microservices for enterprise-grade applications. You’ll work closely with cross-functional teams and clients on high-performance, cloud-native solutions.


🛠️ Key Responsibilities

✅Build and maintain scalable .NET microservices

✅Develop secure, high-quality RESTful Web APIs

✅Write unit and integration tests to ensure code quality

✅Optimize performance and implement caching strategies


💫 Must-Have Skills

✅ 4+ years of experience with .NET Core / .NET 5+ & C#

✅Strong hands-on experience with ASP.NET Core Web API & EF Core

✅REST API development & middleware implementation

✅Solid understanding of SOLID principles & design patterns

✅Unit testing experience (xUnit, NUnit, MSTest, Moq)


Read more
TrumetricAI
Yashika Tiwari
Posted by Yashika Tiwari
Remote only
5 - 8 yrs
₹15L - ₹20L / yr
skill iconJava
SQL
skill iconSpring Boot

Java Tech Lead (5–6 Years Experience)

About the Role

We are seeking a highly skilled Java Tech Lead with 5–6 years of hands-on experience in backend engineering, architecture design, and leading development teams. 

The ideal candidate will combine strong technical expertise in Java frameworks with a deep understanding of system design, scalability, and performance optimization.

This role involves technical leadership, code reviews, and architectural decision-making for complex enterprise systems — with occasional exposure to analytics-driven and Python-based components.

Key Responsibilities

  • Architect, design, and develop scalable backend systems using Java (Quarkus, Spring Boot, Spring, Java EE).
  • Own the architecture — ensure modular, extensible, and high-performance service design.
  • Lead and mentor a team of developers; conduct code reviews, enforce best practices, and ensure high code quality.
  • Collaborate with cross-functional teams (frontend, DevOps, product, data) to deliver integrated, end-to-end solutions.
  • Design and optimize database schemas (MySQL, PostgreSQL) and ensure efficient query performance.
  • Implement and maintain microservices and distributed systems with strong fault tolerance and observability.
  • Drive the adoption of modern development workflows — Git branching strategy, CI/CD, and code quality automation.
  • Analyze system performance bottlenecks, implement monitoring, and ensure smooth production deployments.
  • Contribute to architecture reviews, technical documentation, and design discussions.
  • Occasionally contribute to Python-based analytics modules or automation scripts.
  • Work with AWS cloud services (EC2, S3, RDS, Lambda) for deployment, scaling, and infrastructure automation.

Required Skills & Qualifications

  • 5–6 years of professional experience in backend application development using Java.
  • Strong proficiency in Java frameworks: Quarkus, Spring Boot, Spring, Java EE.
  • Proven experience in architecture design, system decomposition, and microservices design principles.
  • Solid understanding of object-oriented design (OOD), design patterns, and SOLID principles.
  • Strong experience with relational databases (MySQL, PostgreSQL) and query optimization.
  • Good understanding of event-driven systems, RESTful APIs, and asynchronous processing.
  • Proficiency in Git for version control and team collaboration.
  • Strong analytical and debugging skills; ability to diagnose complex production issues.

Good to Have

  • Hands-on experience with Python for data processing or analytics integrations.
  • Familiarity with AWS cloud architecture and cost optimization practices.
  • Experience with CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI).
  • Knowledge of Docker/Kubernetes for containerized deployments.
  • Exposure to NoSQL databases (MongoDB, DynamoDB, Cassandra).
  • Experience with message queues (Kafka, RabbitMQ, or AWS SQS).
  • Understanding of system scalability, caching (Redis/Memcached), and observability stacks (Prometheus, Grafana, ELK).
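The caching mentioned above (Redis/Memcached) is, at its core, a key-value store with expiry; a minimal in-process TTL cache, sketched in Python for brevity (a JVM service would use Caffeine or a Redis client):

```python
import time

class TTLCache:
    """Minimal time-based cache: entries expire `ttl` seconds after being set."""

    def __init__(self, ttl: float, clock=time.monotonic):
        self.ttl, self.clock, self.store = ttl, clock, {}

    def set(self, key, value):
        self.store[key] = (value, self.clock() + self.ttl)

    def get(self, key, default=None):
        value, expires = self.store.get(key, (default, None))
        if expires is not None and self.clock() >= expires:
            del self.store[key]  # lazily evict on read
            return default
        return value
```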

Soft Skills

  • Strong leadership, mentoring, and communication skills.
  • Proven ability to drive technical decisions and balance short-term delivery with long-term architectural health.
  • Collaborative mindset — works closely with product, design, and operations teams.
  • Passion for clean architecture, high performance, and continuous improvement.
  • Self-driven with a strong sense of ownership and accountability.


Read more
-
Remote only
8 - 13 yrs
₹10L - ₹33L / yr
python
PySpark
Big Data
SQL

Role: Lead Data Engineer Core

Responsibilities: Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.

Collaborate with architects and stakeholders to translate business requirements into technical data solutions.

Ensure scalability, reliability, and performance of data systems across environments. Provide mentorship and technical leadership to data engineering teams. Define and enforce best practices for data modeling, transformation, and governance.


Optimize data ingestion and transformation frameworks for efficiency and cost management. Contribute to data architecture design and review sessions across projects.


Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.

8+ years of experience in data engineering with proven leadership in designing cloud native data systems.


Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP). Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures. Proficient in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake, and NoSQL databases).


Ability to recommend and implement scalable data pipelines.

Preferred Qualifications: Cloud certification (AWS, Azure, or GCP). Experience with Databricks, Snowflake, or Terraform. Familiarity with data governance, lineage, and observability tools. Strong collaboration skills and ability to influence data-driven decisions across teams.

Read more
Ekloud INC
ashwini rathod
Posted by ashwini rathod
Remote only
8 - 15 yrs
₹7L - ₹30L / yr
java
Fullstack Developer
skill iconAngular (2+)
skill iconSpring Boot
SQL
+2 more

Java Angular Fullstack Developer

 

Job Description:


Technical Lead – Full Stack

Experience: 8–12 years (strong candidates: Java 50% / Angular 50%)

Location: Remote

PF number is mandatory



Tech Stack: Java, Spring Boot, Microservices, Angular, SQL

Focus: Hands-on coding, solution design, team leadership, delivery ownership

 

Must-Have Skills (Depth)



Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.

Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.

Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.

React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).

SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.

Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.

DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.

Remote only
5 - 10 yrs
₹25L - ₹55L / yr
Data engineering
Databases
Python
SQL
PostgreSQL
+4 more

Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data


OVERVIEW

We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.


The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.


CORE TECHNICAL REQUIREMENTS

Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python versus push computation to the database.
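As a hedged illustration of the "Python versus the database" judgment call mentioned above, the sketch below runs the same aggregation both ways using only the standard library; the table and data are made up for the example.

```python
import sqlite3

# Hypothetical example: the same aggregation done in client-side Python and
# pushed down to the database engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("BTC", 2), ("ETH", 5), ("BTC", 3)])

# Option A: pull every row into Python and aggregate with a dict
# (fine for small data, wasteful at scale).
totals = {}
for symbol, qty in conn.execute("SELECT symbol, qty FROM trades"):
    totals[symbol] = totals.get(symbol, 0) + qty

# Option B: push the aggregation to the database, which avoids shipping
# every row to the client and can exploit indexes.
db_totals = dict(conn.execute(
    "SELECT symbol, SUM(qty) FROM trades GROUP BY symbol"))

assert totals == db_totals == {"BTC": 5, "ETH": 5}
```

Both options return the same numbers; the difference only matters once row counts grow, which is exactly the judgment the role calls for.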


SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.


Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
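The idempotency requirement above can be sketched minimally: a load step keyed by a natural primary key, so a retry or backfill that replays the same batch overwrites rather than duplicates. The schema and payload here are hypothetical.

```python
import sqlite3

# Minimal sketch of an idempotent ingestion step.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE prices (
    symbol TEXT, ts TEXT, price REAL,
    PRIMARY KEY (symbol, ts))""")

def load_batch(conn, rows):
    # Upsert keyed on (symbol, ts): replaying a batch is a no-op in effect,
    # which is what makes retries and backfills safe.
    conn.executemany(
        "INSERT OR REPLACE INTO prices (symbol, ts, price) VALUES (?, ?, ?)",
        rows)

batch = [("BTC", "2024-01-01T00:00", 42000.0),
         ("BTC", "2024-01-01T00:01", 42010.0)]
load_batch(conn, batch)
load_batch(conn, batch)  # replay: same end state, no duplicates

count = conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]
assert count == 2
```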


Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
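The kinds of checks described above can be sketched as plain functions; the field names, thresholds, and record shape below are assumptions for illustration, not a real spec.

```python
from datetime import datetime, timedelta, timezone

def validate_tick(tick, now, max_age=timedelta(minutes=5)):
    """Return a list of validation errors for one record (empty = clean)."""
    errors = []
    # Schema check: required fields must be present.
    for field in ("symbol", "price", "ts"):
        if field not in tick:
            errors.append(f"missing field: {field}")
            return errors
    # Range check: a non-positive price is data that "should not exist".
    if tick["price"] <= 0:
        errors.append(f"price out of range: {tick['price']}")
    # Freshness check: stale data must not propagate downstream.
    if now - tick["ts"] > max_age:
        errors.append("stale tick")
    return errors

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
ok = {"symbol": "BTC", "price": 42000.0, "ts": now - timedelta(seconds=30)}
bad = {"symbol": "BTC", "price": -1.0, "ts": now - timedelta(hours=2)}
assert validate_tick(ok, now) == []
assert len(validate_tick(bad, now)) == 2
```

In a real pipeline these checks would gate each stage, so a bad record is quarantined before anything downstream consumes it.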


WHAT YOU WILL BUILD

Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.


Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.


Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.


Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.

Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.


DOMAIN EXPERIENCE

Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.
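One way to picture the look-ahead problem is with bitemporal records: each value carries both an event time and the time it became known, and a query "as of" time T may only see what had arrived by T. The toy data below is invented for illustration.

```python
records = [
    {"ts": "2024-01-01", "value": 10, "arrived": "2024-01-01"},
    # A late restatement: the Jan 1 value was revised on Jan 3.
    {"ts": "2024-01-01", "value": 12, "arrived": "2024-01-03"},
]

def as_of(records, ts, knowledge_time):
    """Latest value for `ts` using only what was known at `knowledge_time`."""
    visible = [r for r in records
               if r["ts"] == ts and r["arrived"] <= knowledge_time]
    return max(visible, key=lambda r: r["arrived"])["value"] if visible else None

# On Jan 2 a model could only have seen the original value...
assert as_of(records, "2024-01-01", "2024-01-02") == 10
# ...using 12 there would be look-ahead bias; it is only visible from Jan 3.
assert as_of(records, "2024-01-01", "2024-01-04") == 12
```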


Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
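A temporal (as-of) join of the kind mentioned above can be sketched without any libraries: for each trade, find the most recent quote at or before the trade time. Quote and trade values here are illustrative.

```python
import bisect

quotes = [(1, 100.0), (5, 101.0), (9, 99.5)]        # (ts, price), sorted by ts
quote_ts = [ts for ts, _ in quotes]

def asof_quote(trade_ts):
    # Last quote with ts <= trade_ts; never look forward in time.
    i = bisect.bisect_right(quote_ts, trade_ts) - 1
    return quotes[i][1] if i >= 0 else None

assert asof_quote(6) == 101.0   # quote at ts=5 is the latest known at ts=6
assert asof_quote(0) is None    # no quote yet -> no value
```

Library implementations (e.g. a pandas `merge_asof`-style join) follow the same backward-looking rule, just vectorized.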


High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.


ENGINEERING STANDARDS

Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.


Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.
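The versioning idea above can be sketched as fingerprinting each run by its inputs plus code and schema versions, so identical inputs map to the same output identity. The specific fields hashed here are an assumption for the example.

```python
import hashlib
import json

def run_fingerprint(input_rows, code_version, schema_version):
    """Deterministic digest identifying one pipeline run."""
    payload = json.dumps(
        {"inputs": input_rows, "code": code_version, "schema": schema_version},
        sort_keys=True)  # sorted keys keep the serialization stable
    return hashlib.sha256(payload.encode()).hexdigest()

a = run_fingerprint([[1, 2], [3, 4]], "v1.4.2", "s3")
b = run_fingerprint([[1, 2], [3, 4]], "v1.4.2", "s3")
c = run_fingerprint([[1, 2], [3, 4]], "v1.5.0", "s3")
assert a == b   # same inputs + same code -> same identity
assert a != c   # code change -> new identity; old state stays reconstructible
```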


Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.


Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.


TECHNICAL ENVIRONMENT

PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.


WHAT WE ARE LOOKING FOR

Attention to Detail: You notice when something is slightly off and investigate rather than ignore.


Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.


Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.


Long-Term Orientation: You build systems you will maintain for years.


Communication: You document clearly, explain data issues to non-engineers, and surface problems early.


EDUCATION

University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.


TO APPLY

Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.

Analytical Brains Education
Remote only
1 - 5 yrs
₹8L - ₹12L / yr
Python
Shell Scripting
PowerShell
SQL
Java

Job Description

We are looking for motivated IT professionals with at least one year of industry experience. The ideal candidate should have hands-on experience in AWS, Azure, AI, or Cloud technologies, or should be enthusiastic and ready to upskill and shift to new and emerging technologies. This role is primarily remote; however, candidates may be required to visit the office occasionally for meetings or project needs.

Key Requirements

  • Minimum 1 year of experience in the IT industry
  • Exposure to AWS / Azure / AI / Cloud platforms (any one or more)
  • Willingness to learn and adapt to new technologies
  • Strong problem-solving and communication skills
  • Ability to work independently in a remote setup
  • Must have a proper work-from-home environment (laptop, stable internet, quiet workspace)

Education Qualification

  • B.Tech / BE / MCA / M.Sc (IT) / equivalent


CFRA

Posted by Bisman Gill
Remote only
4yrs+
Up to ₹23L / yr (varies)
Amazon Web Services (AWS)
SQL
Python
NodeJS (Node.js)
Java
+1 more

The Senior Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates who value collaboration with colleagues and having an immediate, tangible impact at a leading global independent financial insights and data company.


Key Responsibilities

  • Analyst Workflows: Design and develop CFRA’s integrated content publishing platform using a proprietary third-party editorial and publishing system for digital publishing.
  • Designing and Developing APIs: Design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
  • AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
  • Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
  • Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
  • Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
  • Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
  • Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
  • Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
  • Problem Solving: Lead troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
  • Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.

Desired Skills and Experience

  • Development: 5+ years of extensive experience designing, developing, and deploying solutions using modern technologies, with a focus on scalability, performance, and security.
  • AWS Services: Proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
  • Programming Languages: Proficiency in programming languages commonly used for development, such as Python, Node.js, or others, as well as experience with serverless frameworks on AWS.
  • Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
  • Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
  • DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
  • Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure stability and performance.
  • Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
  • Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
  • Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.


CSI Interfusion
Posted by Sujitha Kotipalli
Remote, Hyderabad
5 - 10 yrs
₹35L - ₹45L / yr
React.js
C#
.NET
SQL
Microsoft Windows Azure

1. Job Responsibilities:

Backend Development (.NET)

  • Design and implement ASP.NET Core WebAPIs
  • Design and implement background jobs using Azure Function Apps
  • Optimize performance for long-running operations, ensuring high concurrency and system stability.
  • Develop efficient and scalable task scheduling solutions to execute periodic tasks

Frontend Development (React)

  • Build high-performance, maintainable React applications and optimize component rendering.
  • Continuously improve front-end performance using best practices.

Deployment & Operations

  • Deploy React applications on Azure platforms (Azure Web Apps), ensuring smooth and reliable delivery.
  • Collaborate with DevOps teams to enhance CI/CD pipelines and improve deployment efficiency.

2. Job Requirements:

Tech Stack:

  • Backend: ASP.NET Core Web API, C#
  • Frontend: React, JavaScript/TypeScript, Redux or other state management libraries
  • Azure: Function Apps, Web Apps, Logic Apps
  • Database: Cosmos DB, SQL Server

  • Strong knowledge of asynchronous programming, performance optimization, and task scheduling
  • Proficiency in React performance optimization techniques, understanding of the virtual DOM and component lifecycle
  • Experience with cloud deployment, preferably Azure App Service or Azure Static Web Apps
  • Familiarity with Git and CI/CD workflows, with strong coding standards

3. Project Background:

Mission: Transform Microsoft Cloud customers into fans by delivering exceptional support and engagement.​

  • Scope:
  • Customer reliability engineering
  • Advanced cloud engineering and supportability
  • Business management and operations
  • Product and platform orchestration​
  • Activities:
  • Technical skilling programs
  • AI strategy for customer experience
  • Handling escalations and service reliability issues​

4. Project Highlights:

React.js, ASP.NET Core Web API, Azure Function Apps, Cosmos DB

 

Appler
Posted by Appler Solutions
Remote only
4 - 6 yrs
₹7L - ₹10L / yr
JavaScript
React Native
React.js
NodeJS (Node.js)
SQL

Job Title: Sr. Frontend Developer (JavaScript)

Location: Remote Only

Experience Required: 4–6 years

Salary Range: 7L – 10L per year

About the Role:

We are looking for an experienced Sr. Frontend Developer with strong expertise in JavaScript to join our remote team. The ideal candidate will have 4–6 years of hands-on experience in frontend development, with a focus on building responsive, high-performance web applications. You will work closely with cross-functional teams to design, develop, and implement user-facing features that align with business goals and enhance user experience.

Key Responsibilities:

  • Develop and maintain scalable, reusable frontend components and applications using modern Javascript frameworks and libraries.
  • Collaborate with UI/UX designers, product managers, and backend developers to deliver seamless user experiences.
  • Optimize applications for maximum speed, scalability, and accessibility.
  • Write clean, modular, and well-documented code following best practices.
  • Participate in code reviews, sprint planning, and agile development processes.
  • Troubleshoot, debug, and resolve frontend-related issues.
  • Stay updated with emerging frontend technologies and industry trends.

Must-Have Skills:

  • JavaScript (ES6+)
  • React.js
  • React Native
  • NodeJS (Node.js)
  • SQL

Nice-to-Have Skills:

  • Experience with state management libraries (Redux, Context API, etc.)
  • Familiarity with testing frameworks (Jest, Cypress, React Testing Library)
  • Knowledge of frontend build tools (Webpack, Babel, NPM/Yarn)
  • Understanding of RESTful APIs and GraphQL
  • Experience with version control systems (Git)
  • Familiarity with CI/CD pipelines and deployment processes

Qualifications:

  • 4–6 years of professional frontend development experience.
  • Proven track record of delivering high-quality, production-ready applications.
  • Strong understanding of responsive design, cross-browser compatibility, and web performance optimization.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently in a remote environment and communicate effectively with distributed teams.

What We Offer:

  • Competitive salary within the range of 7L – 10L per year.
  • Fully remote work flexibility.
  • Opportunity to work on innovative projects with a talented and supportive team.
  • Professional growth and skill development opportunities.


Vy Systems

Posted by Kalki K
Remote only
4 - 12 yrs
₹18L - ₹28L / yr
Databricks
Amazon Web Services (AWS)
SQL
Python
PySpark

Job Summary


We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.


Key Responsibilities

- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.

- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).

- Implement and manage Delta Lake (ACID, schema evolution, time travel).

- Write efficient, complex SQL for transformation and analytics.

- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).

- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.

- Collaborate with cross-functional teams to deliver reliable data solutions.

- Ensure data governance, security, and compliance.

- Troubleshoot pipelines and support CI/CD deployments.


Required Skills & Experience

- 4–8 years in Data Engineering / Big Data development.

- Strong hands-on experience with Databricks (clusters, jobs, workflows).

- Advanced PySpark and strong Python skills.

- Expert-level SQL (complex queries, window functions).

- Practical experience with AWS (preferred) or Azure cloud services.

- Experience with Delta Lake, Parquet, and data lake architectures.

- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).

- Good understanding of data modeling, optimization, and distributed systems.
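As a small, hedged illustration of the window-function SQL the skills list calls for, the sketch below computes a running total per symbol. It uses SQLite (window functions require SQLite >= 3.25, bundled with most modern Python builds); the table and data are made up, and on Databricks the same query shape would run through Spark SQL instead.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, day INTEGER, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("BTC", 1, 2), ("BTC", 2, 3), ("ETH", 1, 5)])

# Running total per symbol: SUM over a partition ordered by day.
rows = conn.execute("""
    SELECT symbol, day,
           SUM(qty) OVER (PARTITION BY symbol ORDER BY day) AS running_qty
    FROM trades ORDER BY symbol, day
""").fetchall()

assert rows == [("BTC", 1, 2), ("BTC", 2, 5), ("ETH", 1, 5)]
```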

Remote only
5 - 15 yrs
₹10L - ₹15L / yr
FastAPI
Python
RESTful APIs
SQL
NoSQL Databases
+5 more


Summary:

We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.

Job Type:

Full-Time Contractor (12 months)

Location:

Remote / On-site (Jaipur preferred, as per project needs)

Experience:

5+ years in backend development

Key Responsibilities:

  • Design, develop, and maintain robust backend services using Python and FastAPI.
  • Implement and manage Prisma ORM for database operations.
  • Build scalable APIs and integrate with SQL databases and third-party services.
  • Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
  • Collaborate with front-end developers and other team members to deliver high-quality web applications.
  • Ensure application performance, security, and reliability.
  • Participate in code reviews, testing, and deployment processes.

Required Skills:

  • Expertise in Python backend development with strong experience in FastAPI.
  • Solid understanding of RESTful API design and implementation.
  • Proficiency in SQL databases and ORM tools (preferably Prisma)
  • Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
  • Familiarity with CI/CD pipelines and containerization (Docker).
  • Knowledge of cloud architecture best practices.

Added Advantage:

  • Front-end development knowledge (React, Angular, or similar frameworks).
  • Exposure to AWS/GCP cloud platforms.
  • Experience with NoSQL databases.

Eligibility:

  • Minimum 5 years of professional experience in backend development.
  • Available for full-time engagement.
  • Please refrain from applying if you are currently engaged in other projects; we require dedicated availability.

 

Deqode

Posted by purvisha Bhavsar
Remote only
5 - 7 yrs
₹10L - ₹25L / yr
Windows Azure
Data engineering
SQL
CI/CD
Databricks

Role: Senior Data Engineer (Azure)

Experience: 5+ Years

Location: Anywhere in India

Work Mode: Remote

Notice Period: Immediate joiners or candidates serving notice period

Key Responsibilities:

  • Data processing on Azure using ADF, Streaming Analytics, Event Hubs, Azure Databricks, Data Migration Services, and Data Pipelines
  • Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.)
  • Designing and implementing scalable data models and migration strategies
  • Working on distributed big data batch or streaming pipelines (Kafka or similar)
  • Developing data integration & transformation solutions for structured and unstructured data
  • Collaborating with cross-functional teams for performance tuning and optimization
  • Monitoring data workflows and ensuring compliance with governance and quality standards
  • Driving continuous improvement through automation and DevOps practices

Mandatory Skills & Experience:

  • 5–10 years of experience as a Data Engineer
  • Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory
  • Experience in Data Modelling, Data Migration, and Data Warehousing
  • Good understanding of database structure principles and schema design
  • Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms
  • Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) — good to have
  • Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub)
  • Familiarity with visualization tools like Power BI or Tableau
  • Strong analytical, problem-solving, and debugging skills
  • Self-motivated, detail-oriented, and capable of managing priorities effectively


Appiness Interactive
Remote only
6 - 10 yrs
₹10L - ₹14L / yr
Python
Django
FastAPI
Flask
pandas
+9 more

Position Overview: The Lead Software Architect - Python & Data Engineering is a senior technical leadership role responsible for designing and owning end-to-end architecture for data-intensive, AI/ML, and analytics platforms, while mentoring developers and ensuring technical excellence across the organization. 


Key Responsibilities: 

  • Design end-to-end software architecture for data-intensive applications, AI/ML pipelines, and analytics platforms
  • Evaluate trade-offs between competing technical approaches 
  • Define data models, API approach, and integration patterns across systems 
  • Create technical specifications and architecture documentation 
  • Lead by example through production-grade Python code and mentor developers on engineering fundamentals 
  • Conduct design and code reviews focused on architectural soundness 
  • Establish engineering standards, coding practices, and design patterns for the team 
  • Translate business requirements into technical architecture 
  • Collaborate with data scientists, analysts, and other teams to design integrated solutions 
  • Whiteboard and defend system design and architectural choices 
  • Take responsibility for system performance, reliability, and maintainability 
  • Identify and resolve architectural bottlenecks proactively 


Required Skills:  

  • 8+ years of experience in software architecture and development  
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field 
  • Strong foundations in data structures, algorithms, and computational complexity 
  • Experience in system design for scale, including caching strategies, load balancing, and asynchronous processing  
  • 6+ years of Python development experience 
  • Deep knowledge of Django, Flask, or FastAPI 
  • Expert understanding of Python internals including GIL and memory management 
  • Experience with RESTful API design and event-driven architectures (Kafka, RabbitMQ) 
  • Proficiency in data processing frameworks such as Pandas, Apache Spark, and Airflow 
  • Strong SQL optimization and database design experience (PostgreSQL, MySQL, MongoDB)
  • Experience with AWS, GCP, or Azure cloud platforms
  • Knowledge of containerization (Docker) and orchestration (Kubernetes) 
  • Hands-on experience designing CI/CD pipelines


Preferred (Bonus) Skills

  • Experience deploying ML models to production (MLOps, model serving, monitoring)
  • Understanding of ML system design, including feature stores and model versioning
  • Familiarity with ML frameworks such as scikit-learn, TensorFlow, and PyTorch  
  • Open-source contributions or technical blogging demonstrating architectural depth 
  • Experience with modern front-end frameworks for full-stack perspective


Ekloud INC
Posted by Kratika Agarwal
Remote only
8 - 14 yrs
₹6L - ₹14L / yr
m365
MS SharePoint
sharepoint online
ms team
exchange online
+5 more

Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures, problem-solving abilities, SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.

Ekloud INC
Posted by Seema KK
Remote only
8 - 12 yrs
₹23L - ₹25L / yr
Java
Spring Boot
Microservices
React.js
SQL
+9 more

Job Description:

Technical Lead – Full Stack

Experience: 8–12 years (strong candidates: Java 50% / React 50%)

Location – Bangalore/Hyderabad

Interview Levels – 3 Rounds

Tech Stack: Java, Spring Boot, Microservices, React, SQL

Focus: Hands-on coding, solution design, team leadership, delivery ownership

 

Must-Have Skills (Depth)

Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.

Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.

Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.

React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).

SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.

Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.

DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.

Neuvamacro Technology Pvt Ltd
Remote only
5 - 10 yrs
₹13L - ₹18L / yr
PowerBI
Office 365
Microsoft Dynamics
Amazon Web Services (AWS)
JavaScript
+10 more

We are seeking a highly skilled Power Platform Developer with deep expertise in designing, developing, and deploying solutions using Microsoft Power Platform. The ideal candidate will have strong knowledge of Power Apps, Power Automate, Power BI, Power Pages, and Dataverse, along with integration capabilities across Microsoft 365, Azure, and third-party systems.


Key Responsibilities

  • Solution Development:
  • Design and build custom applications using Power Apps (Canvas & Model-Driven).
  • Develop automated workflows using Power Automate for business process optimization.
  • Create interactive dashboards and reports using Power BI for data visualization and analytics.
  • Configure and manage Dataverse for secure data storage and modelling.
  • Develop and maintain Power Pages for external-facing portals.
  • Integration & Customization:
  • Integrate Power Platform solutions with Microsoft 365, Dynamics 365, Azure services, and external APIs.
  • Implement custom connectors and leverage Power Platform SDK for advanced scenarios.
  • Utilize Azure Functions, Logic Apps, and REST APIs for extended functionality.
  • Governance & Security:
  • Apply best practices for environment management, ALM (Application Lifecycle Management), and solution deployment.
  • Ensure compliance with security, data governance, and licensing guidelines.
  • Implement role-based access control and manage user permissions.
  • Performance & Optimization:
  • Monitor and optimize app performance, workflow efficiency, and data refresh strategies.
  • Troubleshoot and resolve technical issues promptly.
  • Collaboration & Documentation:
  • Work closely with business stakeholders to gather requirements and translate them into technical solutions.
  • Document architecture, workflows, and processes for maintainability.


Required Skills & Qualifications

  • Technical Expertise:
  • Strong proficiency in Power Apps (Canvas & Model-Driven), Power Automate, Power BI, Power Pages, and Dataverse.
  • Experience with Microsoft 365, Dynamics 365, and Azure services.
  • Knowledge of JavaScript, TypeScript, C#, .NET, and Power Fx for custom development.
  • Familiarity with SQL, DAX, and data modeling.
  • Additional Skills:
  • Understanding of ALM practices, solution packaging, and deployment pipelines.
  • Experience with Git, Azure DevOps, or similar tools for version control and CI/CD.
  • Strong problem-solving and analytical skills.
  • Certifications (Preferred):
  • Microsoft Certified: Power Platform Developer Associate.
  • Microsoft Certified: Power Platform Solution Architect Expert.


Soft Skills

  • Excellent communication and collaboration skills.
  • Ability to work in agile environments and manage multiple priorities.
  • Strong documentation and presentation abilities.

 

Upland Software

Posted by Bisman Gill
Remote only
7yrs+
Up to ₹33L / yr (varies)
.NET
SQL
Object Oriented Programming (OOPs)
Windows Azure
ASP.NET
+1 more

We are looking for an enthusiastic and dynamic individual to join Upland India as a Senior Software Engineer I (Backend) for our Panviva product. The individual will work with our global development team.


What would you do?

  • Develop, review, test, and maintain application code
  • Collaborate with other developers and product teams to fulfil objectives
  • Troubleshoot and diagnose issues
  • Take the lead on tasks as needed
  • Jump in and help the team deliver features when required

What are we looking for?

Experience

  • 5+ years of experience designing and implementing application architecture
  • Back-end developer who enjoys solving problems
  • Demonstrated experience with the .NET ecosystem (.NET Framework, ASP.NET, .NET Core) & SQL server
  • Experience in building cloud-native applications (Azure)
  • Must be skilled at writing quality, scalable, maintainable, testable code

Leadership Skills

  • Strong communication skills
  • Ability to mentor/lead junior developers


Primary Skills: The candidate must possess the following primary skills:

  • Strong Back-end developer who enjoys solving problems
  • Solid experience with .NET Core, SQL Server, and .NET design patterns: strong understanding of OOP principles, .NET-specific implementations (DI, CQRS, Repository, etc.), SOLID architectural principles, unit-testing tools, and debugging techniques
  • Applying patterns to improve scalability and reduce technical debt
  • Experience with refactoring legacy codebases using design patterns
  • Real-World Problem Solving
  • Ability to analyze a problem and choose the most suitable design pattern
  • Experience balancing performance, readability, and maintainability
  • Experience building modern, scalable, reliable applications on the MS Azure cloud including services such as:
  • App Services
  • Azure Service Bus/ Event Hubs
  • Azure API Management Service Azure Bot Service
  • Function/Logic Apps
  • Azure key vault & Azure Configuration Service
  • CosmosDB, Mongo DB
  • Azure Search
  • Azure Cognitive Services
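
The Repository and dependency-injection patterns named above are language-agnostic; as a rough sketch (shown in Python rather than C# for brevity, with hypothetical `User`/`UserRepository` names), they might look like:

```python
from abc import ABC, abstractmethod


class User:
    def __init__(self, user_id, name):
        self.user_id = user_id
        self.name = name


class UserRepository(ABC):
    """Abstract repository: callers depend on this interface, not on storage."""

    @abstractmethod
    def get(self, user_id): ...

    @abstractmethod
    def add(self, user): ...


class InMemoryUserRepository(UserRepository):
    """One concrete implementation; a SQL-backed one could be swapped in."""

    def __init__(self):
        self._store = {}

    def get(self, user_id):
        return self._store.get(user_id)

    def add(self, user):
        self._store[user.user_id] = user


class UserService:
    """Dependency injection: the repository is passed in, so tests can stub it."""

    def __init__(self, repo: UserRepository):
        self._repo = repo

    def register(self, user_id, name):
        user = User(user_id, name)
        self._repo.add(user)
        return user


repo = InMemoryUserRepository()
service = UserService(repo)
service.register(1, "Ada")
```

The point of the pattern is that `UserService` never knows which storage backend it talks to, which is what makes unit testing and refactoring of legacy code tractable.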

Understanding Agile Methodology and Tool Familiarity

  • Solid understanding of Agile development processes, including sprint planning, daily stand-ups, retrospectives, and backlog grooming
  • Familiarity with Agile tools such as JIRA for tracking tasks, managing workflows, and collaborating across teams
  • Experience working in cross-functional Agile teams and contributing to iterative development cycles

Secondary Skills: It would be advantageous if the candidate also has the following secondary skills:

  • Experience with front-end React/Jquery/Javascript, HTML and CSS Frameworks
  • APM tools - Worked on any tools such as Grafana, NR, Cloudwatch etc.,
  • Basic Understanding of AI models
  • Python

About Upland

Upland Software (Nasdaq: UPLD) helps global businesses accelerate digital transformation with a powerful cloud software library that provides choice, flexibility, and value. Upland India is a fully owned subsidiary of Upland Software and headquartered in Bangalore. We are a remote-first company. Interviews and on-boarding are conducted virtually.

Read more
venanalytics

at venanalytics

2 candid answers
Rincy jain
Posted by Rincy jain
Remote, Mumbai
3 - 4 yrs
₹7L - ₹10L / yr
skill iconPython
SQL
PowerBI
Client Servicing
Team Management
+6 more

About Ven Analytics


At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.


Role Overview


We’re looking for a Power BI Data Analyst who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.


Key Responsibilities


  • Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.


  • Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.


  • Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.


  • Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.


  • Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.


  • Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.


  • Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.


  • Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.


  • Power BI Development: Use Power BI Desktop for report building and Power BI Service for distribution.


  • Backend development: Develop optimized SQL queries that are easy to consume, maintain and debug.


  • Version Control: Maintain strict version control by tracking change requests and bug fixes, ensuring both production and development dashboards are maintained.


  • Client Servicing: Engage with clients to understand their data needs, gather requirements, present insights, and ensure timely, clear communication throughout project cycles.


  • Team Management: Lead and mentor a small team by assigning tasks, reviewing work quality, guiding technical problem-solving, and ensuring timely delivery of dashboards and reports.


Must-Have Skills


  • Strong experience building robust data models in Power BI
  • Hands-on expertise with DAX (complex measures and calculated columns)
  • Proficiency in M Language (Power Query) beyond drag-and-drop UI
  • Clear understanding of data visualization best practices (less fluff, more insight)
  • Solid grasp of SQL and Python for data processing
  • Strong analytical thinking and ability to craft compelling data stories
  • Client servicing background
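
As a minimal illustration of the "SQL and Python for data processing" skill above (the sales table and column names are invented for the example), pre-aggregating data in pandas before it reaches a Power BI model might look like:

```python
import pandas as pd

# Hypothetical sales extract; in practice this would come from a SQL source.
sales = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "region": ["North", "North", "South", "South"],
    "amount": [100.0, 150.0, 200.0, 50.0],
})

# Pre-aggregate in Python so the Power BI data model stays small and fast.
summary = (
    sales.groupby("region", as_index=False)
         .agg(orders=("order_id", "count"), revenue=("amount", "sum"))
)
```

Pushing this kind of aggregation upstream (in SQL or Python) rather than into DAX is a common way to keep dashboard refresh and render times within SLA.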


Good-to-Have (Bonus Points)


  • Experience using DAX Studio and Tabular Editor
  • Prior work in a high-volume data processing production environment
  • Exposure to modern CI/CD practices or version control with BI tools

 

Why Join Ven Analytics?


  • Be part of a fast-growing startup that puts data at the heart of every decision.
  • Opportunity to work on high-impact, real-world business challenges.
  • Collaborative, transparent, and learning-oriented work environment.
  • Flexible work culture and focus on career development.


Read more
Ekloud INC
Kratika Agarwal
Posted by Kratika Agarwal
Remote only
8 - 14 yrs
₹7L - ₹18L / yr
m365
m365 developer
ms teams
MS SharePoint
Microsoft Exchange
+12 more

Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving abilities, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.

A strong understanding of the development lifecycle is required, along with debugging skills, time management, business acumen, a positive attitude, and openness to continual growth.

The ability to code appropriate solutions will be tested during the interview.

Knowledge of a wide variety of Generative AI models

Conceptual understanding of how large language models work

Proficiency in coding languages for data manipulation (e.g., SQL) and machine learning & AI development (e.g., Python)

Experience with dashboarding tools such as Power BI and Tableau (beneficial but not essential)

Read more
Whiz IT Services
Sheeba Harish
Posted by Sheeba Harish
Remote only
10 - 15 yrs
₹20L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Microservices
API
Apache Kafka
+5 more

We are looking for highly experienced Senior Java Developers who can architect, design, and deliver high-performance enterprise applications using Spring Boot and microservices. The role requires a strong understanding of distributed systems, scalability, and data consistency.

Read more
Forbes Advisor

at Forbes Advisor

3 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
4yrs+
Up to ₹35L / yr (varies)
skill iconPython
SQL
Database performance tuning
Data-flow analysis
Data modeling

About Forbes Advisor

Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.

We do this by combining data-driven content, rigorous product comparisons, and user-first design — all built on top of a modern, scalable platform. Our global teams bring deep expertise across journalism, product, performance marketing, data, and analytics.

 

The Role

We’re hiring a Data Scientist to help us unlock growth through advanced analytics and machine learning. This role sits at the intersection of marketing performance, product optimization, and decision science.


You’ll partner closely with Paid Media, Product, and Engineering to build models, generate insight, and influence how we acquire, retain, and monetize users. From campaign ROI to user segmentation and funnel optimization, your work will directly shape how we grow. This role is ideal for someone who thrives on business impact, communicates clearly, and wants to build reusable, production-ready insights — not just run one-off analyses.

 

What You’ll Do

Marketing & Revenue Modelling

• Own end-to-end modelling of LTV, user segmentation, retention, and marketing efficiency to inform media optimization and value attribution.

• Collaborate with Paid Media and RevOps to optimize SEM performance, predict high-value cohorts, and power strategic bidding and targeting.
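
As a rough sketch of the cohort retention analysis this kind of modelling involves (the activity log and month values below are made up for illustration), a pandas retention matrix might be computed like this:

```python
import pandas as pd

# Hypothetical activity log: one row per user per active month.
events = pd.DataFrame({
    "user": ["a", "a", "b", "b", "c"],
    "signup_month": ["2024-01", "2024-01", "2024-01", "2024-01", "2024-02"],
    "active_month": ["2024-01", "2024-02", "2024-01", "2024-01", "2024-02"],
})

# Distinct users active in each (cohort, month) cell.
cohort = (
    events.groupby(["signup_month", "active_month"])["user"]
          .nunique()
          .unstack(fill_value=0)
)

# Cohort size = distinct users active in their own signup month.
sizes = (
    events[events.signup_month == events.active_month]
    .groupby("signup_month")["user"].nunique()
)

# Each row divided by its cohort size gives retention rates.
retention = cohort.div(sizes, axis=0)
```

Here the January cohort retains 50% of its users into February, which is exactly the kind of curve that feeds LTV models and media-attribution decisions.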

Product & Growth Analytics

• Work closely with Product Insights and General Managers (GMs) to define core metrics, KPIs, and success frameworks for new launches and features.

• Conduct deep-dive analysis of user behaviour, funnel performance, and product engagement to uncover actionable insights.

• Monitor and explain changes in key product metrics, identifying root causes and business impact.

• Work closely with Data Engineering to design and maintain scalable data pipelines that support machine learning workflows, model retraining, and real-time inference.

Predictive Modelling & Machine Learning

• Build predictive models for conversion, churn, revenue, and engagement using regression, classification, or time-series approaches.

• Identify opportunities for prescriptive analytics and automation in key product and marketing workflows.

• Support development of reusable ML pipelines for production-scale use cases in product recommendation, lead scoring, and SEM planning.

Collaboration & Communication

• Present insights and recommendations to a variety of stakeholders — from ICs to executives — in a clear and compelling manner.

• Translate business needs into data problems, and complex findings into strategic action plans.

• Work cross-functionally with Engineering, Product, BI, and Marketing to deliver and deploy your work.

 

What You’ll Bring

Minimum Qualifications

• Bachelor’s degree in a quantitative field (Mathematics, Statistics, CS, Engineering, etc.).

• 4+ years in data science, growth analytics, or decision science roles.

• Strong SQL and Python skills (Pandas, Scikit-learn, NumPy).

• Hands-on experience with Tableau, Looker, or similar BI tools.

• Familiarity with LTV modelling, retention curves, cohort analysis, and media attribution.

• Experience with GA4, Google Ads, Meta, or other performance marketing platforms.

• Clear communication skills and a track record of turning data into decisions.


Nice to Have

• Experience with BigQuery and Google Cloud Platform (or equivalent).

• Familiarity with affiliate or lead-gen business models.

• Exposure to NLP, LLMs, embeddings, or agent-based analytics.

• Ability to contribute to model deployment workflows (e.g., using Vertex AI, Airflow, or Composer).

 

Why Join Us?

• Remote-first and flexible — work from anywhere in India with global exposure.

• Monthly long weekends (every third Friday off).

• Generous wellness stipends and parental leave.

• A collaborative team where your voice is heard and your work drives real impact.

• Opportunity to help shape the future of data science at one of the world’s most trusted brands.

Read more
Tech AI startup in Bangalore

Tech AI startup in Bangalore

Agency job
via Recruit Square by Priyanka choudhary
Remote only
4 - 8 yrs
₹12L - ₹18L / yr
pandas
NumPy
MLOps
SQL
ETL
+1 more

Data Engineer – Validation & Quality


Responsibilities

  • Build rule-based and statistical validation frameworks using Pandas / NumPy.
  • Implement contradiction detection, reconciliation, and anomaly flagging.
  • Design and compute confidence metrics for each evidence record.
  • Automate schema compliance, sampling, and checksum verification across data sources.
  • Collaborate with the Kernel to embed validation results into every output artifact.
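
A minimal sketch of what a combined rule-based and statistical validation pass might look like (the records, thresholds, and column names here are hypothetical, not the company's actual framework):

```python
import numpy as np
import pandas as pd

# Hypothetical evidence records to validate.
records = pd.DataFrame({
    "record_id": [1, 2, 3, 4, 5],
    "amount": [10.0, 12.0, 11.0, 9.5, 95.0],   # 95.0 is an outlier
    "currency": ["USD", "USD", "usd", "USD", "USD"],
})

# Rule-based check: currency codes must look like upper-case ISO codes.
rule_violations = ~records["currency"].str.fullmatch(r"[A-Z]{3}")

# Statistical check: flag amounts more than 3 robust z-scores from the
# median (MAD-based, so the outlier itself doesn't inflate the threshold;
# a real framework would also guard against a zero MAD).
median = records["amount"].median()
mad = np.median(np.abs(records["amount"] - median))
robust_z = 0.6745 * (records["amount"] - median) / mad
anomalies = np.abs(robust_z) > 3

records["valid"] = ~(rule_violations | anomalies)
```

The resulting `valid` flag is the sort of per-record confidence signal that can be embedded into downstream output artifacts.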

Requirements

  • 5+ years in data engineering, data quality, or MLOps validation.
  • Strong SQL optimization and ETL background.
  • Familiarity with data lineage, DQ frameworks, and regulatory standards (SOC 2 / GDPR).
Read more
Intineri infosol Pvt Ltd

at Intineri infosol Pvt Ltd

2 candid answers
Shivani Pandey
Posted by Shivani Pandey
Remote only
4 - 6 yrs
₹5L - ₹12L / yr
skill iconJavascript
Glide Script
JSON
SQL
ServiceNow
+3 more

Role Overview

 

We are seeking a ServiceNow Product Owner with deep expertise in ServiceNow modules (CSM, ITSM, HRSD) and strong scripting and data-handling skills.

 

This role focuses on translating real enterprise workflows into structured, data-driven AI training tasks, helping improve reasoning and understanding within AI systems. It is not a platform configuration or app development role — instead, it blends functional ServiceNow knowledge, prompt engineering, and data design to build the next generation of intelligent enterprise models.

 

Key Responsibilities

 

·    Define decision frameworks and realistic scenarios for AI reinforcement learning based on ServiceNow workflows.

·    Design scenario-driven tasks mirroring ServiceNow processes like case handling, SLA tracking, and IT incident management.

·    Develop and validate structured data tasks in JSON, ensuring accuracy and clarity.

·    Write natural language instructions aligned with ServiceNow’s business logic and workflows.

·    Use SQL queries for validation and quality checks of task data.

·    Apply prompt engineering techniques to guide model reasoning.

·    Collaborate with peers to expand and document cross-domain scenarios (CSM, ITSM, HRSD).

·    Create and maintain documentation of scenario patterns and best practices.
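
As a hedged illustration of the JSON task design and SQL quality checks described above (the task fields and checks are invented for the example, using an in-memory SQLite database purely for portability):

```python
import json
import sqlite3

# Hypothetical AI-training task mirroring a ServiceNow incident workflow.
task = {
    "scenario": "P1 incident breaches response SLA",
    "instruction": "Escalate the incident and notify the assignment group.",
    "expected_fields": {"priority": "1", "state": "In Progress", "sla_breached": True},
}
payload = json.dumps(task)

# Structural validation: required keys must be present and non-empty.
parsed = json.loads(payload)
missing = [k for k in ("scenario", "instruction", "expected_fields")
           if not parsed.get(k)]

# SQL quality check over a batch of stored tasks.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("INSERT INTO tasks (payload) VALUES (?)", (payload,))
empty_payloads = conn.execute(
    "SELECT COUNT(*) FROM tasks WHERE payload IS NULL OR payload = ''"
).fetchone()[0]
```

The same pattern scales to batch validation: every scenario is serialized, checked for required structure, and then audited with SQL before it enters a training set.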

 

Required Experience

 

·    4–6 years of experience with ServiceNow (CSM, ITSM, HRSD).

·    Deep understanding of cases, incidents, requests, SLAs, and knowledge management processes.

·    Proven ability to design realistic enterprise scenarios mapping to ServiceNow operations.

·    Exposure to AI model training workflows or structured data design is a plus.

 

Preferred Qualifications

 

·    ServiceNow Certified System Administrator (CSA)

·    ServiceNow Certified Implementation Specialist (CIS-ITSM / CSM / HRSD)

·    Exposure to AI/ML workflows or model training data preparation.

·    Excellent written and verbal communication skills, with client-facing experience.


Mandatory Skills: Scripting (JavaScript, Glide Script), JSON handling, SQL, ServiceNow modules (ITSM, CSM, HRSD), and prompt engineering.

Read more
Remote only
2 - 4 yrs
₹4L - ₹8L / yr
skill iconPython
JSON
LLMS
oops
skill iconJava
+4 more

Role Overview

We are seeking a Junior Developer with 1-3 years' experience and strong foundations in Python, databases, and AI technologies. The ideal candidate will support the development of AI-powered solutions, focusing on LLM integration, prompt engineering, and database-driven workflows. This is a hands-on role with opportunities to learn and grow into advanced AI engineering responsibilities.

Key Responsibilities

  • Develop, test, and maintain Python-based applications and APIs.
  • Design and optimize prompts for Large Language Models (LLMs) to improve accuracy and performance.
  • Work with JSON-based data structures for request/response handling.
  • Integrate and manage PostgreSQL (pgSQL) databases, including writing queries and handling data pipelines.
  • Collaborate with the product and AI teams to implement new features.
  • Debug, troubleshoot, and optimize performance of applications and workflows.
  • Stay updated on advancements in LLMs, AI frameworks, and generative AI tools.
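
A minimal example of the JSON request/response handling this role involves (the response format and field names are hypothetical; a real LLM's output shape depends on how it is prompted):

```python
import json

# Hypothetical LLM response: the model is asked to answer in JSON, and we
# parse and validate the structure before it reaches the database layer.
raw_response = '{"answer": "Permit approved", "confidence": 0.92}'


def parse_llm_response(raw: str) -> dict:
    """Parse a JSON-formatted model response, failing loudly on bad structure."""
    data = json.loads(raw)
    if "answer" not in data or not isinstance(data.get("confidence"), float):
        raise ValueError("response missing required fields")
    return data


result = parse_llm_response(raw_response)
```

Validating model output at the boundary like this keeps malformed or hallucinated structures out of the PostgreSQL-backed workflows downstream.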

Required Skills & Qualifications

  • Strong knowledge of Python (scripting, APIs, data handling).
  • Basic understanding of Large Language Models (LLMs) and prompt engineering techniques.
  • Experience with JSON data parsing and transformations.
  • Familiarity with PostgreSQL or other relational databases.
  • Ability to write clean, maintainable, and well-documented code.
  • Strong problem-solving skills and eagerness to learn.
  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).

Nice-to-Have (Preferred)

  • Exposure to AI/ML frameworks (e.g., LangChain, Hugging Face, OpenAI APIs).
  • Experience working in startups or fast-paced environments.
  • Familiarity with version control (Git/GitHub) and cloud platforms (AWS, GCP, or Azure).

What We Offer

  • Opportunity to work on cutting-edge AI applications in permitting & compliance.
  • Collaborative, growth-focused, and innovation-driven work culture.
  • Mentorship and learning opportunities in AI/LLM development.
  • Competitive compensation with performance-based growth.


Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Remote only
8 - 12 yrs
₹20L - ₹26L / yr
skill icon.NET
ASP.NET
skill iconC#
skill iconAngular (2+)
skill iconJavascript
+7 more

We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and design patterns. Experience with DocuSign is a plus.


Responsibilities:

  • Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement and maintain cloud-based solutions using Azure.
  • Develop and optimize SQL Server databases.
  • Follow Agile methodologies to manage project tasks and deliverables.
  • Apply design patterns and best practices to ensure high-quality, maintainable code.
  • Troubleshoot and resolve software defects and issues.
  • Mentor and guide junior developers.

Requirements:

  • Bachelor's degree in computer science, Engineering, or a related field.
  • Proven experience as a Full Stack Developer or similar role.
  • Strong proficiency in .NET, Angular, and TypeScript.
  • Experience with Azure cloud services.
  • Proficient in SQL Server and database design.
  • Familiarity with Agile methodologies and practices.
  • Solid understanding of design patterns and software architecture principles.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.
  • Experience with DocuSign is a plus.


Read more
Remote only
10 - 15 yrs
₹25L - ₹40L / yr
data engineer
Apache Spark
skill iconScala
Big Data
skill iconPython
+5 more

What You’ll Be Doing:

● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.

● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.

● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.

● Drive modernization initiatives including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.

● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.

● Champion data quality, monitoring, metadata management, and observability across the ecosystem.

● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.

● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.


Qualifications:

● Bachelor's degree in Engineering, Computer Science, or a relevant field.

● 8+ years of relevant and recent experience in a Data Engineer role.

● 5+ years of recent experience with Apache Spark and a solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.

● Strong coding skills in Scala and Python, with the ability to switch between them with ease.

● Advanced working SQL knowledge and experience with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.

● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.

● Comfortable working in an Agile environment.

● Machine Learning knowledge is a plus.

● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment.
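
The "advanced working SQL knowledge" this role expects typically includes window functions. A small, self-contained illustration (run against SQLite via Python purely for portability; the role itself targets Postgres/MySQL, where the same syntax applies):

```python
import sqlite3

# Hypothetical events table with a few rows per user.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("a", 1, 10.0), ("a", 2, 20.0), ("b", 1, 5.0)],
)

# Running total per user, ordered by timestamp (window functions need
# SQLite 3.25+, which ships with all recent Python builds).
rows = conn.execute(
    """
    SELECT user_id, ts,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
    """
).fetchall()
```

Partitioned running aggregates like this are the building block for sessionization, deduplication, and incremental-load checks in production pipelines.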

REPORTING: This position will report to a Sr. Technical Manager or Director of Engineering as assigned by Management.

EMPLOYMENT TYPE: Full-Time, Permanent


SHIFT TIMINGS: 10:00 AM - 07:00 PM IST

Read more