
50+ Remote SQL Jobs in India

Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

ByteFoundry AI


Posted by Bisman Gill
Remote only
3 - 8 yrs
Up to ₹40L / yr (varies)
React.js
NodeJS (Node.js)
Python
SQL
Amazon Web Services (AWS)
+3 more

About the Role

We are looking for a motivated Full Stack Developer with 2–5 years of hands-on experience in building scalable web applications. You will work closely with senior engineers and product teams to develop new features, improve system performance, and ensure high-quality code delivery.

Responsibilities

- Develop and maintain full-stack applications.

- Implement clean, maintainable, and efficient code.

- Collaborate with designers, product managers, and backend engineers.

- Participate in code reviews and debugging.

- Work with REST APIs/GraphQL.

- Contribute to CI/CD pipelines.

- Work independently as well as within a collaborative team environment.


Required Technical Skills

- Strong knowledge of JavaScript/TypeScript.

- Experience with React.js, Next.js.

- Backend experience with Node.js, Express, NestJS.

- Understanding of SQL/NoSQL databases.

- Experience with Git, APIs, debugging tools.

- Cloud familiarity (AWS/GCP/Azure).

AI and System Mindset

Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.

Soft Skills

- Strong problem-solving ability.

- Good communication and teamwork.

- Fast learner and adaptable.

Education

Bachelor's degree in Computer Science / Engineering or equivalent.

Sun King


Posted by Reshika Mendiratta
Remote only
2yrs+
Best in industry
Test Automation (QA)
Software Testing (QA)
Manual testing
Python
Java
+8 more

About Sun King

Sun King is the world’s leading off-grid solar energy company, providing affordable solar solutions to the 1.8 billion people without reliable access to electricity. By combining product design, fintech, and field operations, Sun King has connected over 20 million homes to solar power across Africa and Asia, adding more than 200,000 new homes each month. Through ‘pay-as-you-go’ financing, customers make small payments to eventually own their solar systems, saving money and reducing reliance on harmful energy sources like kerosene.


Sun King employs 2,800 staff across 12 countries, with expertise in product design, data science, logistics, customer service, and more. The company is expanding its product range to include clean cooking, electric mobility, and entertainment solutions, all while supporting a diverse workforce — with women making up 44% of the team.


About the role:

The role involves designing, executing, and maintaining robust functional, regression, and integration testing to ensure product quality and reliability, along with thorough defect tracking, analysis, and resolution. The individual will develop and maintain UI and API automation frameworks to improve test coverage, minimize manual effort, and enhance release efficiency. Close collaboration with development teams is expected to reproduce issues, validate fixes, and ensure high-quality releases. The role also includes integrating automated tests into CI/CD pipelines, supporting production issue analysis, and verifying hotfixes in live environments. Additionally, the candidate will actively participate in requirement and design reviews to ensure testability and clarity, maintain comprehensive QA documentation, and continuously improve testing frameworks, tools, and overall QA processes.


What you will be expected to do:

  • Design, execute, and maintain test cases, test plans, and test scripts for functional, regression, and integration testing.
  • Identify software defects, document them clearly, and track them through to closure.
  • Analyze bugs and provide detailed insights to help developers understand root causes.
  • Partner closely with the development team to reproduce issues, validate fixes, and ensure overall product quality.
  • Develop, maintain, and improve automated test suites (API/UI) to enhance test coverage, reduce manual effort, and improve release confidence.
  • Work with CI/CD pipelines to integrate automated tests into the deployment workflow.
  • Validate production issues, support troubleshooting, and verify hotfixes in real-time environments.
  • Recommend improvements in product performance, usability, and reliability based on test findings.
  • Participate in requirement and design reviews to ensure clarity, completeness, and testability.
  • Benchmark against competitor products and suggest enhancements based on industry trends.
  • Maintain detailed test documentation, including test results, defect logs, and release readiness assessments.
  • Continuously improve QA processes, automation frameworks, and testing methodologies.
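The automated-suite expectation above can be illustrated with a minimal regression suite using Python's stdlib `unittest`. The `tier_for_payment_total` function is a hypothetical stand-in for a business rule under test, not something from the role itself:

```python
import unittest

# Hypothetical rule under test -- a stand-in for any business logic
# a regression suite would cover on every release.
def tier_for_payment_total(total):
    """Classify a customer by cumulative pay-as-you-go payments."""
    if total < 0:
        raise ValueError("total cannot be negative")
    if total < 100:
        return "starter"
    if total < 500:
        return "regular"
    return "owner"

class TierRegressionSuite(unittest.TestCase):
    """Boundary-value cases, re-run automatically on every build."""

    def test_lower_boundary(self):
        self.assertEqual(tier_for_payment_total(0), "starter")

    def test_tier_boundaries(self):
        self.assertEqual(tier_for_payment_total(99), "starter")
        self.assertEqual(tier_for_payment_total(100), "regular")
        self.assertEqual(tier_for_payment_total(500), "owner")

    def test_invalid_input(self):
        with self.assertRaises(ValueError):
            tier_for_payment_total(-1)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TierRegressionSuite)
)
```

In a CI/CD pipeline, a runner like this (or pytest) executes on every commit, which is how suites like this "reduce manual effort and improve release confidence".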

You might be a strong candidate if you have/are:

  • Bachelor’s Degree in Computer Science, Information Technology, or a related field.
  • 2+ years of hands-on experience in software testing (manual + exposure to automation).
  • Strong understanding of QA methodologies, testing types, and best practices.
  • Experience in designing and executing test cases, test plans, and regression suites.
  • Exposure to automation tools/frameworks such as Selenium, Playwright, Cypress, TestNG, JUnit, or similar.
  • Basic programming or scripting knowledge (Java/Python preferred).
  • Good understanding of SQL for backend and data validation testing.
  • Familiarity with API testing tools such as Postman or RestAssured.
  • Experience with defect tracking and test management tools (Jira, TestRail, etc.).
  • Strong analytical and debugging skills with the ability to identify root causes.
  • Ability to work effectively in Agile/Scrum environments and partner with developers, product, and DevOps teams.
  • Strong ownership mindset — having contributed to high-quality, near bug-free releases.
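As a sketch of the "SQL for backend and data validation testing" requirement above: a few validation queries run from Python against an in-memory SQLite database. The `payments` table and check names are illustrative, not from the posting:

```python
import sqlite3

# Backend data-validation sketch against a throwaway schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO payments VALUES
        (1, 10, 25.0), (2, 11, NULL), (2, 11, NULL), (3, 12, -5.0);
""")

# Each check should return 0 on clean data.
checks = {
    "null_amounts":     "SELECT COUNT(*) FROM payments WHERE amount IS NULL",
    "negative_amounts": "SELECT COUNT(*) FROM payments WHERE amount < 0",
    "duplicate_ids":    "SELECT COUNT(*) FROM "
                        "(SELECT id FROM payments GROUP BY id HAVING COUNT(*) > 1)",
}
failures = {name: conn.execute(q).fetchone()[0] for name, q in checks.items()}
```

Checks like these are easy to fold into an automated suite so bad backend data fails a build instead of reaching production.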

Good to have:

  • Exceptional attention to detail and a strong focus on product quality.
  • Experience with performance, load, or security testing (JMeter, Gatling, OWASP tools, etc.).
  • Exposure to advanced automation frameworks or building automation scripts from scratch.
  • Familiarity with CI/CD pipelines and integrating automated tests.
  • Experience working with observability tools like Grafana, Kibana, and Prometheus for production verification.
  • Good understanding of microservices, distributed systems, or cloud platforms.

What Sun King offers:

  • Professional growth in a dynamic, rapidly expanding, high-social-impact industry
  • An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
  • A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
  • Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.
Remote only
8 - 13 yrs
₹10L - ₹33L / yr
Python
PySpark
Big Data
SQL

Role: Lead Data Engineer Core

Responsibilities:

  • Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.
  • Collaborate with architects and stakeholders to translate business requirements into technical data solutions.
  • Ensure scalability, reliability, and performance of data systems across environments.
  • Provide mentorship and technical leadership to data engineering teams.
  • Define and enforce best practices for data modeling, transformation, and governance.
  • Optimize data ingestion and transformation frameworks for efficiency and cost management.
  • Contribute to data architecture design and review sessions across projects.

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data engineering, with proven leadership in designing cloud-native data systems.
  • Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP).
  • Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures.
  • Proficiency in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake, NoSQL databases).
  • Ability to recommend and implement scalable data pipelines.

Preferred Qualifications:

  • Cloud certification (AWS, Azure, or GCP).
  • Experience with Databricks, Snowflake, or Terraform.
  • Familiarity with data governance, lineage, and observability tools.
  • Strong collaboration skills and the ability to influence data-driven decisions across teams.
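One scalable-pipeline pattern implied above, incremental (watermark-based) ingestion, can be sketched in Python with SQLite standing in for the warehouse. All table and column names here are hypothetical:

```python
import sqlite3

# Incremental-ingestion sketch: a high-watermark keeps each run from
# re-reading rows it has already loaded.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE events (event_id INTEGER, ts INTEGER, payload TEXT);
    CREATE TABLE target (event_id INTEGER, ts INTEGER, payload TEXT);
    CREATE TABLE watermark (last_ts INTEGER);
    INSERT INTO watermark VALUES (0);
    INSERT INTO events VALUES (1, 100, 'a'), (2, 200, 'b');
""")

def incremental_load(conn):
    """Copy only rows newer than the stored watermark, then advance it."""
    (last_ts,) = conn.execute("SELECT last_ts FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT event_id, ts, payload FROM events WHERE ts > ?", (last_ts,)
    ).fetchall()
    conn.executemany("INSERT INTO target VALUES (?, ?, ?)", rows)
    if rows:
        conn.execute("UPDATE watermark SET last_ts = ?",
                     (max(r[1] for r in rows),))
    conn.commit()
    return len(rows)

first = incremental_load(src)            # initial run loads both rows
src.execute("INSERT INTO events VALUES (3, 300, 'c')")
second = incremental_load(src)           # next run loads only the new row
```

The same shape scales up to Spark or a cloud warehouse: only the storage of the watermark and the read/write calls change, which is where the cost-management point above comes in.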

Ekloud INC
Posted by Ashwini Rathod
Remote only
8 - 15 yrs
₹7L - ₹30L / yr
Java
Fullstack Developer
Angular (2+)
Spring Boot
SQL
+2 more

Java Angular Fullstack Developer

 

Job Description:


Technical Lead – Full Stack

Experience: 8–12 years (Strong candidates Java 50% - Angular 50%)

Location: Remote

PF number is mandatory.



Tech Stack: Java, Spring Boot, Microservices, Angular, SQL

Focus: Hands-on coding, solution design, team leadership, delivery ownership

 

Must-Have Skills (Depth)



Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.

Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.

Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.

Angular: component lifecycle, services and state management, error handling, testing.

SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.

Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.

DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
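The SQL depth called out above (joins, aggregations, indexing) fits in one small sketch, run from Python against SQLite with an illustrative schema:

```python
import sqlite3

# Join + aggregation + index sketch against a throwaway schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    -- index on the join key so lookups avoid a full scan
    CREATE INDEX idx_orders_customer ON orders(customer_id);
    INSERT INTO customers VALUES (1, 'north'), (2, 'south');
    INSERT INTO orders VALUES (10, 1, 50.0), (11, 1, 25.0), (12, 2, 40.0);
""")

# Revenue per region: an inner join feeding a GROUP BY aggregation.
revenue_by_region = conn.execute("""
    SELECT c.region, SUM(o.total) AS revenue
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
```

In an interview-depth discussion, the follow-ups would be query plans (`EXPLAIN`), covering indexes, and transaction isolation, which the bullet above also names.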

Remote only
5 - 10 yrs
₹25L - ₹55L / yr
Data engineering
Databases
Python
SQL
PostgreSQL
+4 more

Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data


OVERVIEW

We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.


The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.


CORE TECHNICAL REQUIREMENTS

Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python versus push computation to the database.


SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.


Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
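Idempotency, mentioned above, can be made concrete with an upsert: replaying the same batch (say, after a crash and retry) leaves the table unchanged. A minimal sketch using SQLite's `ON CONFLICT ... DO UPDATE`; the schema and symbols are illustrative:

```python
import sqlite3

# Idempotency sketch: re-running the same load must not duplicate rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE prices (symbol TEXT, ts INTEGER, price REAL, "
    "PRIMARY KEY (symbol, ts))"
)

batch = [("BTC", 1, 100.0), ("BTC", 2, 101.5), ("ETH", 1, 10.0)]

def load(rows):
    """Upsert keyed on (symbol, ts): inserts become updates on replay."""
    conn.executemany(
        "INSERT INTO prices VALUES (?, ?, ?) "
        "ON CONFLICT(symbol, ts) DO UPDATE SET price = excluded.price",
        rows,
    )
    conn.commit()

load(batch)   # first run
load(batch)   # replay after a simulated failure: no duplicates
count = conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0]
```

The same property is what makes backfills safe: re-running a date range overwrites rather than double-counts.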


Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
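A few of the validation checks described above, sketched as small pure functions; the thresholds and field names are illustrative assumptions, not a prescribed design:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(latest_ts, now, max_lag=timedelta(minutes=15)):
    """Flag a stale feed before it poisons downstream jobs."""
    return now - latest_ts <= max_lag

def check_range(values, lo, hi):
    """Return out-of-range values (e.g. negative prices)."""
    return [v for v in values if not (lo <= v <= hi)]

def check_required(record, required=("symbol", "ts", "price")):
    """Distinguish fields that are missing from fields present but None."""
    missing = [k for k in required if k not in record]
    nulls = [k for k in required if k in record and record.get(k) is None]
    return missing, nulls

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
fresh = check_freshness(now - timedelta(minutes=5), now)
outliers = check_range([10.0, -1.0, 250.0], lo=0.0, hi=100.0)
missing, nulls = check_required({"symbol": "BTC", "ts": None})
```

Note how `check_required` encodes the "missing versus should-not-exist" distinction from the paragraph above: an absent key and an explicit null usually need different handling.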


WHAT YOU WILL BUILD

Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.


Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.


Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.
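Time-series alignment with gap handling, as described above, in a minimal stdlib sketch: an irregular series is placed on a fixed one-minute UTC grid with forward-fill, and leading gaps stay `None`. Timestamps and values are illustrative:

```python
from datetime import datetime, timedelta, timezone

def align(series, start, end, step=timedelta(minutes=1)):
    """series: sorted list of (ts, value). Returns (grid_ts, value) pairs,
    carrying forward the most recent observation at or before each grid point."""
    out, i, last = [], 0, None
    t = start
    while t <= end:
        while i < len(series) and series[i][0] <= t:
            last = series[i][1]   # newest observation not after t
            i += 1
        out.append((t, last))
        t += step
    return out

base = datetime(2024, 1, 1, tzinfo=timezone.utc)   # store everything in UTC
ticks = [
    (base + timedelta(seconds=30), 10.0),
    (base + timedelta(minutes=2, seconds=10), 11.0),
]
grid = align(ticks, base, base + timedelta(minutes=3))
```

In production this is what `pandas.DataFrame.resample`/`reindex` do at scale; keeping all timestamps timezone-aware UTC sidesteps the timezone pitfalls the paragraph mentions.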


Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.

Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.


DOMAIN EXPERIENCE

Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.


Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
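The temporal-join idea above and the look-ahead-bias warning earlier meet in an as-of join: each query time sees only the latest quote at or before it, never a later one. A stdlib sketch with illustrative data:

```python
import bisect

# Point-in-time (as-of) join sketch. Taking a quote *after* the query
# time would be look-ahead bias -- bisect_right guarantees we never do.
quote_times = [100, 200, 300]          # sorted quote timestamps
quote_vals  = [10.0, 10.5, 11.0]

def asof(t):
    """Latest value known at time t, or None before the first quote."""
    i = bisect.bisect_right(quote_times, t) - 1
    return quote_vals[i] if i >= 0 else None

joined = [asof(t) for t in (50, 100, 250, 300)]
```

This is the same semantics as `pandas.merge_asof` or an AS OF join in a time-series database, shown here at its smallest.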


High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.


ENGINEERING STANDARDS

Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.


Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.


Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.


Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.


TECHNICAL ENVIRONMENT

PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.


WHAT WE ARE LOOKING FOR

Attention to Detail: You notice when something is slightly off and investigate rather than ignore.


Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.


Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.


Long-Term Orientation: You build systems you will maintain for years.


Communication: You document clearly, explain data issues to non-engineers, and surface problems early.


EDUCATION

University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.


TO APPLY

Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.

Analytical Brains Education
Remote only
1 - 5 yrs
₹8L - ₹12L / yr
Python
Shell Scripting
Powershell
SQL
Java

Job Description

We are looking for motivated IT professionals with at least one year of industry experience. The ideal candidate should have hands-on experience in AWS, Azure, AI, or Cloud technologies, or should be enthusiastic and ready to upskill and shift to new and emerging technologies. This role is primarily remote; however, candidates may be required to visit the office occasionally for meetings or project needs.

Key Requirements

  • Minimum 1 year of experience in the IT industry
  • Exposure to AWS / Azure / AI / Cloud platforms (any one or more)
  • Willingness to learn and adapt to new technologies
  • Strong problem-solving and communication skills
  • Ability to work independently in a remote setup
  • Must have a proper work-from-home environment (laptop, stable internet, quiet workspace)

Education Qualification

  • B.Tech / BE / MCA / M.Sc (IT) / equivalent


CFRA


Posted by Bisman Gill
Remote only
3yrs+
Up to ₹15L / yr (varies)
Amazon Web Services (AWS)
SQL
Selenium
Appium
Cypress
+3 more

The Quality Engineer is responsible for planning, developing, and executing tests for CFRA’s financial software. The responsibilities include designing and implementing tests, debugging, and defining corrective actions. The role plays an important part in our company’s product development process: our ideal candidate will conduct tests to ensure software runs efficiently and meets client needs while remaining cost-effective.

You will be part of the CFRA Data Collection Team, responsible for collecting, processing, and publishing financial market data for internal and external stakeholders. The team uses a contemporary stack in the AWS Cloud to design, build, and maintain a robust data architecture, data engineering pipelines, and large-scale data systems. You will be responsible for verifying and validating all data quality and completeness parameters for the automated (ETL) pipeline processes, both new and existing.

Key Responsibilities

  • Review requirements, specifications and technical design documents to provide timely and meaningful feedback
  • Create detailed, comprehensive and well-structured test plans and test cases
  • Estimate, prioritize, plan and coordinate testing activities
  • Identify, record, document thoroughly and track bugs
  • Develop and apply testing processes for new and existing products to meet client needs
  • Liaise with internal teams to identify system requirements and develop testing plans
  • Investigate the causes of non-conforming software and train users to implement solutions
  • Stay up-to-date with new testing tools and test strategies

Desired Skills

  • Proven work experience in software development and quality assurance
  • Strong knowledge of software QA methodologies, tools and processes
  • Experience in writing clear, concise and comprehensive test plans and test cases
  • Hands-on experience with automated testing tools
  • Acute attention to detail
  • Experience working in an Agile/Scrum development process
  • Excellent collaboration skills

Technical Skills

  • Proficient with SQL and capable of developing queries for testing
  • Familiarity with Python, especially for scripting tests
  • Familiarity with Cloud Technology and working with remote servers
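One way the ETL completeness verification described earlier might look in practice: reconciling per-date row counts between two stages of a pipeline, scripted from Python against SQLite. The `staging`/`published` table names are hypothetical:

```python
import sqlite3

# ETL completeness check sketch: any trade date whose published row count
# differs from staging is flagged for investigation.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging (trade_date TEXT, symbol TEXT);
    CREATE TABLE published (trade_date TEXT, symbol TEXT);
    INSERT INTO staging VALUES
        ('2024-01-02','A'), ('2024-01-02','B'), ('2024-01-03','A');
    INSERT INTO published VALUES
        ('2024-01-02','A'), ('2024-01-02','B');
""")

mismatches = conn.execute("""
    SELECT s.trade_date, s.n, COALESCE(p.n, 0)
    FROM (SELECT trade_date, COUNT(*) AS n FROM staging GROUP BY trade_date) s
    LEFT JOIN (SELECT trade_date, COUNT(*) AS n FROM published GROUP BY trade_date) p
      USING (trade_date)
    WHERE s.n != COALESCE(p.n, 0)
""").fetchall()
```

A query like this drops naturally into an automated post-load test, so a short-published date fails loudly instead of silently shipping incomplete data.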


CFRA


Posted by Bisman Gill
Remote only
4yrs+
Up to ₹23L / yr (varies)
Amazon Web Services (AWS)
SQL
Python
NodeJS (Node.js)
Java
+1 more

The Senior Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.


Key Responsibilities

  • Analyst Workflows: Design and development of CFRA’s integrated content publishing platform, built on a proprietary third-party editorial and publishing platform for integrated digital publishing.
  • Designing and Developing APIs: Design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
  • AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
  • Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
  • Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
  • Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
  • Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
  • Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
  • Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
  • Problem Solving: Drive troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
  • Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.
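A minimal sketch of the Lambda/API Gateway work described above: a handler returning an API Gateway proxy-integration response. The `report` parameter and payload are hypothetical, not CFRA's actual API:

```python
import json

def lambda_handler(event, context):
    """AWS Lambda handler in the API Gateway proxy-integration shape:
    the response must carry statusCode, headers, and a string body."""
    report = (event.get("queryStringParameters") or {}).get("report", "default")
    body = {"report": report, "status": "queued"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Handlers are plain functions, so they can be exercised locally
# (and in CI) without deploying -- here with a fake event.
resp = lambda_handler({"queryStringParameters": {"report": "earnings"}}, None)
```

That local testability is one reason the serverless pattern pairs well with the CI/CD responsibilities above: unit tests run against the handler directly, and only integration tests need a deployed stack.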

Desired Skills and Experience

  • Development: 5+ years of extensive experience in designing, developing, and deploying using modern technologies, with a focus on scalability, performance, and security.
  • AWS Services: Proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
  • Programming Languages: Proficiency in programming languages commonly used for development, such as Python, Node.js, or others, as well as experience with serverless frameworks on AWS.
  • Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
  • Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
  • DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
  • Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure system stability and performance.
  • Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
  • Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
  • Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.




CFRA


Posted by Bisman Gill
Remote only
7yrs+
Up to ₹36L / yr (varies)
Amazon Web Services (AWS)
SQL
Python
NodeJS (Node.js)
Java

The Lead Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.


Key Responsibilities

  • Analyst Workflows: Lead the design and development of CFRA’s integrated content publishing platform, built on a proprietary third-party editorial and publishing platform for integrated digital publishing.
  • Designing and Developing APIs: Lead the design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
  • Architecture Planning: Collaborate with architects and stakeholders to define architecture, including API gateway, microservices, and serverless components, ensuring alignment with business goals and AWS best practices.
  • Technical Leadership: Provide technical guidance and leadership to the development team, ensuring adherence to coding standards, best practices, and AWS guidelines.
  • AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
  • Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
  • Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
  • Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
  • Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
  • Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
  • Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
  • Problem Solving: Lead troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
  • Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.


Desired Skills and Experience

  • Development: 10+ years of extensive experience in designing, developing, and deploying using modern technologies, with a focus on scalability, performance, and security.
  • AWS Services: Strong proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
  • Programming Languages: Proficiency in programming languages commonly used for development, such as Python, Node.js, or others, as well as experience with serverless frameworks on AWS.
  • Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
  • Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
  • DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
  • Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure system stability and performance.
  • Team Leadership: Experience leading and mentoring a team of developers, providing technical guidance, code reviews, and fostering a collaborative and innovative environment.
  • Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
  • Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
  • Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.
CSI Interfusion
Posted by Sujitha Kotipalli
Remote, Hyderabad
5 - 10 yrs
₹35L - ₹45L / yr
React.js
C#
.NET
SQL
Microsoft Windows Azure

1. Job Responsibilities:

Backend Development (.NET)

  • Design and implement ASP.NET Core WebAPIs
  • Design and implement background jobs using Azure Function Apps
  • Optimize performance for long-running operations, ensuring high concurrency and system stability.
  • Develop efficient and scalable task scheduling solutions to execute periodic tasks
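
Task scheduling of the kind described above is framework-agnostic; the sketch below illustrates the idea in Python for brevity (this posting's own stack would use a timer-triggered Azure Function in C# instead, and the function and interval here are hypothetical).

```python
import asyncio

async def run_periodically(task, interval_s, iterations):
    """Invoke `task` every `interval_s` seconds, collecting its results."""
    results = []
    for _ in range(iterations):
        results.append(task())
        await asyncio.sleep(interval_s)
    return results

def heartbeat():
    # Placeholder for real periodic work (polling a queue, refreshing a cache).
    return "ok"

results = asyncio.run(run_periodically(heartbeat, 0.01, 3))
print(results)
```

In a production scheduler the loop is driven by the platform (a cron-style or timer trigger) rather than `asyncio.sleep`, but the separation of the schedule from the task itself is the same.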

Frontend Development (React)

  • Build high-performance, maintainable React applications and optimize component rendering.
  • Continuously improve front-end performance using best practices.

Deployment & Operations

  • Deploy React applications on Azure platforms (Azure Web Apps), ensuring smooth and reliable delivery.
  • Collaborate with DevOps teams to enhance CI/CD pipelines and improve deployment efficiency.

2. Job Requirements:

Tech Stack:

  • Backend: ASP.NET Core Web API, C#
  • Frontend: React, JavaScript/TypeScript, Redux or other state management libraries
  • Azure: Function Apps, Web Apps, Logic Apps
  • Database: Cosmos DB, SQL Server

  • Strong knowledge of asynchronous programming, performance optimization, and task scheduling

  • Proficiency in React performance optimization techniques, understanding of virtual DOM and component lifecycle.
  • Experience with cloud deployment, preferably Azure App Service or Azure Static Web Apps.
  • Familiarity with Git and CI/CD workflows, with strong coding standards.

3. Project Background:

Mission: Transform Microsoft Cloud customers into fans by delivering exceptional support and engagement.

Scope:

  • Customer reliability engineering
  • Advanced cloud engineering and supportability
  • Business management and operations
  • Product and platform orchestration

Activities:

  • Technical skilling programs
  • AI strategy for customer experience
  • Handling escalations and service reliability issues

4. Project Highlights:

React.js, ASP.NET Core Web API, Azure Function Apps, Cosmos DB

 

Read more
Appler
Posted by Appler Solutions
Remote only
4 - 6 yrs
₹7L - ₹10L / yr
Javascript
React Native
React.js
NodeJS (Node.js)
SQL

Job Title: Sr. Frontend Developer (Javascript)

Location: Remote Only

Experience Required: 4–6 years

Salary Range: ₹7L – ₹10L per year

About the Role:

We are looking for an experienced Sr. Frontend Developer with strong expertise in Javascript to join our remote team. The ideal candidate will have 4–6 years of hands-on experience in frontend development, with a focus on building responsive, high-performance web applications. You will work closely with cross-functional teams to design, develop, and implement user-facing features that align with business goals and enhance user experience.

Key Responsibilities:

  • Develop and maintain scalable, reusable frontend components and applications using modern Javascript frameworks and libraries.
  • Collaborate with UI/UX designers, product managers, and backend developers to deliver seamless user experiences.
  • Optimize applications for maximum speed, scalability, and accessibility.
  • Write clean, modular, and well-documented code following best practices.
  • Participate in code reviews, sprint planning, and agile development processes.
  • Troubleshoot, debug, and resolve frontend-related issues.
  • Stay updated with emerging frontend technologies and industry trends.

Must-Have Skills:

  • Javascript (ES6+)
  • React.js
  • React Native
  • NodeJS (Node.js)
  • SQL

Nice-to-Have Skills:

  • Experience with state management libraries (Redux, Context API, etc.)
  • Familiarity with testing frameworks (Jest, Cypress, React Testing Library)
  • Knowledge of frontend build tools (Webpack, Babel, NPM/Yarn)
  • Understanding of RESTful APIs and GraphQL
  • Experience with version control systems (Git)
  • Familiarity with CI/CD pipelines and deployment processes

Qualifications:

  • 4–6 years of professional frontend development experience.
  • Proven track record of delivering high-quality, production-ready applications.
  • Strong understanding of responsive design, cross-browser compatibility, and web performance optimization.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently in a remote environment and communicate effectively with distributed teams.

What We Offer:

  • Competitive salary within the range of ₹7L – ₹10L per year.
  • Fully remote work flexibility.
  • Opportunity to work on innovative projects with a talented and supportive team.
  • Professional growth and skill development opportunities.


Read more
Vy Systems

Posted by Kalki K
Remote only
4 - 12 yrs
₹18L - ₹28L / yr
databricks
Amazon Web Services (AWS)
SQL
Python
PySpark

Job Summary


We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.


Key Responsibilities

- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.

- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).

- Implement and manage Delta Lake (ACID, schema evolution, time travel).

- Write efficient, complex SQL for transformation and analytics.

- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).

- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.

- Collaborate with cross-functional teams to deliver reliable data solutions.

- Ensure data governance, security, and compliance.

- Troubleshoot pipelines and support CI/CD deployments.


Required Skills & Experience

- 4–8 years in Data Engineering / Big Data development.

- Strong hands-on experience with Databricks (clusters, jobs, workflows).

- Advanced PySpark and strong Python skills.

- Expert-level SQL (complex queries, window functions).

- Practical experience with AWS (preferred) or Azure cloud services.

- Experience with Delta Lake, Parquet, and data lake architectures.

- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).

- Good understanding of data modeling, optimization, and distributed systems.
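
The "complex queries, window functions" requirement above can be illustrated with a small, self-contained example. This sketch uses Python's built-in sqlite3 and a hypothetical `sales` table rather than Databricks SQL, but the window-function pattern carries over directly.

```python
import sqlite3

# Hypothetical sales table, held in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 300), ("west", 200), ("west", 50)])

# Top sale per region, ranked with the ROW_NUMBER() window function.
rows = conn.execute("""
    SELECT region, amount
    FROM (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    )
    WHERE rn = 1
    ORDER BY region
""").fetchall()
print(rows)  # [('east', 300), ('west', 200)]
```

On Databricks the same `ROW_NUMBER() OVER (PARTITION BY … ORDER BY …)` clause runs unchanged in Spark SQL, where partition pruning and shuffle behavior become the optimization concerns.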

Read more
Fountane inc
Remote only
2 - 4 yrs
₹10L - ₹18L / yr
Firebase
MongoDB
Express
NodeJS (Node.js)
SQL
+8 more

JOB TITLE: Associate Full Stack Developer (SDE-2)

 

LOCATION: Remote/Hybrid.


 

A LITTLE BIT ABOUT THE ROLE:

 

As a Full Stack Developer, you will be responsible for developing digital systems that deliver optimal end-to-end solutions to our business needs. The work will cover all aspects of software delivery, including working with staff, vendors, and outsourced contributors to build, release and maintain the product.

 

Fountane operates a scrum-based Agile delivery cycle, and you will be working within this. You will work with product owners, user experience, test, infrastructure, and operations professionals to build the most effective solutions.

 

WHAT YOU WILL BE DOING:

 

  • Full-stack development on a multinational team on various products across different technologies and industries.
  • Optimize the development process and identify continuing improvements.
  • Monitor technology landscape, assess and introduce new technology. Own and communicate development processes and standards.
  • The job title does not define or limit your duties, and you may be required to carry out other work within your abilities from time to time at our request. We reserve the right to introduce changes in line with technological developments which may impact your job duties or methods of working.

 

 

WHAT YOU WILL NEED TO BE GREAT IN THIS ROLE:

 

  • Minimum of 2+ years of full-stack development, combined back and front-end experience building fast, reliable web and/or mobile applications.
  • Experience with web frameworks (e.g., React, Angular, or Vue) and/or mobile development (e.g., React Native, NativeScript)
  • Proficient in at least one JavaScript framework or library such as React, Node.js, Angular (2.x), or jQuery.
  • Ability to optimize product development by leveraging software development processes.
  • Bachelor's degree or equivalent work experience (minimum six years); candidates with an Associate’s degree must have at least four years of work experience.
  • Fountane's current technology stack driving our digital products includes React.js, Node.js, React Native, Angular, Firebase, Bootstrap, MongoDB, Express, Hasura, GraphQL, Amazon Web Services (AWS), and Google Cloud Platform.

 

SOFT SKILLS:

 

  • Collaboration - Ability to work in teams across the world
  • Adaptability - situations are unexpected, and you need to be quick to adapt
  • Open-mindedness - Expect to see things outside the ordinary

 

LIFE AT FOUNTANE:

 

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - culture that helps you grow exponentially.


Qualifications - No bachelor's degree required. Good communication skills are a must!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We’re a team of 80 strong from around the world that is radically open-minded and believes in excellence and respecting one another.

Read more
Remote only
5 - 15 yrs
₹10L - ₹15L / yr
FastAPI
Python
RESTful APIs
SQL
NOSQL Databases
+5 more


Summary:

We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.

Job Type:

Full-Time Contractor (12 months)

Location:

Remote / On-site (Jaipur preferred, as per project needs)

Experience:

5+ years in backend development

Key Responsibilities:

  • Design, develop, and maintain robust backend services using Python and FastAPI.
  • Implement and manage Prisma ORM for database operations.
  • Build scalable APIs and integrate with SQL databases and third-party services.
  • Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
  • Collaborate with front-end developers and other team members to deliver high-quality web applications.
  • Ensure application performance, security, and reliability.
  • Participate in code reviews, testing, and deployment processes.

Required Skills:

  • Expertise in Python backend development with strong experience in FastAPI.
  • Solid understanding of RESTful API design and implementation.
  • Proficiency in SQL databases and ORM tools (preferably Prisma).
  • Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
  • Familiarity with CI/CD pipelines and containerization (Docker).
  • Knowledge of cloud architecture best practices.

Added Advantage:

  • Front-end development knowledge (React, Angular, or similar frameworks).
  • Exposure to AWS/GCP cloud platforms.
  • Experience with NoSQL databases.

Eligibility:

  • Minimum 5 years of professional experience in backend development.
  • Available for full-time engagement.
  • Please do not apply if you are currently engaged in other projects; we require dedicated availability.

 

Read more
Deqode

Posted by purvisha Bhavsar
Remote only
5 - 7 yrs
₹10L - ₹25L / yr
Windows Azure
Data engineering
SQL
CI/CD
databricks

Role: Senior Data Engineer (Azure)

Experience: 5+ Years

Location: Anywhere in India

Work Mode: Remote

Notice Period: Immediate joiners or candidates serving notice period

Key Responsibilities:

  • Data processing on Azure using ADF, Streaming Analytics, Event Hubs, Azure Databricks, Data Migration Services, and Data Pipelines
  • Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.)
  • Designing and implementing scalable data models and migration strategies
  • Working on distributed big data batch or streaming pipelines (Kafka or similar)
  • Developing data integration & transformation solutions for structured and unstructured data
  • Collaborating with cross-functional teams for performance tuning and optimization
  • Monitoring data workflows and ensuring compliance with governance and quality standards
  • Driving continuous improvement through automation and DevOps practices

Mandatory Skills & Experience:

  • 5–10 years of experience as a Data Engineer
  • Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory
  • Experience in Data Modelling, Data Migration, and Data Warehousing
  • Good understanding of database structure principles and schema design
  • Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms
  • Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) — good to have
  • Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub)
  • Familiarity with visualization tools like Power BI or Tableau
  • Strong analytical, problem-solving, and debugging skills
  • Self-motivated, detail-oriented, and capable of managing priorities effectively


Read more
Hypersonix Inc

Posted by Reshika Mendiratta
Remote only
4yrs+
Upto ₹30L / yr (Varies)
SQL
Python
Data engineering
Big Data
Amazon Web Services (AWS)
+1 more

About the Company

Hypersonix.ai is disrupting the e-commerce space with AI, ML and advanced decision capabilities to drive real-time business insights. Hypersonix.ai has been built ground up with new age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on product leader to help lead product management of key capabilities and features.


About the Role

We are looking for talented and driven Data Engineers at various levels to work with customers to build the data warehouse, analytical dashboards and ML capabilities as per customer needs.


Roles and Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements; should write complex queries in an optimized way
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Run ad-hoc analysis utilizing the data pipeline to provide actionable insights
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
  • Work with analytics and data scientist team members and assist them in building and optimizing our product into an innovative industry leader


Requirements

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • A successful history of manipulating, processing and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • We are looking for a candidate with 4+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science or Information Technology, or has completed an MCA.
Read more
Appiness Interactive
Remote only
6 - 10 yrs
₹10L - ₹14L / yr
Python
Django
FastAPI
Flask
pandas
+9 more

Position Overview: The Lead Software Architect - Python & Data Engineering is a senior technical leadership role responsible for designing and owning end-to-end architecture for data-intensive, AI/ML, and analytics platforms, while mentoring developers and ensuring technical excellence across the organization. 


Key Responsibilities: 

  • Design end-to-end software architecture for data-intensive applications, AI/ML pipelines, and analytics platforms
  • Evaluate trade-offs between competing technical approaches 
  • Define data models, API approach, and integration patterns across systems 
  • Create technical specifications and architecture documentation 
  • Lead by example through production-grade Python code and mentor developers on engineering fundamentals 
  • Conduct design and code reviews focused on architectural soundness 
  • Establish engineering standards, coding practices, and design patterns for the team 
  • Translate business requirements into technical architecture 
  • Collaborate with data scientists, analysts, and other teams to design integrated solutions 
  • Whiteboard and defend system design and architectural choices 
  • Take responsibility for system performance, reliability, and maintainability 
  • Identify and resolve architectural bottlenecks proactively 


Required Skills:  

  • 8+ years of experience in software architecture and development  
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field 
  • Strong foundations in data structures, algorithms, and computational complexity 
  • Experience in system design for scale, including caching strategies, load balancing, and asynchronous processing  
  • 6+ years of Python development experience 
  • Deep knowledge of Django, Flask, or FastAPI 
  • Expert understanding of Python internals including GIL and memory management 
  • Experience with RESTful API design and event-driven architectures (Kafka, RabbitMQ) 
  • Proficiency in data processing frameworks such as Pandas, Apache Spark, and Airflow 
  • Strong SQL optimization and database design experience (PostgreSQL, MySQL, MongoDB)
  • Experience with AWS, GCP, or Azure cloud platforms
  • Knowledge of containerization (Docker) and orchestration (Kubernetes) 
  • Hands-on experience designing CI/CD pipelines


Preferred (Bonus) Skills

  • Experience deploying ML models to production (MLOps, model serving, monitoring)
  • Understanding of ML system design including feature stores and model versioning
  • Familiarity with ML frameworks such as scikit-learn, TensorFlow, and PyTorch  
  • Open-source contributions or technical blogging demonstrating architectural depth 
  • Experience with modern front-end frameworks for full-stack perspective


Read more
CT Nova
Posted by Apurv M
Remote only
3 - 15 yrs
₹25L - ₹50L / yr
.NET
Windows Azure
SQL
React.js
Microservices

Experience: 3+ years (Backend/Full-Stack)


Note: You will be the 3rd engineer on the team. If you are comfortable with Java, Spring Boot, and cloud, you will easily be able to pick up the following stack.


Key Requirements —

  • Primary Stack: Experience with .NET
  • Cloud: Solid understanding of cloud platforms (preferably Azure)
  • Frontend/DevOps: Familiarity with React and DevOps practices
  • Architecture: Strong grasp of microservices
  • Technical Skills: Basic proficiency in scripting, databases, and Git


Compensation: competitive salary, based on experience and fit

Read more
Ekloud INC
Posted by Kratika Agarwal
Remote only
8 - 14 yrs
₹6L - ₹14L / yr
m365
MS SharePoint
sharepoint online
ms team
exchange online
+5 more

Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving abilities, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.

Read more
Oddr Inc
Posted by Deepika Madgunki
Remote only
2 - 5 yrs
₹5L - ₹15L / yr
BOOMI
iPaaS
SQL
Microsoft Windows Azure
RESTful APIs
+1 more

- Design and implement integration solutions using iPaaS tools.

- Collaborate with customers, product, engineering and business stakeholders to translate business requirements into robust and scalable integration processes.

- Develop and maintain SQL queries and scripts to facilitate data manipulation and integration.

- Utilize RESTful API design and consumption to ensure seamless data flow between various systems and applications.

- Lead the configuration, deployment, and ongoing management of integration projects.

- Troubleshoot and resolve technical issues related to integration solutions.

- Document integration processes and create user guides for internal and external users.

- Stay current with the latest developments in iPaaS technologies and best practices.
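
Most of the integration work described above reduces to mapping one system's record shape onto another's. A minimal sketch of that idea (the field names and status mapping are hypothetical, not from any specific iPaaS):

```python
import json

# A record as a hypothetical source system emits it.
source_record = json.loads('{"cust_id": 42, "full_name": "Ada Lovelace", "status": "A"}')

STATUS_MAP = {"A": "active", "I": "inactive"}

def to_target(record):
    """Translate a source record into the target system's schema."""
    return {
        "customerId": record["cust_id"],
        "name": record["full_name"],
        "status": STATUS_MAP.get(record["status"], "unknown"),
    }

target_record = to_target(source_record)
print(json.dumps(target_record))
```

In an iPaaS tool such as Boomi this mapping would typically be configured in a map shape rather than hand-coded, but the underlying transformation is the same.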


Qualifications:

- Bachelor’s degree in Computer Science, Information Technology, or a related field.

- Minimum of 3 years’ experience in an integration engineering role with hands-on experience in an iPaaS tool, preferably Boomi.

- Proficiency in SQL and experience with database management and data integration patterns.

- Strong understanding of integration patterns and solutions, API design, and cloud-based technologies.

- Good understanding of RESTful APIs and integration.

- Excellent problem-solving and analytical skills.

- Strong communication and interpersonal skills, with the ability to work effectively in a team environment.

- Experience with various integration protocols (REST, SOAP, FTP, etc.) and data formats (JSON, XML, etc.).


Preferred Skills:

- Boomi (or other iPaaS) certifications

- Experience with Intapp's Integration Builder is highly desirable but not mandatory.

- Experience with cloud services like MS Azure.

- Knowledge of additional programming languages (e.g., .NET, Java) is advantageous.


What we offer:

- Competitive salary and benefits package.

- Dynamic and innovative work environment.

- Opportunities for professional growth and advancement.


Read more
Ekloud INC
Posted by Seema KK
Remote only
8 - 12 yrs
₹23L - ₹25L / yr
Java
Spring Boot
Microservices
React.js
SQL
+9 more

Job Description:

Technical Lead – Full Stack

Experience: 8–12 years (strong candidates: roughly 50% Java, 50% React)

Location – Bangalore/Hyderabad

Interview Levels – 3 Rounds

Tech Stack: Java, Spring Boot, Microservices, React, SQL

Focus: Hands-on coding, solution design, team leadership, delivery ownership

 

Must-Have Skills (Depth)

Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.

Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.

Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j); messaging (Kafka/RabbitMQ) optional.

React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).

SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.

Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.

DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.

Read more
Neuvamacro Technology Pvt Ltd
Remote only
5 - 10 yrs
₹13L - ₹18L / yr
PowerBI
Office 365
Microsoft Dynamics
Amazon Web Services (AWS)
Javascript
+10 more

We are seeking a highly skilled Power Platform Developer with deep expertise in designing, developing, and deploying solutions using Microsoft Power Platform. The ideal candidate will have strong knowledge of Power Apps, Power Automate, Power BI, Power Pages, and Dataverse, along with integration capabilities across Microsoft 365, Azure, and third-party systems.


Key Responsibilities

  • Solution Development:
  • Design and build custom applications using Power Apps (Canvas & Model-Driven).
  • Develop automated workflows using Power Automate for business process optimization.
  • Create interactive dashboards and reports using Power BI for data visualization and analytics.
  • Configure and manage Dataverse for secure data storage and modelling.
  • Develop and maintain Power Pages for external-facing portals.
  • Integration & Customization:
  • Integrate Power Platform solutions with Microsoft 365, Dynamics 365, Azure services, and external APIs.
  • Implement custom connectors and leverage Power Platform SDK for advanced scenarios.
  • Utilize Azure Functions, Logic Apps, and REST APIs for extended functionality.
  • Governance & Security:
  • Apply best practices for environment management, ALM (Application Lifecycle Management), and solution deployment.
  • Ensure compliance with security, data governance, and licensing guidelines.
  • Implement role-based access control and manage user permissions.
  • Performance & Optimization:
  • Monitor and optimize app performance, workflow efficiency, and data refresh strategies.
  • Troubleshoot and resolve technical issues promptly.
  • Collaboration & Documentation:
  • Work closely with business stakeholders to gather requirements and translate them into technical solutions.
  • Document architecture, workflows, and processes for maintainability.


Required Skills & Qualifications

  • Technical Expertise:
  • Strong proficiency in Power Apps (Canvas & Model-Driven), Power Automate, Power BI, Power Pages, and Dataverse.
  • Experience with Microsoft 365, Dynamics 365, and Azure services.
  • Knowledge of JavaScript, TypeScript, C#, .NET, and Power Fx for custom development.
  • Familiarity with SQL, DAX, and data modeling.
  • Additional Skills:
  • Understanding of ALM practices, solution packaging, and deployment pipelines.
  • Experience with Git, Azure DevOps, or similar tools for version control and CI/CD.
  • Strong problem-solving and analytical skills.
  • Certifications (Preferred):
  • Microsoft Certified: Power Platform Developer Associate.
  • Microsoft Certified: Power Platform Solution Architect Expert.


Soft Skills

  • Excellent communication and collaboration skills.
  • Ability to work in agile environments and manage multiple priorities.
  • Strong documentation and presentation abilities.

 

Read more
Upland Software

Posted by Bisman Gill
Remote only
5yrs+
Upto ₹33L / yr (Varies)
.NET
SQL
Object Oriented Programming (OOPs)
Windows Azure
ASP.NET
+1 more

We are looking for an enthusiastic and dynamic individual to join Upland India as a Senior Software Engineer I (Backend) for our Panviva product. The individual will work with our global development team.


What would you do?

  • Develop, review, test, and maintain application code
  • Collaborate with other developers and product to fulfil objectives
  • Troubleshoot and diagnose issues
  • Take lead on tasks as needed
  • Jump in and help the team deliver features when it is required

What are we looking for?

Experience

  • 5 + years of experience in Designing and implementing application architecture
  • Back-end developer who enjoys solving problems
  • Demonstrated experience with the .NET ecosystem (.NET Framework, ASP.NET, .NET Core) & SQL server
  • Experience in building cloud-native applications (Azure)
  • Must be skilled at writing Quality, scalable, maintainable, testable code

Leadership Skills

  • Strong communication skills
  • Ability to mentor/lead junior developers


Primary Skills: The candidate must possess the following primary skills:

  • Strong Back-end developer who enjoys solving problems
  • Solid experience with .NET Core, SQL Server, and .NET design patterns: strong understanding of OOP and SOLID principles, .NET-specific implementations (DI, CQRS, Repository, etc.), unit testing tools, and debugging techniques
  • Applying patterns to improve scalability and reduce technical debt
  • Experience with refactoring legacy codebases using design patterns
  • Real-World Problem Solving
  • Ability to analyze a problem and choose the most suitable design pattern
  • Experience balancing performance, readability, and maintainability
  • Experience building modern, scalable, reliable applications on the MS Azure cloud including services such as:
  • App Services
  • Azure Service Bus/ Event Hubs
  • Azure API Management Service
  • Azure Bot Service
  • Function/Logic Apps
  • Azure key vault & Azure Configuration Service
  • CosmosDB, Mongo DB
  • Azure Search
  • Azure Cognitive Services

Understanding Agile Methodology and Tool Familiarity

  • Solid understanding of Agile development processes, including sprint planning, daily stand-ups, retrospectives, and backlog grooming
  • Familiarity with Agile tools such as JIRA for tracking tasks, managing workflows, and collaborating across teams
  • Experience working in cross-functional Agile teams and contributing to iterative development cycles

Secondary Skills: It would be advantageous if the candidate also has the following secondary skills:

  • Experience with front-end React/jQuery/JavaScript, HTML, and CSS frameworks
  • APM tools - worked with tools such as Grafana, New Relic, CloudWatch, etc.
  • Basic Understanding of AI models
  • Python

About Upland

Upland Software (Nasdaq: UPLD) helps global businesses accelerate digital transformation with a powerful cloud software library that provides choice, flexibility, and value. Upland India is a fully owned subsidiary of Upland Software and headquartered in Bangalore. We are a remote-first company. Interviews and on-boarding are conducted virtually.


Read more
venanalytics

Posted by Rincy jain
Remote, Mumbai
3 - 4 yrs
₹7L - ₹10L / yr
Python
SQL
PowerBI
Client Servicing
Team Management
+6 more

About Ven Analytics


At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.


Role Overview


We’re looking for a Power BI Data Analyst who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.


Key Responsibilities


  • Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.


  • Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.


  • Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.


  • Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.


  • Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.


  • Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.


  • Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.


  • Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.


  • Power BI Development: Use Power BI Desktop for report building and Power BI Service for distribution.


  • Backend development: Develop optimized SQL queries that are easy to consume, maintain and debug.


  • Version Control: Maintain strict version control by tracking change requests (CRs) and bugfixes, and keep Prod and Dev dashboards in sync.


  • Client Servicing: Engage with clients to understand their data needs, gather requirements, present insights, and ensure timely, clear communication throughout project cycles.


  • Team Management: Lead and mentor a small team by assigning tasks, reviewing work quality, guiding technical problem-solving, and ensuring timely delivery of dashboards and reports.


Must-Have Skills


  • Strong experience building robust data models in Power BI
  • Hands-on expertise with DAX (complex measures and calculated columns)
  • Proficiency in M Language (Power Query) beyond drag-and-drop UI
  • Clear understanding of data visualization best practices (less fluff, more insight)
  • Solid grasp of SQL and Python for data processing
  • Strong analytical thinking and ability to craft compelling data stories
  • Client servicing background


Good-to-Have (Bonus Points)


  • Experience using DAX Studio and Tabular Editor
  • Prior work in a high-volume data processing production environment
  • Exposure to modern CI/CD practices or version control with BI tools

 

Why Join Ven Analytics?


  • Be part of a fast-growing startup that puts data at the heart of every decision.
  • Opportunity to work on high-impact, real-world business challenges.
  • Collaborative, transparent, and learning-oriented work environment.
  • Flexible work culture and focus on career development.


Read more
Ekloud INC
Kratika Agarwal
Posted by Kratika Agarwal
Remote only
8 - 14 yrs
₹7L - ₹18L / yr
m365
m365 developer
ms teams
MS SharePoint
Microsoft Exchange
+12 more

Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.

A strong understanding of the development lifecycle is a must, along with debugging skills, time management, business acumen, a positive attitude, and openness to continual growth.

The ability to code appropriate solutions will be tested in the interview.

Knowledge of a wide variety of Generative AI models

Conceptual understanding of how large language models work

Proficiency in coding languages for data manipulation (e.g., SQL) and machine learning & AI development (e.g., Python)

Experience with dashboarding tools such as Power BI and Tableau (beneficial but not essential)

Read more
Whiz IT Services
Sheeba Harish
Posted by Sheeba Harish
Remote only
10 - 15 yrs
₹20L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Microservices
API
Apache Kafka
+5 more

We are looking for highly experienced Senior Java Developers who can architect, design, and deliver high-performance enterprise applications using Spring Boot and Microservices. The role requires a strong understanding of distributed systems, scalability, and data consistency.

Read more
Forbes Advisor

at Forbes Advisor

3 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
4yrs+
Up to ₹40L / yr (varies)
skill iconPython
SQL
Database performance tuning
Data-flow analysis
Data modeling

About Forbes Advisor

Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.

We do this by combining data-driven content, rigorous product comparisons, and user-first design — all built on top of a modern, scalable platform. Our global teams bring deep expertise across journalism, product, performance marketing, data, and analytics.

 

The Role

We’re hiring a Data Scientist to help us unlock growth through advanced analytics and machine learning. This role sits at the intersection of marketing performance, product optimization, and decision science.


You’ll partner closely with Paid Media, Product, and Engineering to build models, generate insight, and influence how we acquire, retain, and monetize users. From campaign ROI to user segmentation and funnel optimization, your work will directly shape how we grow. This role is ideal for someone who thrives on business impact, communicates clearly, and wants to build reusable, production-ready insights, not just run one-off analyses.

 

What You’ll Do

Marketing & Revenue Modelling

• Own end-to-end modelling of LTV, user segmentation, retention, and marketing efficiency to inform media optimization and value attribution.

• Collaborate with Paid Media and RevOps to optimize SEM performance, predict high-value cohorts, and power strategic bidding and targeting.
Product & Growth Analytics

• Work closely with Product Insights and General Managers (GMs) to define core metrics, KPIs, and success frameworks for new launches and features.

• Conduct deep-dive analysis of user behaviour, funnel performance, and product engagement to uncover actionable insights.

• Monitor and explain changes in key product metrics, identifying root causes and business impact.

• Work closely with Data Engineering to design and maintain scalable data pipelines that support machine learning workflows, model retraining, and real-time inference.

Predictive Modelling & Machine Learning

• Build predictive models for conversion, churn, revenue, and engagement using regression, classification, or time-series approaches.

• Identify opportunities for prescriptive analytics and automation in key product and marketing workflows.

• Support development of reusable ML pipelines for production-scale use cases in product recommendation, lead scoring, and SEM planning.
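As a rough illustration of the classification work described above, here is a minimal churn-style model sketch. The feature names, synthetic data, and label rule are invented stand-ins for illustration only, not Forbes Advisor's actual schema or model.

```python
# A minimal churn-style classification sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.exponential(5, n),   # hypothetical: sessions in the last 30 days
    rng.uniform(0, 1, n),    # hypothetical: share of visits from paid channels
])
# Synthetic label: fewer sessions -> more likely to churn.
y = (rng.uniform(0, 1, n) > X[:, 0] / 10).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

In practice the same skeleton extends to conversion or revenue models by swapping the estimator (e.g. a gradient-boosted tree or a regression model) and the evaluation metric.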

Collaboration & Communication

• Present insights and recommendations to a variety of stakeholders — from ICs to executives — in a clear and compelling manner.

• Translate business needs into data problems, and complex findings into strategic action plans.

• Work cross-functionally with Engineering, Product, BI, and Marketing to deliver and deploy your work.

 

What You’ll Bring

Minimum Qualifications

• Bachelor’s degree in a quantitative field (Mathematics, Statistics, CS, Engineering, etc.).

• 4+ years in data science, growth analytics, or decision science roles.

• Strong SQL and Python skills (Pandas, Scikit-learn, NumPy).

• Hands-on experience with Tableau, Looker, or similar BI tools.

• Familiarity with LTV modelling, retention curves, cohort analysis, and media attribution.

• Experience with GA4, Google Ads, Meta, or other performance marketing platforms.

• Clear communication skills and a track record of turning data into decisions.


Nice to Have

• Experience with BigQuery and Google Cloud Platform (or equivalent).

• Familiarity with affiliate or lead-gen business models.

• Exposure to NLP, LLMs, embeddings, or agent-based analytics.

• Ability to contribute to model deployment workflows (e.g., using Vertex AI, Airflow, or Composer).

 

Why Join Us?

• Remote-first and flexible — work from anywhere in India with global exposure.

• Monthly long weekends (every third Friday off).

• Generous wellness stipends and parental leave.

• A collaborative team where your voice is heard and your work drives real impact.

• Opportunity to help shape the future of data science at one of the world’s most trusted brands.

Read more
Tech AI startup in Bangalore

Tech AI startup in Bangalore

Agency job
via Recruit Square by Priyanka choudhary
Remote only
4 - 8 yrs
₹12L - ₹18L / yr
pandas
NumPy
MLOps
SQL
ETL
+1 more

Data Engineer – Validation & Quality


Responsibilities

  • Build rule-based and statistical validation frameworks using Pandas / NumPy.
  • Implement contradiction detection, reconciliation, and anomaly flagging.
  • Design and compute confidence metrics for each evidence record.
  • Automate schema compliance, sampling, and checksum verification across data sources.
  • Collaborate with the Kernel to embed validation results into every output artifact.
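A rule-based validation pass with a simple statistical anomaly flag and a per-record confidence score might be sketched as follows in Pandas/NumPy. The column names, the z-score threshold, and the confidence formula are illustrative assumptions, not the actual framework.

```python
import numpy as np
import pandas as pd

def validate(df: pd.DataFrame, rules: dict, z_thresh: float = 3.0) -> pd.DataFrame:
    """Apply rule-based checks plus a z-score anomaly flag.
    `rules` maps column name -> callable returning a boolean Series
    (True = the row passes that rule)."""
    out = df.copy()
    for col, rule in rules.items():
        out[f"{col}_ok"] = rule(df[col])
    # Statistical anomaly flag on a (hypothetical) numeric column.
    if "amount" in df.columns:
        z = (df["amount"] - df["amount"].mean()) / df["amount"].std(ddof=0)
        out["amount_anomaly"] = np.abs(z) > z_thresh
    # Crude confidence metric: fraction of rule checks passed per row.
    ok_cols = [c for c in out.columns if c.endswith("_ok")]
    out["confidence"] = out[ok_cols].mean(axis=1)
    return out
```

Contradiction detection and reconciliation would layer additional cross-column and cross-source rules on top of the same pattern.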

Requirements

  • 5+ years in data engineering, data quality, or MLOps validation.
  • Strong SQL optimization and ETL background.
  • Familiarity with data lineage, DQ frameworks, and regulatory standards (SOC 2 / GDPR).
Read more
Big Rattle Technologies
Sreelakshmi Nair (Big Rattle Technologies)
Posted by Sreelakshmi Nair (Big Rattle Technologies)
Remote, Mumbai
5 - 7 yrs
₹8L - ₹12L / yr
skill iconPython
SQL
skill iconMachine Learning (ML)
Data profiling
E2E
+8 more

Position: QA Engineer – Machine Learning Systems (5 - 7 years)

Location: Remote (Company in Mumbai)

Company: Big Rattle Technologies Private Limited


Immediate Joiners only.


Summary:

The QA Engineer will own quality assurance across the ML lifecycle: from raw data validation through feature engineering checks, model training/evaluation verification, batch prediction/optimization validation, and end-to-end (E2E) workflow testing. The role is hands-on with Python automation, data profiling, and pipeline test harnesses in Azure ML and Azure DevOps. Success means provably correct data, models, and outputs at production scale and cadence.


Key Responsibilities:

Test Strategy & Governance

  • Define an ML-specific Test Strategy covering data quality KPIs, feature consistency checks, model acceptance gates (metrics + guardrails), and E2E run acceptance (timeliness, completeness, integrity).
  • Establish versioned test datasets & golden baselines for repeatable regression of features, models, and optimizers.


Data Quality & Transformation

  • Validate raw data extracts and landed data lake data: schema/contract checks, null/outlier thresholds, time-window completeness, duplicate detection, site/material coverage.
  • Validate transformed/feature datasets: deterministic feature generation, leakage detection, drift vs. historical distributions, feature parity across runs (hash or statistical similarity tests).
  • Implement automated data quality checks (e.g., Great Expectations/pytest + Pandas/SQL) executed in CI and AML pipelines.
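An automated data-quality check of the kind described above can be sketched in pandas; the column names, null threshold, and key columns here are illustrative placeholders, not the actual pipeline contract.

```python
import pandas as pd

def check_extract(df: pd.DataFrame,
                  required_cols: set,
                  null_threshold: float = 0.01,
                  key_cols: tuple = ("site", "material", "date")) -> list:
    """Return a list of data-quality failures for a raw extract:
    schema/contract checks, null-rate thresholds, duplicate detection."""
    failures = []
    missing = required_cols - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
    for col in required_cols & set(df.columns):
        null_rate = df[col].isna().mean()
        if null_rate > null_threshold:
            failures.append(f"{col}: null rate {null_rate:.2%} exceeds threshold")
    present_keys = [c for c in key_cols if c in df.columns]
    if present_keys and df.duplicated(subset=present_keys).any():
        failures.append(f"duplicate rows on {present_keys}")
    return failures
```

In CI, each check would become a pytest assertion (or a Great Expectations expectation) so a failing extract blocks the pipeline run.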

Model Training & Evaluation

  • Verify training inputs (splits, windowing, target leakage prevention) and hyperparameter configs per site/cluster.
  • Automate metric verification (e.g., MAPE/MAE/RMSE, uplift vs. last model, stability tests) with acceptance thresholds and champion/challenger logic.
  • Validate feature importance stability and sensitivity/elasticity sanity checks (price/volume monotonicity where applicable).
  • Gate model registration/promotion in AML based on signed test artifacts and reproducible metrics.
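The metric acceptance gate with champion/challenger logic might look like the sketch below; the thresholds are illustrative, not production acceptance criteria.

```python
import numpy as np

def mape(actual, pred) -> float:
    """Mean absolute percentage error."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.mean(np.abs((actual - pred) / actual)))

def promote(challenger_mape: float, champion_mape: float,
            max_mape: float = 0.15, min_uplift: float = 0.0) -> bool:
    """Champion/challenger gate: promote only if the challenger beats
    an absolute threshold AND improves on the current champion."""
    return (challenger_mape <= max_mape
            and (champion_mape - challenger_mape) > min_uplift)
```

A real gate would also check stability across sites/clusters and sign the resulting test artifact before model registration in AML.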


Predictions, Optimization & Guardrails

  • Validate batch predictions: result shapes, coverage, latency, and failure handling.
  • Test model optimization outputs and enforced guardrails: detect violations and prove idempotent writes to DB.
  • Verify API push to third party system (idempotency keys, retry/backoff, delivery receipts).
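The idempotency and retry/backoff behavior to be verified can be sketched in a few lines; the `send` callable and timings below are placeholders, not the actual third-party API.

```python
import time
import uuid

def push_with_retry(send, payload, max_retries=4, base_delay=0.1):
    """Push a record with an idempotency key and exponential backoff.
    `send` is any callable accepting (payload, idempotency_key) that
    raises on transient failure."""
    idempotency_key = str(uuid.uuid4())  # same key reused on every retry
    for attempt in range(max_retries + 1):
        try:
            return send(payload, idempotency_key)
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...
```

A test for this pattern injects a flaky `send`, then asserts that the call eventually succeeds and that every attempt carried the same idempotency key (so the receiver can deduplicate).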


Pipelines & E2E

  • Build pipeline test harnesses for AML pipelines (data-gen nightly, training weekly, prediction/optimization), including orchestrated synthetic runs and fault injection (missing slice, late competitor data, Service Bus backlog).
  • Run E2E tests from raw data store -> ADLS -> AML -> RDBMS -> APIM/Frontend; assert freshness SLOs and audit event completeness (Event Hubs -> ADLS immutable).


Automation & Tooling

  • Develop Python-based automated tests (pytest) for data checks, model metrics, and API contracts; integrate with Azure DevOps (pipelines, badges, gates).
  • Implement data-driven test runners (parameterized by site/material/model-version) and store signed test artifacts alongside models in AML Registry.
  • Create synthetic test data generators and golden fixtures to cover edge cases (price gaps, competitor shocks, cold starts).


Reporting & Quality Ops

  • Publish weekly test reports and go/no-go recommendations for promotions; maintain a defect taxonomy (data vs. model vs. serving vs. optimization).
  • Contribute to SLI/SLO dashboards (prediction timeliness, queue/DLQ, push success, data drift) used for release gates.


Required Skills (hands-on experience in the following):

  • Python automation (pytest, pandas, NumPy), SQL (PostgreSQL/Snowflake), and CI/CD (Azure DevOps) for fully automated ML QA.
  • Strong grasp of ML validation: leakage checks, proper splits, metric selection (MAE/MAPE/RMSE), drift detection, sensitivity/elasticity sanity checks.
  • Experience testing AML pipelines (pipelines/jobs/components) and message-driven integrations (Service Bus/Event Hubs).
  • API test skills (FastAPI/OpenAPI, contract tests, Postman/pytest-httpx) plus idempotency and retry patterns.
  • Familiar with feature stores/feature engineering concepts and reproducibility.
  • Solid understanding of observability (App Insights/Log Analytics) and auditability requirements.


Required Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
  • 5–7+ years in QA with 3+ years focused on ML/Data systems (data pipelines + model validation).
  • Certification in Azure Data or ML Engineer Associate is a plus.



Why should you join Big Rattle?

Big Rattle Technologies specializes in AI/ML Products and Solutions as well as Mobile and Web Application Development. Our clients include Fortune 500 companies. Over the past 13 years, we have delivered multiple projects for international and Indian clients from industries such as FMCG, Banking and Finance, Automobiles, E-commerce, etc. We also specialize in Product Development for our clients.

Big Rattle Technologies Private Limited is ISO 27001:2022 certified and CyberGRX certified.

What We Offer:

  • Opportunity to work on diverse projects for Fortune 500 clients.
  • Competitive salary and performance-based growth.
  • Dynamic, collaborative, and growth-oriented work environment.
  • Direct impact on product quality and client satisfaction.
  • 5-day hybrid work week.
  • Certification reimbursement.
  • Healthcare coverage.

How to Apply:

Interested candidates are invited to submit their resume detailing their experience. Please detail out your work experience and the kind of projects you have worked on. Ensure you highlight your contributions and accomplishments to the projects.


Read more
Bidgely

at Bidgely

4 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Remote only
10yrs+
Up to ₹100L / yr (varies)
skill iconJava
Microservices
Distributed Systems
SQL
skill iconAmazon Web Services (AWS)

Bidgely is seeking an outstanding and deeply technical Principal Engineer / Sr. Principal Engineer / Architect to lead the architecture and evolution of our next-generation data and platform infrastructure. This is a senior IC role for someone who loves solving complex problems at scale, thrives in high-ownership environments, and influences engineering direction across teams.

You will be instrumental in designing scalable and resilient platform components that can handle trillions of data points, integrate machine learning pipelines, and support advanced energy analytics. As we evolve our systems for the future of clean energy, you will play a critical role in shaping the platform that powers all Bidgely products.


Responsibilities

  • Architect & Design: Lead the end-to-end architecture of core platform components – from ingestion pipelines to ML orchestration and serving layers. Architect for scale (200Bn+ daily data points), performance, and flexibility.
  • Technical Leadership: Act as a thought leader and trusted advisor for engineering teams. Review designs, guide critical decisions, and set high standards for software engineering excellence.
  • Platform Evolution: Define and evolve the platform’s vision, making key choices in data processing, storage, orchestration, and cloud-native patterns.
  • Mentorship: Coach senior engineers and staff on architecture, engineering best practices, and system thinking. Foster a culture of engineering excellence and continuous improvement.
  • Innovation & Research: Evaluate and experiment with emerging technologies (e.g., event-driven architectures, AI infrastructure, new cloud-native tools) to stay ahead of the curve.
  • Cross-functional Collaboration: Partner with Engineering Managers, Product Managers, and Data Scientists to align platform capabilities with product needs.
  • Non-functional Leadership: Ensure systems are secure, observable, resilient, performant, and cost-efficient. Drive excellence in areas like compliance, DevSecOps, and cloud cost optimization.
  • GenAI Integration: Explore and drive adoption of Generative AI to enhance developer productivity, platform intelligence, and automation of repetitive engineering tasks.


Requirements:

  • 8+ years of experience in backend/platform architecture roles, ideally with experience at scale.
  • Deep expertise in distributed systems, data engineering stacks (Kafka, Spark, HDFS, NoSQL DBs like Cassandra/ElasticSearch), and cloud-native infrastructure (AWS, GCP, or Azure).
  • Proven ability to architect high-throughput, low-latency systems with batch + real-time processing.
  • Experience designing and implementing DAG-based data processing and orchestration systems.
  • Proficient in Java (Spring Boot, REST), and comfortable with infrastructure-as-code and CI/CD practices.
  • Strong understanding of non-functional areas: security, scalability, observability, and compliance.
  • Exceptional problem-solving skills and a data-driven approach to decision-making.
  • Excellent communication and collaboration skills with the ability to influence at all levels.
  • Prior experience working in a SaaS environment is a strong plus.
  • Experience with GenAI tools or frameworks (e.g., LLMs, embedding models, prompt engineering, RAG, Copilot-like integrations) to accelerate engineering workflows or enhance platform intelligence is highly desirable.
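DAG-based processing of the kind listed above ultimately reduces to running stages in dependency order; a minimal sketch using Python's standard-library `graphlib` (the stage names are invented for illustration):

```python
from graphlib import TopologicalSorter

# A toy DAG of pipeline stages: each key maps to the set of stages
# it depends on. Stage names are illustrative only.
dag = {
    "ingest": set(),
    "clean": {"ingest"},
    "features": {"clean"},
    "train": {"features"},
    "serve": {"train", "features"},
}

# static_order() yields stages so that every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
```

Production orchestrators (Airflow, Spark's own scheduler, etc.) add retries, parallelism, and persistence on top of exactly this topological-ordering core.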
Read more
Intineri infosol Pvt Ltd

at Intineri infosol Pvt Ltd

2 candid answers
Shivani Pandey
Posted by Shivani Pandey
Remote only
4 - 6 yrs
₹5L - ₹12L / yr
skill iconJavascript
Glide Script
JSON
SQL
ServiceNow
+3 more

Role Overview

 

We are seeking a ServiceNow Product Owner with deep expertise in ServiceNow modules (CSM, ITSM, HRSD) and strong scripting and data-handling skills.

 

This role focuses on translating real enterprise workflows into structured, data-driven AI training tasks, helping improve reasoning and understanding within AI systems. It is not a platform configuration or app development role — instead, it blends functional ServiceNow knowledge, prompt engineering, and data design to build the next generation of intelligent enterprise models.

 

Key Responsibilities

 

·    Define decision frameworks and realistic scenarios for AI reinforcement learning based on ServiceNow workflows.

·    Design scenario-driven tasks mirroring ServiceNow processes like case handling, SLA tracking, and IT incident management.

·    Develop and validate structured data tasks in JSON, ensuring accuracy and clarity.

·    Write natural language instructions aligned with ServiceNow’s business logic and workflows.

·    Use SQL queries for validation and quality checks of task data.

·    Apply prompt engineering techniques to guide model reasoning.

·    Collaborate with peers to expand and document cross-domain scenarios (CSM, ITSM, HRSD).

·    Create and maintain documentation of scenario patterns and best practices.
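A structured JSON training task of the kind described above might be validated with a small script like this; the schema fields and module list are illustrative assumptions, not a prescribed task format.

```python
import json

# Hypothetical task schema for illustration only.
REQUIRED = {"scenario_id", "module", "instruction", "expected_outcome"}
VALID_MODULES = {"CSM", "ITSM", "HRSD"}

def validate_task(raw: str) -> list:
    """Validate one JSON training task: parse it, check required
    fields, and confirm the module is a supported ServiceNow domain.
    Returns a list of error strings (empty = valid)."""
    try:
        task = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    errors = []
    missing = REQUIRED - task.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if task.get("module") not in VALID_MODULES:
        errors.append(f"unknown module: {task.get('module')!r}")
    return errors
```

Equivalent checks against the stored tasks could be run as SQL queries (e.g. counting rows with null required fields) for batch-level quality control.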

 

Required Experience

 

·    4–6 years of experience with ServiceNow (CSM, ITSM, HRSD).

·    Deep understanding of cases, incidents, requests, SLAs, and knowledge management processes.

·    Proven ability to design realistic enterprise scenarios mapping to ServiceNow operations.

·    Exposure to AI model training workflows or structured data design is a plus.

 

Preferred Qualifications

 

·    ServiceNow Certified System Administrator (CSA)

·    ServiceNow Certified Implementation Specialist (CIS-ITSM / CSM / HRSD)

·    Exposure to AI/ML workflows or model training data preparation.

·    Excellent written and verbal communication skills, with client-facing experience.


Mandatory Skills: Scripting (JavaScript, Glide Script), JSON handling, SQL, ServiceNow modules (ITSM, CSM, HRSD), and Prompt Engineering.

Read more
Remote only
2 - 4 yrs
₹4L - ₹8L / yr
skill iconPython
JSON
LLMS
oops
skill iconJava
+4 more

Role Overview

We are seeking a Junior Developer with 1–3 years’ experience and strong foundations in Python, databases, and AI technologies. The ideal candidate will support the development of AI-powered solutions, focusing on LLM integration, prompt engineering, and database-driven workflows. This is a hands-on role with opportunities to learn and grow into advanced AI engineering responsibilities.

Key Responsibilities

  • Develop, test, and maintain Python-based applications and APIs.
  • Design and optimize prompts for Large Language Models (LLMs) to improve accuracy and performance.
  • Work with JSON-based data structures for request/response handling.
  • Integrate and manage PostgreSQL (pgSQL) databases, including writing queries and handling data pipelines.
  • Collaborate with the product and AI teams to implement new features.
  • Debug, troubleshoot, and optimize performance of applications and workflows.
  • Stay updated on advancements in LLMs, AI frameworks, and generative AI tools.
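The JSON request/response handling described above can be sketched as follows; the field names are illustrative, not any specific LLM provider's API.

```python
import json

def build_llm_request(prompt_template: str, context: dict) -> str:
    """Fill a prompt template from a context dict and wrap it in a
    JSON request body (hypothetical field names)."""
    prompt = prompt_template.format(**context)
    return json.dumps({"prompt": prompt, "max_tokens": 256})

def parse_llm_response(raw: str, default: str = "") -> str:
    """Defensively pull the completion text out of a JSON response,
    falling back to a default on malformed input."""
    try:
        body = json.loads(raw)
    except json.JSONDecodeError:
        return default
    return body.get("completion", default)
```

The defensive parsing matters in practice: model and API responses occasionally arrive truncated or malformed, and a workflow should degrade gracefully rather than crash.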

Required Skills & Qualifications

  • Strong knowledge of Python (scripting, APIs, data handling).
  • Basic understanding of Large Language Models (LLMs) and prompt engineering techniques.
  • Experience with JSON data parsing and transformations.
  • Familiarity with PostgreSQL or other relational databases.
  • Ability to write clean, maintainable, and well-documented code.
  • Strong problem-solving skills and eagerness to learn.
  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).

Nice-to-Have (Preferred)

  • Exposure to AI/ML frameworks (e.g., LangChain, Hugging Face, OpenAI APIs).
  • Experience working in startups or fast-paced environments.
  • Familiarity with version control (Git/GitHub) and cloud platforms (AWS, GCP, or Azure).

What We Offer

  • Opportunity to work on cutting-edge AI applications in permitting & compliance.
  • Collaborative, growth-focused, and innovation-driven work culture.
  • Mentorship and learning opportunities in AI/LLM development.
  • Competitive compensation with performance-based growth.


Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Remote only
8 - 12 yrs
₹20L - ₹26L / yr
skill icon.NET
ASP.NET
skill iconC#
skill iconAngular (2+)
skill iconJavascript
+7 more

We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and design patterns. Experience with DocuSign is a plus.


Responsibilities:

  • Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement and maintain cloud-based solutions using Azure.
  • Develop and optimize SQL Server databases.
  • Follow Agile methodologies to manage project tasks and deliverables.
  • Apply design patterns and best practices to ensure high-quality, maintainable code.
  • Troubleshoot and resolve software defects and issues.
  • Mentor and guide junior developers.

Requirements:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Full Stack Developer or similar role.
  • Strong proficiency in .NET, Angular, and TypeScript.
  • Experience with Azure cloud services.
  • Proficient in SQL Server and database design.
  • Familiarity with Agile methodologies and practices.
  • Solid understanding of design patterns and software architecture principles.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and teamwork abilities.
  • Experience with DocuSign is a plus.


Read more
Remote only
10 - 15 yrs
₹25L - ₹40L / yr
data engineer
Apache Spark
skill iconScala
Big Data
skill iconPython
+5 more

What You’ll Be Doing:

● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.

● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.

● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.

● Drive modernization initiatives, including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.

● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.

● Champion data quality, monitoring, metadata management, and observability across the ecosystem.

● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.

● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.


Qualifications:

● Bachelor's degree in Engineering, Computer Science, or a relevant field.

● 8+ years of relevant and recent experience in a Data Engineer role.

● 5+ years of recent experience with Apache Spark and a solid understanding of the fundamentals.

● Deep understanding of Big Data concepts and distributed systems.

● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.

● Strong coding skills in Scala and Python, with the ability to switch between them with ease.

● Advanced working SQL knowledge and experience with a variety of relational databases such as Postgres and/or MySQL.

● Cloud experience with Databricks.

● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.

● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.

● Comfortable working in an Agile environment.

● Machine Learning knowledge is a plus.

● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.

● Excellent written and verbal communication skills in English.

● Experience supporting and working with cross-functional teams in a dynamic environment.

REPORTING: This position will report to the Sr. Technical Manager or Director of Engineering as assigned by Management.

EMPLOYMENT TYPE: Full-Time, Permanent


SHIFT TIMINGS: 10:00 AM - 07:00 PM IST

Read more
Deltek
Remote only
7 - 12 yrs
Best in industry
skill icon.NET
skill iconC#
SQL
Artificial Intelligence (AI)
Web Development
+3 more


Sr Software Engineer


Company Summary :


As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.


At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com

Business Summary :


The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.

Position Responsibilities :


About the Role

We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.

This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.

Key Responsibilities

  • Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
  • Optimize and manage SQL Server database interactions for performance and scalability
  • Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
  • Participate in code reviews, architecture discussions, and technical planning
  • Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
  • Troubleshoot and resolve complex technical issues across the stack
  • Ensure code quality, maintainability, and adherence to best practices
  • Stay current with emerging technologies and recommend improvements where applicable


Qualifications

  • Curiosity, passion, teamwork, and initiative
  • Strong experience with C# and .NET Core in enterprise application development
  • Solid understanding of SQL Server, including query optimization and schema design
  • Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
  • Ability to use agentic AI as a development aid, with a critical-thinking mindset
  • Familiarity with agile development methodologies and DevOps practices
  • Ability to work independently and collaboratively in a fast-paced environment
  • Excellent problem-solving and communication skills
  • Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
  • Experience with ERP systems or other complex business applications is a plus

What We Offer

  • A chance to work on a product that directly impacts thousands of users worldwide
  • A collaborative and supportive engineering culture
  • Opportunities for professional growth and technical leadership
  • Competitive salary and benefits package
Read more
Remote only
1 - 3 yrs
₹3L - ₹5L / yr
skill iconHTML/CSS
skill iconJavascript
skill iconBootstrap
skill iconPHP
skill iconCodeIgniter
+1 more

Position: Full Stack Developer ( PHP Codeigniter)

Company : Mayura Consultancy Services

Experience: 2 yrs

Location : Bangalore

Skill: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)

Work Location: Work From Home (WFH)

Website : https://www.mayuraconsultancy.com/


Requirements :

  • Prior experience in Full Stack Development using PHP Codeigniter


Perks of Working with MCS :

  • Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
  • Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
  • Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
  • Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
  • Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.


Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.


Read more
NeoGenCode Technologies Pvt Ltd
Ritika Verma
Posted by Ritika Verma
Remote only
4 - 7 yrs
₹8L - ₹11L / yr
skill iconPHP
skill iconLaravel
SQL
MySQL
Object Oriented Programming (OOPs)
+1 more

Job Title: PHP Coordinator / Laravel Developer

Experience: 4+ Years

Work Mode: Work From Home (WFH)

Working Days: 5 Days


Job Description:

We are looking for an experienced PHP Coordinator / Laravel Developer to join our team. The ideal candidate should have strong expertise in PHP and Laravel framework, along with the ability to coordinate and manage development (as Team Lead) tasks effectively.

Key Responsibilities:

  • Develop, test, and maintain web applications using PHP and Laravel.
  • Coordinate with team members to ensure timely project delivery.
  • Write clean, secure, and efficient code.
  • Troubleshoot, debug, and optimize existing applications.
  • Collaborate with stakeholders to gather and analyze requirements.

Required Skills:

  • Strong experience in PHP and Laravel framework.
  • Good understanding of MySQL, RESTful APIs, and cloud platforms (AWS/Azure/GCP).
  • Familiarity with front-end technologies (HTML, CSS, JavaScript).
  • Excellent communication and coordination skills.
  • Ability to work independently in a remote environment.


Read more
Zonecheck
Niranjan G
Posted by Niranjan G
Remote, Prabhadevi
1 - 2 yrs
₹2.5L - ₹3L / yr
skill iconReact Native
skill iconReact.js
skill iconJavascript
SQL

Tech Stack / Requirements:

  1. Experience required: at least 1 – 2 years
  2. Candidates must have an IT/engineering background: B.E./B.Tech in Information Technology, Computer Science, B.Sc. IT, BCA, or a related field.
  3. Strong understanding of JavaScript
  4. Experience with React Native / Expo
  5. Familiarity with SQL
  6. Exposure to REST APIs integration
  7. Fast learner with strong problem-solving & debugging skills


Responsibilities:

  1. Build & improve mobile app features using React Native / Expo
  2. Develop and maintain web features using React.js / Next.js
  3. Integrate APIs and ensure seamless user experiences across platforms
  4. Collaborate with backend & design teams for end-to-end development
  5. Debug & optimize performance across mobile and web
  6. Write clean, maintainable code and ship to production regularly


Work closely with the founding team / CTO and contribute to product launches


Growth: Performance-based growth with significant hikes possible in the same or upcoming months.

Read more
Remote only
6 - 10 yrs
₹8L - ₹15L / yr
Informatica IICS/IDMC
Informatica PowerCenter
ETL
SQL
Data migration
+1 more

Job Title : Informatica Cloud Developer / Migration Specialist

Experience : 6 to 10 Years

Location : Remote

Notice Period : Immediate


Job Summary :

We are looking for an experienced Informatica Cloud Developer with strong expertise in Informatica IDMC/IICS and experience in migrating from PowerCenter to Cloud.

The candidate will be responsible for designing, developing, and maintaining ETL workflows, data warehouses, and performing data integration across multiple systems.


Mandatory Skills :

Informatica IICS/IDMC, Informatica PowerCenter, ETL Development, SQL, Data Migration (PowerCenter to IICS), and Performance Tuning.


Key Responsibilities :

  • Design, develop, and maintain ETL processes using Informatica IICS/IDMC.
  • Work on migration projects from Informatica PowerCenter to IICS Cloud.
  • Troubleshoot and resolve issues related to mappings, mapping tasks, and taskflows.
  • Analyze business requirements and translate them into technical specifications.
  • Conduct unit testing, performance tuning, and ensure data quality.
  • Collaborate with cross-functional teams for data integration and reporting needs.
  • Prepare and maintain technical documentation.

Required Skills :

  • 4 to 5 years of hands-on experience in Informatica Cloud (IICS/IDMC).
  • Strong experience with Informatica PowerCenter.
  • Proficiency in SQL and data warehouse concepts.
  • Good understanding of ETL performance tuning and debugging.
  • Excellent communication and problem-solving skills.
Read more
KDK Softwares

at KDK Softwares

1 recruiter
Priyanka Khandelwal
Posted by Priyanka Khandelwal
Remote, Jaipur
5 - 10 yrs
₹5L - ₹12L / yr
SQL
skill iconJavascript
skill iconAmazon Web Services (AWS)
SQL Azure
HTTP
+1 more

Role & responsibilities


  • Develop and maintain server-side applications using Go (Golang).
  • Design and implement scalable, secure, and maintainable RESTful APIs and microservices.
  • Collaborate with front-end developers to integrate user-facing elements with server-side logic.
  • Optimize applications for performance, reliability, and scalability.
  • Write clean, efficient, and reusable code that adheres to best practices.


Preferred candidate profile


  • Minimum 5 years of working experience in Go (Golang) development.
  • Proven experience in developing RESTful APIs and microservices.
  • Familiarity with cloud platforms like AWS, GCP, or Azure.
  • Familiarity with CI/CD pipelines and DevOps practices.


Read more
Remote only
10 - 17 yrs
₹20L - ₹30L / yr
Apache Spark
Big Data
skill iconScala
skill iconPython
databricks
+1 more

Position: Senior Data Engineer


Overview:

We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.


Reporting: Reports to the CEO or designated Lead as assigned by management.

Employment Type: Full-time, Permanent

Location: Remote (Pan India)

Shift Timings: 2:00 PM – 11:00 PM IST


Key Responsibilities:

  • Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks.
  • Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
  • Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
  • Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
  • Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.


Required Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or related field.
  • 10+ years of hands-on experience in Data Engineering.
  • 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
  • Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
  • Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
  • Proven experience with Databricks and cloud-based data ecosystems.
  • Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
  • Skilled in Linux environments and shell scripting for automation and system tasks.
  • Experience working within Agile teams.
  • Knowledge of Machine Learning concepts is an added advantage.
  • Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
  • Excellent communication and collaboration skills in English.



About the Organization:

We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.


What We Offer:

Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:

  • Paid Holidays and Leaves
  • Performance Bonuses and Incentives
  • Comprehensive Medical Policy
  • Company-Sponsored Training Programs

We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.

Read more
Netroadshow

Netroadshow

Agency job
via MWIDM by Priyanka Maurya
Remote only
5 - 9 yrs
₹22L - ₹30L / yr
skill icon.NET
skill iconC#
Microservices
ASP.NET
Unit testing
+2 more

Required Skills: 

  • 4+ years of experience designing, developing, and implementing enterprise-level, n-tier software solutions.
  • Proficiency with Microsoft C# is a must.  
  • In-depth experience with .NET framework and .NET Core.  
  • Knowledge of OOP, server technologies, and SOA is a must; 3+ years of microservice experience.
  • Relevant experience with database design and SQL (Postgres is preferred).  
  • Experience with ORM tooling.  
  • Experience delivering software that is correct, stable, and security compliant.  
  • Basic understanding of common cloud platforms (good to have).
  • Financial services experience is strongly preferred.  
  • Thorough understanding of XML/JSON and related technologies.  
  • Thorough understanding of unit, integration, and performance testing for APIs.   
  • Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.  
  • Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving. 



Read more
Nyx Wolves
Remote only
5 - 7 yrs
₹11L - ₹13L / yr
SQL
Data modeling
Web performance optimization
Data engineering

Now Hiring: Tableau Developer (Banking Domain) 🚀

We’re looking for a Tableau professional with 6+ years of experience to design and optimize dashboards for Banking & Financial Services.


🔹 Design & optimize interactive Tableau dashboards for large banking datasets

🔹 Translate KPIs into scalable reporting solutions

🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS

🔹 Collaborate with business analysts, data engineers, and banking experts

🔹 Bring deep knowledge of SQL, data modeling, and performance optimization


🌍 Location: Remote

📊 Domain Expertise: Banking / Financial Services


✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!


Bring your data visualization skills to transform banking intelligence & compliance reporting.


Read more
Remote only
0 - 0 yrs
₹3000 - ₹3500 / mo
SQL

About the Role

We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.

Responsibilities

  • Assist in designing and building data pipelines for structured and unstructured data.
  • Support ETL (Extract, Transform, Load) processes to prepare data for analytics.
  • Work with databases (SQL/NoSQL) for data storage and retrieval.
  • Help optimize data workflows for performance and scalability.
  • Collaborate with data scientists and analysts to ensure data quality and consistency.
  • Document workflows, schemas, and technical processes.
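As a minimal sketch of the extract-transform-load work described above (the table name, fields, and sample data are hypothetical, not taken from the posting), one ETL step can be built with Python's standard library alone, loading into SQLite as a stand-in for a warehouse:

```python
import csv
import io
import sqlite3

# Extract: raw CSV as it might arrive from an upstream system (hypothetical data).
RAW = """user_id,signup_date,country
1,2024-01-05,IN
2,2024-02-11,US
3,,IN
"""

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with a missing signup_date and normalize country to lowercase."""
    return [
        {"user_id": int(r["user_id"]),
         "signup_date": r["signup_date"],
         "country": r["country"].lower()}
        for r in rows
        if r["signup_date"]
    ]

def load(rows, conn):
    """Insert cleaned rows into a SQLite table and return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (user_id INTEGER, signup_date TEXT, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO users VALUES (:user_id, :signup_date, :country)", rows
    )
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # 2 rows survive the transform
```

Production pipelines swap the pieces for real sources, a scheduler, and a warehouse, but the extract/transform/load separation stays the same.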

Requirements

  • Strong interest in data engineering, databases, and big data systems.
  • Basic knowledge of SQL and relational database concepts.
  • Familiarity with Python, Java, or Scala for data processing.
  • Understanding of ETL concepts and data pipelines.
  • Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
  • Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
  • Good problem-solving skills and ability to work independently in a remote setup.

What You’ll Gain

  • Hands-on experience in data engineering and ETL pipelines.
  • Exposure to real-world data workflows.
  • Mentorship and guidance from experienced engineers.
  • Completion Certificate upon successful completion.
  • Letter of Recommendation based on performance.

Internship Details

  • Duration: 3 months
  • Location: Remote (Work from Home)
  • Stipend: Unpaid
  • Perks: Completion Certificate + Letter of Recommendation


Read more
Data Axle

at Data Axle

2 candid answers
Eman Khan
Posted by Eman Khan
Remote, Pune
4 - 9 yrs
Best in industry
skill iconMachine Learning (ML)
skill iconPython
SQL
PySpark
XGBoost

About Data Axle:

Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business & consumer databases.


Data Axle Pune is pleased to have achieved certification as a Great Place to Work!


Roles & Responsibilities:

We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.


We are looking for a Senior Data Scientist who will be responsible for:

  1. Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
  2. Design or enhance ML workflows for data ingestion, model design, model inference and scoring
  3. Oversight on team project execution and delivery
  4. If senior, establish peer review guidelines for high quality coding to help develop junior team members’ skill set growth, cross-training, and team efficiencies
  5. Visualize and publish model performance results and insights to internal and external audiences


Qualifications:

  1. Masters in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
  2. Minimum of 3.5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
  3. Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
  4. Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
  5. Proficiency in Python and SQL required; PySpark/Spark experience a plus
  6. Ability to conduct a productive peer review and maintain proper code structure in GitHub
  7. Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
  8. Working knowledge of modern CI/CD methods.

This position description is intended to describe the duties most frequently performed by an individual in this position.


It is not intended to be a complete list of assigned duties but to describe a position level.
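One of the qualifications above mentions A/B testing. As a small, hedged illustration (the conversion counts below are invented, not drawn from the posting), a two-proportion z-test for comparing variant conversion rates can be computed with nothing beyond Python's standard library:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that both rates are equal.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, expressed with math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign results: variant B converts 5.8% vs A's 5.0%.
z, p = two_proportion_z(conv_a=500, n_a=10_000, conv_b=580, n_b=10_000)
print(round(z, 2), round(p, 4))
```

In practice a data scientist would reach for a statistics library, but the arithmetic above is the same test those libraries implement.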

Read more
Cravingcode Technologies Pvt Ltd
Didhiti Dasgupta
Posted by Didhiti Dasgupta
Remote only
2 - 5 yrs
₹5L - ₹9L / yr
skill icon.NET
skill iconAngular (2+)
Web API
SQL
Entity Framework

Job Description: .NET + Angular Full Stack Developer

Position: Full Stack Developer (.NET + Angular)

Experience: 3 – 5 Years


About the Role

We are looking for a highly skilled .NET + Angular Full Stack Developer to join our dynamic team. The ideal candidate should have strong expertise in both back-end and front-end development, hands-on experience with .NET Core and Angular, and a passion for building scalable, secure, and high-performance applications.


Key Responsibilities

  • Design, develop, and maintain scalable, high-quality web applications using .NET 8, ASP.NET MVC, Web API, and Angular 13+.
  • Build and integrate RESTful APIs and ensure seamless communication between front-end and back-end services.
  • Develop, optimize, and maintain SQL Server (2012+) databases, ensuring high availability, performance, and reliability.
  • Write complex stored procedures, functions, triggers, and perform query tuning and indexing for performance optimization.
  • Work with Entity Framework/EF Core to implement efficient data access strategies.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Implement OAuth 2.0 authentication/authorization for secure access control.
  • Write clean, testable, and maintainable code following Test-Driven Development (TDD) principles.
  • Use GIT / TFVC for version control and collaborate using Azure DevOps Services for CI/CD pipelines.
  • Participate in code reviews, troubleshoot issues, and optimize application performance.
  • Stay updated with emerging technologies and recommend improvements to enhance system architecture.


Required Technical Skills

  • 3+ years of experience in .NET development (C#, .NET 8, ASP.NET MVC, Web API).
  • Strong experience in SQL Server development, including:
      • Query tuning, execution plan analysis, and performance optimization.
      • Designing and maintaining indexes, partitioning strategies, and database normalization.
      • Handling large datasets and optimizing stored procedures for scalability.
      • Experience with SQL Profiler, Extended Events, and monitoring tools.
  • Proficiency in Entity Framework / EF Core for ORM-based development.
  • Familiarity with PostgreSQL and cross-database integration is a plus.
  • Expertise in Angular 13+, HTML5, CSS, TypeScript, JavaScript, and Bootstrap.
  • Experience with REST APIs development and integration.
  • Knowledge of OAuth 2.0 and secure authentication methods.
  • Hands-on experience with GIT/TFVC and Azure DevOps for source control and CI/CD pipelines.
  • Basic knowledge of Node.js framework is a plus.
  • Experience with unit testing frameworks like NUnit, MSTest, etc.


Soft Skills

  • Strong problem-solving and analytical skills, particularly in debugging performance bottlenecks.
  • Excellent communication and collaboration abilities.
  • Ability to work independently and in a team environment.
  • Attention to detail and a passion for writing clean, scalable, and optimized code.


Read more
AT
Remote only
5 - 10 yrs
₹5L - ₹10L / yr
skill iconReact.js
skill iconNextJs (Next.js)
skill iconNodeJS (Node.js)
skill iconRedux/Flux
TypeScript
+4 more

Full Stack Engineer (Frontend Strong, Backend Proficient)

5-10 Years Experience

Contract: 6months+extendable

Location: Remote

Technical Requirements

Frontend Expertise (Strong)


*Need at least 4 years in React web development, Node & AI.*


● Deep proficiency in React, Next.js, TypeScript

● Experience with state management (Redux, Context API)

● Frontend testing expertise (Jest, Cypress)

● Proven track record of achieving high Lighthouse performance scores

Backend Proficiency

● Solid experience with Node.js, NestJS (preferred), or ExpressJS

● Database management (SQL, NoSQL)

● Cloud technologies experience (AWS, Azure)

● Understanding of OpenAI and AI integration capabilities (bonus)

Full Stack Integration

● Excellent ability to manage and troubleshoot integration issues between frontend and backend systems

● Experience designing cohesive systems with proper separation of concerns

Read more
SupplyHouse
Susannah York
Posted by Susannah York
Remote only
3 - 6 yrs
₹22L - ₹28L / yr
skill iconJava
skill iconSpring Boot
SQL

Real people. Real service.


At SupplyHouse.com, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we’re dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers.


Through an Employer of Record (EOR), we are looking for a new, remote Backend Engineer in India to join our growing IT Team. This individual will report into our Full Stack Team Lead and have the opportunity to work on impactful projects that enhance our e-commerce platform and internal operations, while honing your skills in backend and full stack development. If you’re passionate about creating user-friendly interfaces, building scalable systems, and contributing to innovative solutions in a collaborative and fun environment, we’d love to hear from you! 


Role Type: Full-Time

Location: Remote from India

Schedule: Monday through Friday, 4:00 a.m. – 1:00 p.m. U.S. Eastern Time / 12:00 p.m. – 9:00 p.m. Indian Standard Time to ensure effective collaboration 

Base Salary: $25,000 - $30,000 USD per year


Responsibilities:

  • Collaborate with cross-functional teams to gather and refine requirements, ensuring alignment with business needs.
  • Design, develop, test, deploy, and maintain scalable, high-performance software applications.
  • Develop and enhance internal tools and applications to improve company operations.
  • Ensure system reliability, optimize application performance, and implement best practices for scalability.
  • Continuously improve existing codebases, conducting code reviews, and implementing modern practices.
  • Stay up to date with emerging technologies, trends, and best practices in software development.


Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
  • 3+ years of hands-on experience in backend and/or full-stack development with a proven track record of delivering high-quality software.

Back-End Skills:

  • Proficiency in Java and experience with back-end frameworks like Spring Boot.
  • Strong understanding of database design, RDBMS concepts, and experience with SQL.
  • Knowledge of RESTful API design and integration.

Development Lifecycle: Proven ability to contribute across the entire software development lifecycle, including planning, design, coding, testing, deployment, and maintenance.

Tools & Practices:

  • Familiarity with version control systems, like Git, and CI/CD pipelines.
  • Experience with agile development methodologies.

Additional Skills:

  • Strong problem-solving and debugging capabilities.
  • Ability to create reusable code libraries and write clean, maintainable code.
  • Strong communication and collaboration skills to work effectively within a team and across departments.
  • High-level proficiency of written and verbal communication in English.


Preferred Qualifications:

  • Proficiency in HTML5, CSS3, JavaScript (ES6+), and responsive design principles.
  • Expertise in modern JavaScript frameworks and libraries such as React, Angular, or Vue.js.
  • Experience with cross-browser compatibility and performance optimization techniques. 
  • Experience working on Frontend responsibilities such as:
      • Designing and implementing reusable, maintainable UI components and templates.
      • Working closely with Designers to ensure technical feasibility and adherence to UI/UX design standards.
      • Managing and updating promotional banners and site-wide templates to ensure timely execution of marketing initiatives.


Why work with us: 

  • We have awesome benefits – We offer a wide variety of benefits to help support you and your loved ones. These include: Comprehensive and affordable medical, dental, vision, and life insurance options; Competitive Provident Fund contributions; Paid casual and sick leave, plus country-specific holidays; Mental health support and wellbeing program; Company-provided equipment and one-time $250 USD work from home stipend; $750 USD annual professional development budget; Company rewards and recognition program; And more!
  • We promote work-life balance – We value your time and encourage a healthy separation between your professional and personal life to feel refreshed and recharged. Look out for our 100% remote schedule and wellness initiatives! 
  • We support growth – We strive to innovate every day. In an exciting and evolving industry, we provide potential for career growth through our hands-on training, access to the latest technologies and tools, diversity and inclusion initiatives, opportunities for internal mobility, and professional development budget.
  • We give back – We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for DE&I organizations, and more.
  • We listen – We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day. 


Check us out and learn more at https://www.supplyhouse.com/our-company


Additional Details: 

  • Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations.   
  • SupplyHouse.com is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position. 
  • To ensure fairness, all application materials, assessments, and interview responses must reflect your own original work. The use of AI tools, plagiarism, or any uncredited assistance is not permitted at any stage of the hiring process and may result in disqualification. We appreciate your honesty and look forward to seeing your skills. 
  • We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations.
  • All emails from the SupplyHouse team will only be sent from an @supplyhouse.com email address. Please exercise caution if you receive an email from an alternate domain.


What is an Employer of Record (EOR)?

Through our partnership with Remote.com, a global Employer of Record (EOR), you can join SupplyHouse from home, while knowing your employment is handled compliantly and securely. Remote takes care of the behind-the-scenes details – like payroll, benefits, taxes, and local compliance – so you can focus on your work and career growth. Even though Remote manages these administrative functions, you’ll be a part of the SupplyHouse team: connected to our culture, collaborating with colleagues, and contributing to our shared success. This partnership allows us to welcome talented team members worldwide while ensuring you receive a best-in-class employee experience.

Read more
Uphance LLC

at Uphance LLC

2 candid answers
Abhishek Shah
Posted by Abhishek Shah
Remote only
6 - 11 yrs
₹30L - ₹40L / yr
skill iconRuby
skill iconRuby on Rails (ROR)
Enterprise Resource Planning (ERP)
Enterprise architecture
skill iconJavascript
+2 more

We seek a highly skilled and experienced Ruby on Rails Development Team Lead/Architect to join our dynamic team at Uphance. The ideal candidate will have proven expertise in leading and architecting RoR projects, focusing on building scalable, high-quality applications. This role requires a combination of technical leadership, mentorship, and a strong commitment to best practices in software development.


Job Type: Contract/Remote/Full-Time/Long-term


Responsibilities:

  • Develop and maintain high-quality Ruby on Rails applications that meet our high-quality standards.
  • Design, build, and maintain efficient, reusable, and reliable Ruby code.
  • Utilise your expertise in Ruby on Rails to enhance the performance and reliability of our platform.
  • Set the technical direction for the existing RoR project, including system architecture and technology stack decisions.
  • Guide and mentor team members to enhance their technical skills and understanding of RoR best practices.
  • Conduct code reviews to maintain high coding standards and ensure adherence to best practices.
  • Optimise application performance, focusing on ActiveRecord queries and overall architecture.
  • Tackle complex technical challenges and provide efficient solutions, particularly when specifications are unclear or incomplete.
  • Establish and enforce testing protocols; write and guide the team in writing effective tests.
  • Define and ensure consistent adherence to best practices, particularly in the context of large applications.
  • Manage the development process using Agile methodologies, possibly acting as a Scrum Master if required.
  • Work closely with product managers, designers, and other stakeholders to meet project requirements and timelines.


Technical Requirements and Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • Proven experience with Ruby on Rails, MySQL, HTML, and JavaScript (6+ years)
  • Extensive experience with Ruby on Rails and familiarity with its best practices
  • Proven track record of technical leadership and team management
  • Strong problem-solving skills and the ability to address issues with incomplete specifications
  • Proficiency in performance optimisation and software testing
  • Experience with Agile development and Scrum practices
  • Excellent mentoring and communication skills
  • Experience with large-scale application development
  • Application performance monitoring/tuning


General Requirements:

  • Availability to work during IST working hours.
  • High-speed Internet and the ability to join technical video meetings during business hours.
  • Strong analytical and problem-solving skills and ability to work as part of multi-functional teams.
  • Ability to collaborate and be a team player.


Why Uphance?

  • Engage in Innovative Projects: Immerse yourself in cutting-edge projects that not only test your skills but also encourage the exploration of new design realms.
  • AI-Integrated Challenges: Take on projects infused with AI, pushing the boundaries of your abilities and allowing for exploration in uncharted territories of software design and development.
  • Flexible Work Environment: Whether you embrace the digital nomad lifestyle or prefer the comfort of your own space, Uphance provides the freedom to design and create from any corner of the globe.
  • Inclusive Team Environment: Join a dynamic, international, and inclusive team that values and celebrates diverse ideas.
  • Collaborative Team Dynamics: Become a part of a supportive and motivated team that shares in the celebration of collective successes.
  • Recognition and Appreciation: Your accomplishments will be acknowledged and applauded regularly in our Recognition Rally.


Compensation:

Salary Range: INR 24 LPA to INR 32 LPA (Salary is not a constraint for the right candidate)


At Uphance, we value innovation, collaboration, and continuous learning. As part of our team, you'll have the opportunity to lead a group of talented RoR developers, contribute to exciting projects, and play a key role in our company's success. If you are passionate about Ruby on Rails and thrive in a leadership role, we would love to hear from you. Apply today and follow us on LinkedIn - https://www.linkedin.com/company/uphance!

Read more
Cymetrix Software

at Cymetrix Software

2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Remote only
3 - 7 yrs
₹8L - ₹20L / yr
Google Cloud Platform (GCP)
ETL
Python
Big Data
SQL
+4 more

Must have skills:

1. GCP - GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP Cloud - Build pipelines (Python/Java) + Scripting, Best Practices, Challenges

3. Knowledge of batch and streaming data ingestion; ability to build end-to-end data pipelines on GCP

4. Knowledge of databases (SQL and NoSQL), on-premise and on-cloud; SQL vs NoSQL trade-offs; types of NoSQL databases (at least 2)

5. Data Warehouse concepts - Beginner to Intermediate level
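
The batch ETL skills above can be sketched as a minimal extract/transform/load pipeline. This is an illustrative, in-memory sketch only: the source rows, field names, and the list standing in for a warehouse table are hypothetical, and a real GCP pipeline would instead load into BigQuery (for example via the google-cloud-bigquery client) and run under Dataflow or Composer/Airflow.

```python
# Minimal batch ETL sketch: extract raw records, cleanse/transform them,
# and "load" them into an in-memory stand-in for a warehouse table.

def extract(source_rows):
    """Extract: yield raw records from a (hypothetical) source system."""
    yield from source_rows

def transform(row):
    """Transform: cleanse fields and enforce a simple schema."""
    return {
        "order_id": int(row["order_id"]),
        "customer": row["customer"].strip().title(),
        "amount": round(float(row["amount"]), 2),
    }

def load(rows, warehouse):
    """Load: append transformed rows to the target table; return row count."""
    rows = list(rows)
    warehouse.extend(rows)
    return len(rows)

raw = [
    {"order_id": "1", "customer": "  alice ", "amount": "19.999"},
    {"order_id": "2", "customer": "BOB", "amount": "5"},
]
warehouse = []
loaded = load((transform(r) for r in extract(raw)), warehouse)
print(loaded)                    # 2
print(warehouse[0]["customer"])  # Alice
```

Keeping extract, transform, and load as separate functions mirrors the best-practice separation the posting asks about: each stage can be tested, retried, and scaled independently.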


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Design and implement Dimensional and Fact tables.

● Identify and implement data transformation/cleansing requirements.

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform, and load data from various systems into the Enterprise Data Warehouse.

● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.

● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements and implement the BI solutions.

● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform it into reporting and analytics.

● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.

● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.
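
The dimensional-modelling responsibility above can be sketched as splitting flat transactional records into a dimension table and a fact table (a simple star schema). The record layout, field names, and surrogate-key scheme here are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical star-schema sketch: build a customer dimension with
# surrogate keys, then a sales fact table that references it.

transactions = [
    {"customer": "Alice", "region": "EU", "amount": 120.0},
    {"customer": "Bob",   "region": "US", "amount": 80.0},
    {"customer": "Alice", "region": "EU", "amount": 40.0},
]

# Dimension: one surrogate key per unique (customer, region) pair.
dim_customer = {}
for t in transactions:
    key = (t["customer"], t["region"])
    if key not in dim_customer:
        dim_customer[key] = {
            "customer_key": len(dim_customer) + 1,
            "name": t["customer"],
            "region": t["region"],
        }

# Fact: measures plus a foreign key into the dimension.
fact_sales = [
    {"customer_key": dim_customer[(t["customer"], t["region"])]["customer_key"],
     "amount": t["amount"]}
    for t in transactions
]

print(len(dim_customer))  # 2 distinct customers
print(sum(f["amount"] for f in fact_sales if f["customer_key"] == 1))  # 160.0
```

Separating descriptive attributes (the dimension) from measures (the fact) is what keeps the warehouse queryable: BI tools aggregate the narrow fact table and join out to dimensions only for labels and filters.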

Read more
Get to hear about interesting companies hiring right now
Follow Cutshort on LinkedIn
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.
Find more jobs