
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Tap Invest
Anusree TP
Posted by Anusree TP
Bengaluru (Bangalore)
1 - 2 yrs
₹3L - ₹5L / yr
SQL
Python
pandas
Data Analytics
Business Analysis

As an Analyst at Tap Invest, you’ll turn data into decisions. You’ll work with teams across Product, Ops, Marketing, and Sales to uncover insights, solve real business problems, and drive strategy.

This role is for someone who is comfortable working with data independently and can support business teams with reliable analysis and reporting.

Key Responsibilities

● Gather, organize, and clean data from various sources, including databases, spreadsheets, and external sources, to ensure accuracy and completeness.
● Write SQL queries to pull, validate, and clean data from production databases.
● Build and maintain dashboards and generate KPI reports; track performance against targets and identify areas for optimization.
● Analyze user funnels and investment patterns to surface actionable insights.
● Prepare and present clear, concise reports and visualizations to communicate findings and recommendations to stakeholders across teams.
● Document data definitions, metrics, and assumptions clearly for consistency and reuse.
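The pull-validate-clean loop above can be sketched with Python’s built-in sqlite3 standing in for a production database (the table, columns, and data are invented for illustration):

```python
import sqlite3

# Illustrative in-memory stand-in for a production database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE investments (user_id INTEGER, amount REAL);
    INSERT INTO investments VALUES (1, 5000.0), (1, 5000.0), (2, NULL), (3, 12000.0);
""")

# Pull per-user totals; the subquery cleans duplicates, the WHERE validates completeness.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS total
    FROM (SELECT DISTINCT user_id, amount FROM investments)  -- clean: drop exact duplicates
    WHERE amount IS NOT NULL                                 -- validate: drop incomplete rows
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 5000.0), (3, 12000.0)]
```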

What We’re Looking For

● 1 to 2 years of experience in Data Analytics, Business Analytics, or a similar role.
● Comfortable writing SQL and validating queries.
● Solid with Excel / Google Sheets (pivot tables, lookups, charts).
● Genuine curiosity about how businesses use data to make decisions.
● Experience writing scripts for data automation.
● Prior projects involving production datasets.

Nice to Have

● Familiarity with pandas or another data-manipulation library for advanced automation.
● Interest in capital markets, bonds, fixed income, or FinTech.
● Exposure to AI tools.

Tradelab Technologies

Aakanksha Yadav
Posted by Aakanksha Yadav
Bengaluru (Bangalore)
2 - 5 yrs
₹7L - ₹12L / yr
RMS
OMS
Linux/Unix
SQL
API

🚨 We’re Hiring | Support Engineer – Trading & Capital Markets 🚨

📍 Location: Bangalore

🕘 Shift: Night Shift

🎯 Domain Priority: Trading / Capital Markets ONLY


We are looking for a Support Engineer with hands-on experience in trading systems and capital markets, who thrives in fast-paced, high-availability environments and is passionate about client-facing technical support.


Must-Have Qualifications

  • Bachelor’s degree in Computer Science, IT, or a related field
  • 2+ years of experience in Application / Technical Support, preferably in the broking or trading domain
  • Strong understanding of Capital Markets – Equity, F&O, Currency, Commodities
  • Solid technical troubleshooting skills: Linux/Unix, SQL, and log analysis
  • Familiarity with Trading Systems, RMS, OMS, APIs (REST/FIX), and order lifecycle
  • Excellent communication and interpersonal skills for effective client interaction
  • Ability to work under pressure during live trading hours and manage multiple priorities
  • Customer-centric mindset with strong problem-solving and relationship-building abilities
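As a rough illustration of the log-analysis skill listed above, here is a small Python sketch with an invented log format (real trading-platform logs will differ):

```python
from collections import Counter

# Invented log lines in a typical "TIME LEVEL COMPONENT message" shape.
log = """\
09:15:01 INFO OMS order 101 accepted
09:15:02 ERROR RMS margin check failed for order 102
09:15:03 ERROR OMS reject: unknown symbol
09:15:04 ERROR RMS margin check failed for order 103
"""

# Count errors per component to see where incidents cluster during live trading.
errors = Counter(
    line.split()[2] for line in log.splitlines() if line.split()[1] == "ERROR"
)
print(errors.most_common())  # [('RMS', 2), ('OMS', 1)]
```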

🔍 Key Responsibilities

  • Act as the primary point of contact for clients reporting issues related to trading applications and platforms
  • Log, track, and monitor incidents using internal tools and ensure resolution within defined TATs
  • Coordinate with Development, QA, Infrastructure, and other internal teams to drive timely resolution
  • Provide clear, proactive, and regular updates to clients and internal stakeholders
  • Maintain detailed logs of incidents, escalations, resolutions, and fixes for audits and future reference
  • Support clients with queries related to system functionality, performance, and usage
  • Communicate proactively with clients regarding product enhancements, features, and updates


⚠️ Note: Candidates with experience in the Trading / Capital Markets domain will be given first priority.

📩 Interested candidates can comment “Interested” or share their resume via DM.

Let’s connect and build reliable trading support systems together!

Deltek
Shamitha ID
Posted by Shamitha ID
Remote only
10 - 15 yrs
Best in industry
Database architecture
.NET
Data architecture
SQL
SQL Server

Position Responsibilities :


The Database Architect is responsible for the design, optimization, and evolution of the database layer supporting enterprise applications in cloud environments. This role focuses on ensuring efficient data access patterns, scalable query workloads, and robust database architecture capable of supporting high-volume and high-concurrency systems.

This role goes beyond traditional database administration and requires deep technical expertise in SQL query optimization, database internals, and performance diagnostics. The Database Architect analyzes how applications interact with the database and guides improvements in schema design, data access patterns, and system scalability.

As a senior technical leader, the Database Architect helps define long-term strategies for scalable and efficient data architecture while working closely with engineering teams to promote best practices for database design and SQL development.


KEY RESPONSIBILITIES

Database Architecture & Optimization

  • Design and evolve database architectures for scalable enterprise systems.
  • Define efficient data access patterns that support high concurrency and large datasets.
  • Improve schema design, indexing strategies, and query patterns.
  • Ensure database designs support both transactional and data consumption workloads.

SQL Performance Engineering

  • Analyze and optimize complex SQL queries and execution plans.
  • Improve database performance through indexing strategies, statistics management, and query tuning.
  • Investigate workload behaviour and recommend architectural improvements.
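A minimal demonstration of how an indexing strategy changes a query’s execution plan, using Python’s sqlite3 as a stand-in for SQL Server (the plan text shown is SQLite’s, not SQL Server’s; names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the access-path description in the last column.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan
conn.execute("CREATE INDEX ix_orders_customer ON orders(customer_id)")
after = plan(query)   # index search

print(before)  # e.g. "SCAN orders"
print(after)   # e.g. "SEARCH orders USING INDEX ix_orders_customer (customer_id=?)"
```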

Data Access & Systems Thinking

  • Provide guidance on scalable approaches for retrieving and delivering data for data-intensive application features.
  • Recommend architectural strategies such as data aggregation, caching, or pre-computed datasets where appropriate.
  • Apply systems thinking to improve how data is modeled, accessed, and delivered across the application.

Advanced Diagnostics

  • Diagnose database behaviour using tools such as Query Store, Extended Events, and execution plan analysis.
  • Analyze query performance, wait statistics, and workload patterns to identify optimization opportunities.

Collaboration & Technical Leadership

  • Partner with engineering teams to guide scalable SQL development and data access practices.
  • Participate in architecture and design discussions involving database interactions.
  • Document best practices and architectural recommendations.

AI-Assisted Engineering

  • Use AI-assisted tools to accelerate query analysis, diagnostics, and workload investigations.
  • Validate AI-generated insights through empirical testing and database telemetry.


Qualifications :

TECHNICAL SKILLS & EXPERTISE


Database & SQL Server (Required)

  • Advanced SQL Server performance tuning, including query optimization, execution plan analysis, and index design
  • Strong experience diagnosing and resolving deadlocks using Extended Events and deadlock graphs
  • Deep understanding of locking, blocking, and transaction behaviour, including wait statistics and lock escalation
  • Experience optimizing stored procedures, including mitigation of parameter sniffing and plan cache management
  • Strong knowledge of indexing strategies, including covering indexes and filtered indexes
  • Solid understanding of statistics, cardinality estimation, and query optimizer behaviour
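The covering-index idea above can be shown in miniature with SQLite via Python (again a stand-in for SQL Server; table and index names are invented): when the index holds every column the query references, the plan avoids the table lookup entirely.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account_id INTEGER, qty INTEGER)")
conn.execute("CREATE INDEX ix_cover ON trades(account_id, qty)")

# Both account_id (filter) and qty (output) live in the index itself.
detail = " ".join(
    row[-1] for row in conn.execute(
        "EXPLAIN QUERY PLAN SELECT qty FROM trades WHERE account_id = 1")
)
print(detail)  # mentions "COVERING INDEX"
```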


Performance Analysis Tools (Required)

  • Experience using SQL Server Profiler and Extended Events for workload diagnostics
  • Advanced execution plan analysis using SSMS or Azure Data Studio
  • Familiarity with SET STATISTICS IO/TIME for query performance evaluation
  • Strong experience using Query Store to analyse query performance and plan behaviour
  • Ability to diagnose issues through wait statistics and blocking chain analysis


Enterprise Application Data Architecture

  • Strong understanding of database design within multi-tier enterprise applications
  • Experience optimizing database workloads supporting high-concurrency systems and large datasets
  • Understanding how application query patterns influence database performance
  • Familiarity working with application platforms such as .NET, APIs, or modern web frameworks


Cloud & Enterprise Database Environments

  • Experience working with cloud-hosted database environments
  • Understanding of scalability considerations in enterprise systems
  • Experience analyzing and optimizing database workloads in production environments


QUALIFICATIONS

  • 8+ years of experience working with enterprise database systems
  • Proven expertise in SQL performance tuning and database workload optimization
  • Strong experience in analysing execution plans and database performance behaviour
  • Experience collaborating with engineering teams on data architecture and query design
  • Strong analytical and problem-solving skills


AI-FIRST MINDSET REQUIREMENT

We value engineers who view AI as a productivity multiplier. The ideal candidate actively leverages AI tools to accelerate diagnostics, analyze database workloads, and uncover optimization opportunities while applying strong engineering judgment to validate results.

WINIT
Aishwarya SURENDRAN
Posted by Aishwarya SURENDRAN
Hyderabad
0 - 2 yrs
₹3L - ₹7L / yr
React.js
React Native
SQL
AI Coding Tools

WINIT is looking for a Full Stack Developer with expertise in React Native, React.js & Backend


Company Name: WINIT (WINIT Mobile Sales Force Automation, winitsoftware.com)

Qualification: Any Graduate

Work experience: 0–2 years

Location: Hyderabad


Job Summary

We are looking for a talented and innovative Full Stack Developer with expertise in React Native, React.js, and backend technologies to join our team as a Vibe Coder. In this role, you’ll develop cutting-edge web and mobile applications while leveraging modern AI tools and best-in-class development practices to enhance productivity, performance, and user experience.

Key Responsibilities

  • Develop and maintain responsive web applications using React.js.
  • Build high-performance cross-platform mobile apps using React Native.
  • Design and develop scalable backend systems using Node.js, Express.js, and related technologies.
  • Integrate third-party services, RESTful APIs, and cloud platforms.
  • Optimize performance across frontend and backend components.
  • Use AI tools to streamline development tasks such as code generation, debugging, testing, and UX improvement.
  • Work collaboratively with product managers, UI/UX designers, and QA teams.
  • Participate in code reviews, contribute to technical documentation, and drive innovation within the development team.

Tech Stack & Tools (Vibe Coder Stack)

  • Frontend: React.js, React Native, Redux, JavaScript, TypeScript, HTML5, CSS3
  • Backend: Node.js, Express.js, REST APIs, Firebase Functions
  • Databases: MongoDB, PostgreSQL, Firebase Firestore
  • Dev Tools: Git, GitHub, VS Code, Postman, Docker, Swagger
  • Cloud & Deployment: AWS, Firebase, Vercel, Netlify
  • CI/CD & PM Tools: GitHub Actions, Trello, Jira, Notion

AI Tools & Utilities

  • GitHub Copilot / Amazon CodeWhisperer – AI pair programming
  • Cursor AI / Gemini / Claude code – Code assistance, debugging, documentation

Requirements

  • 1+ years of hands-on experience in full stack development.
  • Proficient in React Native, React.js, and Node.js.
  • Strong understanding of JavaScript and frontend/backend principles.
  • Experience with cloud-based deployments and mobile app publishing.
  • Familiar with Git-based workflows, API integration, and database systems.
  • Excellent problem-solving, debugging, and communication skills.
  • Ability to independently manage tasks and meet project deadlines.

Preferred (Not Mandatory)

  • Experience with GraphQL, TypeScript, or Next.js.
  • Familiarity with Agile/Scrum methodologies.
  • Exposure to AI/ML concepts or LLM integration in applications.


About our company:


We are an mSFA technology company that has evolved from the industry expertise we have gained over 25+ years. With over 600 success stories in mobility, digitization, and consultation, we are today the leaders in mSFA, with more than 75 enterprises trusting WINIT mSFA across the globe.

Our state-of-the-art support center provides 24x7 support to our customers worldwide. We continuously strive to help organizations improve their efficiency, effectiveness, market cap, brand recognition, distribution and logistics, regulatory and planogram compliance, and many more through our cutting-edge WINIT mSFA application.

We are committed to enabling our customers to be autonomous with our continuous R&D and improvement in WINIT mSFA. Our application provides customers with machine learning capability so that they can innovate, attain sustainable growth, and become more resilient.

At WINIT, we value diversity, personal and professional growth, and celebrate our global team of passionate individuals who are continuously innovating our technology to help companies tackle real-world problems head-on.



Wissen Technology

Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Snowflake
Data Transformation Tool (DBT)
SQL
Snowflake schema
Python

JD:

We are looking for a strong Data Engineer with hands-on experience building pipelines using Snowflake and DBT.

Key Responsibilities:

  • Develop, maintain, and optimize data pipelines using DBT and SQL on Snowflake DB.
  • Collaborate with data analysts, QA and business teams to build scalable data models.
  • Implement data transformations, testing, and documentation within the DBT framework.
  • Work on Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning.
  • Use Python (preferred) for automation, scripting, and additional data processing as needed.
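A toy version of the DBT-style transform-and-test workflow described above, using sqlite3 in place of Snowflake (the schema is invented; the "tests" mirror DBT’s not_null/unique conventions, which pass when the check query returns zero rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (ORDER_ID TEXT, AMOUNT TEXT);
    INSERT INTO raw_orders VALUES ('A1', '10.5'), ('A2', '7'), ('A2', '7');

    -- "Model": a cleaned, typed, de-duplicated staging view.
    CREATE VIEW stg_orders AS
        SELECT DISTINCT ORDER_ID AS order_id, CAST(AMOUNT AS REAL) AS amount
        FROM raw_orders;
""")

# "Tests": both queries must return zero rows for the model to pass.
null_rows = conn.execute(
    "SELECT * FROM stg_orders WHERE order_id IS NULL OR amount IS NULL").fetchall()
dup_rows = conn.execute(
    "SELECT order_id FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1").fetchall()
print(null_rows, dup_rows)  # [] []
```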

Required Skills:

  • 6+ years of experience in building data engineering pipelines.
  • Strong hands-on expertise with DBT and advanced SQL.
  • Experience working with modern columnar/MPP data warehouses, preferably Snowflake.
  • Knowledge of Python for data manipulation and workflow automation (preferred).
  • Good understanding of data modeling concepts, ETL/ELT processes, and best practices.


Uni Cards

Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
4+ yrs
Up to ₹35L / yr (varies)
Product Management
AI Coding Tools
SQL
MS-Excel
Startups

Job Title: Product Manager 2

Role Overview

We are looking for a driven and analytical Product Manager to own end-to-end product initiatives across our fintech ecosystem. You will work closely with Engineering, Growth, Risk, and Operations teams to build scalable, customer-centric solutions that drive measurable business impact.

Key Responsibilities

  • Own the product lifecycle from problem discovery → PRD → launch → iteration
  • Write clear and structured PRDs, user stories, and acceptance criteria
  • Collaborate cross-functionally with Engineering, Design, Growth, Risk, and Business teams
  • Define product roadmaps and prioritize based on impact and feasibility
  • Analyze product performance, funnels, and user behavior using data
  • Drive experimentation, A/B testing, and continuous optimization
  • Translate business goals into scalable product solutions
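The funnel analysis mentioned above typically reduces to counting distinct users per step; a minimal SQL sketch via Python’s sqlite3, with invented step names and data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, step TEXT);
    INSERT INTO events VALUES
        (1,'signup'), (1,'kyc'), (1,'invest'),
        (2,'signup'), (2,'kyc'),
        (3,'signup'),
        (4,'signup'), (4,'kyc');
""")

# Users reaching each funnel step; drop-off is visible step to step.
funnel = conn.execute("""
    SELECT step, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY step
    ORDER BY users DESC
""").fetchall()
print(funnel)  # [('signup', 4), ('kyc', 3), ('invest', 1)]
```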

Required Skills & Qualifications

  • 3–7 years of Product Management experience (preferably in fintech/startup environments)
  • Strong analytical skills with hands-on experience in SQL and Excel/Google Sheets
  • Experience working in Agile environments with engineering teams
  • Ability to think structurally and solve complex product problems
  • Strong stakeholder management and communication skills
  • Working understanding of APIs, integrations, and system workflows
  • Comfort using AI tools to enhance productivity, documentation, research, and product discovery
  • Basic understanding of prompt structuring to improve research, analysis, and workflow efficiency
Wissen Technology

Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹28L / yr
SQL
Python
Informatica
Data Transformation Tool (DBT)

Job Description:

We are looking for a skilled Database Developer with strong hands-on experience in SQL and Informatica, plus programming knowledge in Java or Python. The ideal candidate will design, develop, and maintain robust ETL pipelines and database solutions while collaborating with cross-functional teams to support business data needs and analytics initiatives.

Key Responsibilities: 

  • Design, develop, and optimize SQL queries, stored procedures, triggers, and views for high performance and scalability. 
  • Develop and maintain ETL workflows using Informatica PowerCenter (or Informatica Cloud). 
  • Integrate and automate data flows between systems using Java or Python for custom scripts and applications. 
  • Perform data analysis, validation, and troubleshooting to ensure data accuracy and consistency across systems. 
  • Work closely with business analysts, data engineers, and application teams to understand data requirements and translate them into efficient database solutions. 
  • Implement performance tuning, query optimization, and indexing strategies for large datasets. 
  • Maintain data security, compliance, and documentation of ETL and database processes. 
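A compressed sketch of the extract-transform-load flow such a pipeline performs, in plain Python with sqlite3 (Informatica would express this as a mapping; the source data and schema here are invented):

```python
import csv, io, sqlite3

# Invented source extract; a real flow would read from a file or upstream system.
source = io.StringIO("id,name,salary\n1, Alice ,50000\n2,Bob,62000\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary INTEGER)")

# Transform (trim whitespace, cast types) then load each record.
rows = [(int(r["id"]), r["name"].strip(), int(r["salary"]))
        for r in csv.DictReader(source)]
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", rows)

loaded = conn.execute("SELECT COUNT(*), SUM(salary) FROM employees").fetchone()
print(loaded)  # (2, 112000)
```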

Required Skills & Experience: 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5–8 years of hands-on experience as a SQL Developer or ETL Developer.
  • Strong proficiency in SQL (Oracle, SQL Server, or PostgreSQL).
  • Hands-on experience with Informatica PowerCenter / Informatica Cloud.
  • Programming experience in Java or Python (for automation, data integration, or API handling).
  • Good understanding of data warehousing concepts, ETL best practices, and performance tuning.
  • Experience working with version control systems (e.g., Git) and Agile/Scrum methodologies.

Good to Have: 

  • Exposure to cloud data platforms (AWS, Azure, or GCP). 
  • Familiarity with Unix/Linux scripting.
  • Experience in data modeling and data governance frameworks.

 

Service Co


Agency job
via Vikash Technologies by Rishika Teja
Pune
4 - 7 yrs
₹10L - ₹15L / yr
Java
Spring Boot
Microservices
SQL
NOSQL Databases
+2 more

Proficiency in Java 8+.

Solid understanding of REST APIs (Spring Boot), microservices, databases (SQL/NoSQL), and caching systems like Redis/Aerospike.

Familiarity with cloud platforms (AWS, GCP, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Chennai, Ahmedabad
4 - 6 yrs
₹8L - ₹15L / yr
ASP.NET
.NET Core
MVC
C#
SQL

Position: Microsoft .NET Full Stack Developer

Experience: 4–6 Years

Open Positions: 10

Location: PAN India (Final Round – Face-to-Face Interview)

Budget: Up to 15 LPA

Notice Period: Immediate joiners preferred

Key Responsibilities:

· Work on highly distributed and scalable system architecture
· Design, develop, test, and maintain high-quality software solutions
· Ensure performance, security, and maintainability of applications
· Collaborate with cross-functional teams and stakeholders
· Perform system testing and resolve technical issues


Required Skills:

· Strong experience in ASP.NET, C#, .NET Core, MVC
· Hands-on experience with SQL Server / PostgreSQL
· Experience in Angular / React (frontend technologies)
· Knowledge of microservices architecture & RESTful APIs
· Familiarity with the CQRS pattern
· Exposure to AWS / Docker / Kubernetes
· Experience with CI/CD pipelines (Azure DevOps, Jenkins)
· Knowledge of Node.js is an added advantage
· Understanding of Agile methodology
· Good exposure to cybersecurity and compliance
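Since the skills list asks for familiarity with the CQRS pattern, here is a deliberately tiny, language-agnostic sketch in Python (the role’s stack is .NET; class and method names are illustrative). Commands mutate the write side; queries read a separately maintained projection and have no side effects:

```python
# Write model: commands append events; read model: a projection updated per event.
class OrderService:
    def __init__(self):
        self._events = []   # write side (source of truth)
        self._totals = {}   # read side (denormalized projection)

    def place_order(self, customer, amount):  # command: mutates state
        self._events.append(("order_placed", customer, amount))
        self._totals[customer] = self._totals.get(customer, 0) + amount

    def total_for(self, customer):            # query: side-effect free
        return self._totals.get(customer, 0)

svc = OrderService()
svc.place_order("acme", 100)
svc.place_order("acme", 50)
print(svc.total_for("acme"))  # 150
```

In a full system the read model would live in its own store and be updated asynchronously from the event stream; collapsing both into one class keeps the shape of the pattern visible.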


Technology Stack:

· Microsoft .NET technologies (primary)
· Cloud platforms: AWS (SaaS/PaaS/IaaS)
· Databases: MSSQL, MongoDB, PostgreSQL
· Caching: Redis, Memcached
· Messaging queues: RabbitMQ, Kafka, SQS

 

Wissen Technology

Shrutika SaileshKumar
Posted by Shrutika SaileshKumar
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Tableau
SQL

 Role Summary

We are seeking a skilled and business-oriented BI Developer with 4 to 5 years of relevant experience in business intelligence, reporting, and data analytics. This role is ideal for a hands-on analyst who can translate business requirements into impactful dashboards, reports, and actionable insights using Tableau, Power BI, and SQL.

 

The successful candidate will work in a cross-functional environment, partnering with business stakeholders, data teams, and technology teams to support reporting modernization, improve data visibility, and enable better decision-making. This role requires strong technical capability, a solid understanding of data modeling fundamentals, and the ability to communicate insights clearly to both technical and non-technical audiences.

As a BI Developer, you will play a key role in designing, developing, and maintaining reporting and analytics solutions that support operational and strategic business needs. You will be responsible for building intuitive dashboards, analyzing data trends, defining KPIs, and contributing to data-driven process improvement initiatives.

Key Responsibilities

  • Design, develop, and maintain interactive dashboards and reports using Tableau and Power BI.
  • Partner with business users to understand reporting needs, define requirements, and translate them into effective BI solutions.
  • Write, optimize, and maintain SQL queries for data extraction, transformation, validation, and analysis.
  • Perform data analysis to identify trends, anomalies, performance drivers, and opportunities for business improvement.
  • Support KPI definition, metric standardization, and reporting governance across business functions.
  • Work with source data from structured systems and help ensure reporting accuracy and consistency.
  • Contribute to data modeling activities by understanding relationships between datasets, business rules, and reporting logic.
  • Create clear technical and functional documentation including report specifications, data mappings, business logic, and user guides.
  • Collaborate with data engineering, application development, and business teams to support reporting modernization initiatives.
  • Participate in testing activities including unit testing, data validation, user acceptance testing, and post-deployment support.
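The KPI and trend reporting described above usually reduces to grouped aggregates plus a window function for period-over-period change; a sketch with sqlite3 and invented sales data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (sold_on TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('2024-01-05', 100), ('2024-01-20', 50),
        ('2024-02-03', 200), ('2024-02-28', 100);
""")

# Monthly revenue plus month-over-month change via LAG.
kpis = conn.execute("""
    WITH monthly AS (
        SELECT strftime('%Y-%m', sold_on) AS month, SUM(amount) AS revenue
        FROM sales GROUP BY month
    )
    SELECT month, revenue,
           revenue - LAG(revenue) OVER (ORDER BY month) AS mom_change
    FROM monthly ORDER BY month
""").fetchall()
print(kpis)  # [('2024-01', 150.0, None), ('2024-02', 300.0, 150.0)]
```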

 

Must-Have Skills & Experience

  • Strong hands-on experience in Tableau and/or Power BI for dashboard development, reporting, and data visualization.
  • Strong SQL skills with experience writing complex queries, joins, aggregations, subqueries, and performance-tuned reporting queries.
  • Solid understanding of BI and analytics concepts including KPIs, trend analysis, scorecards, and management reporting.
  • Good knowledge of data modeling fundamentals, including table relationships, dimensions, facts, hierarchies, and basic star-schema concepts.
  • Experience working with structured datasets and multiple data sources to create meaningful analytical outputs.
  • Ability to gather requirements from business stakeholders and convert them into functional reporting solutions.
  • Strong analytical and problem-solving skills with high attention to detail and data accuracy.
  • Ability to work independently, troubleshoot issues, and document solutions clearly.
  • Effective written and verbal communication skills to collaborate with business, technology, and data teams.

 

Good-to-Have Skills

  • Exposure to Power BI Paginated Reports, DAX, or business process automation use cases.
  • Familiarity with Power Query, Excel advanced features, and data modeling concepts.
  • Understanding of ETL concepts and data pipeline dependencies.
  • Basic experience with Python for data analysis or reporting automation.
  • Exposure to UI/UX principles for dashboard usability and visual storytelling.
  • Familiarity with Agile delivery methodologies and working within sprint-based teams.
  • Experience supporting reporting transformation or modernization programs.

 

Experience Requirements

  • 4 to 5 years of relevant experience in business intelligence, reporting, analytics, or related roles.
  • Demonstrated experience building dashboards and reports in Tableau and Power BI within enterprise or business-facing environments.
  • Proven experience using SQL for data analysis, reporting, and data validation.
  • Experience working directly with business stakeholders to define requirements and deliver reporting solutions.
  • Experience in managing reporting deliverables across the full lifecycle, from requirements gathering through development, testing, deployment, and support. 


Remote only
3 - 6 yrs
₹3L - ₹6L / yr
PHP
NodeJS (Node.js)
Human Resource Management System (HRMS)
Data Structures
SQL

Software Developer (HRMS Focus | High-Volume Systems | WFH)

We are looking for a highly skilled software developer with strong expertise in HRMS (Human Resource Management Systems) and proven experience in handling bulk data and high-volume transactions.

The ideal candidate should have hands-on experience in building and scaling HRMS modules such as Payroll, Attendance, Leave Management, and Employee Lifecycle, with solid technical skills in PHP and/or Node.js.


This role offers Permanent Work From Home.

Key Responsibilities

  • Design, develop, and maintain robust HRMS modules (core focus): payroll processing (large-scale calculations), attendance & leave management, and employee lifecycle management.
  • Handle bulk data operations (salary processing, employee records, financial data) with performance optimization.
  • Ensure high scalability and performance for systems handling large datasets and concurrent users.
  • Build and optimize REST APIs / GraphQL services.
  • Optimize databases for high-volume transactions and reporting systems.
  • Integrate third-party services (payment gateways, SMS, email, compliance tools).
  • Contribute to additional ERP modules (Education domain as secondary) like Admissions, Fees, LMS, etc.
  • Conduct code reviews and maintain coding standards.
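Bulk operations like salary processing hinge on batching writes inside a single transaction rather than committing row by row; a minimal Python/sqlite3 sketch with invented figures and a flat 20% deduction:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payroll (emp_id INTEGER PRIMARY KEY, gross REAL, net REAL)")

# Simulated bulk input: 10,000 employee records.
records = [(i, 30000.0 + i) for i in range(10_000)]

# executemany inside one transaction keeps bulk loads fast.
with conn:
    conn.executemany(
        "INSERT INTO payroll (emp_id, gross, net) VALUES (?, ?, ?)",
        [(i, g, round(g * 0.8, 2)) for i, g in records],  # net after 20% deduction
    )

count, total_net = conn.execute(
    "SELECT COUNT(*), ROUND(SUM(net), 2) FROM payroll").fetchone()
print(count)  # 10000
```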

Required Skills & Qualifications

  • Strong experience in PHP (Core PHP) and/or Node.js
  • Must-have: deep HRMS expertise, including payroll, attendance systems, and leave & policy management
  • Proven experience in handling bulk data / large datasets / high-load systems
  • Strong database skills: MySQL, MongoDB, PostgreSQL (query optimization, indexing, performance tuning)
  • Experience with REST APIs / GraphQL and high-performance backend systems

Good to Have:

  • Experience in Education ERP systems
  • Frontend: JavaScript, React, Vue
  • Tools: Docker, CI/CD pipelines
  • Cloud: AWS / Azure / GCP
  • Experience with enterprise-scale or high-traffic applications

Preferred Experience

  • 3+ years of development experience
  • Minimum 2 years in HRMS development (strongly preferred)
  • Experience managing large-scale employee data and payroll systems


Oddr


Deepika Madgunki
Posted by Deepika Madgunki
Remote only
2 - 6 yrs
₹1L - ₹18L / yr
ETL
API
Microsoft Windows Azure
Integration
BOOMI

Job Title: Integration Engineer


Integration Engineers are responsible for defining, developing, delivering, maintaining, and supporting end-to-end enterprise integration solutions. Using a designated iPaaS solution (e.g., Boomi), Integration Engineers integrate multiple cloud and on-premise applications that help customers publish and consume data between Oddr and third-party systems for a variety of tasks.


Job Summary:

We are seeking a skilled and experienced Integration Engineer to join our Technology team in India. The ideal candidate will have a strong background in implementing low-code/no-code integration platforms as a service (iPaaS), with a preference for experience in Boomi. The role requires an in-depth understanding of SQL and RESTful APIs. Experience with Intapp's Integration Builder is a significant plus.


Key Responsibilities:

- Design and implement integration solutions using iPaaS tools.

- Collaborate with customers, product, engineering and business stakeholders to translate business requirements into robust and scalable integration processes.

- Develop and maintain SQL queries and scripts to facilitate data manipulation and integration.

- Utilize RESTful API design and consumption to ensure seamless data flow between various systems and applications.

- Lead the configuration, deployment, and ongoing management of integration projects.

- Troubleshoot and resolve technical issues related to integration solutions.

- Document integration processes and create user guides for internal and external users.

- Stay current with the latest developments in iPaaS technologies and best practices.
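A stripped-down version of the publish/consume pattern described above: map a third-party JSON payload (invented here; a real integration would fetch it over a REST API) onto a local schema with an idempotent upsert, using Python’s sqlite3:

```python
import json, sqlite3

# Invented third-party payload in a typical list-of-records shape.
payload = json.loads('{"matters": [{"id": 7, "name": "Acme v. Foo", "status": "open"}]}')

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE matters (id INTEGER PRIMARY KEY, name TEXT, status TEXT)")

# Map each API record onto the target schema; upsert keeps re-runs of the sync safe.
for m in payload["matters"]:
    conn.execute(
        """INSERT INTO matters (id, name, status) VALUES (:id, :name, :status)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name, status = excluded.status""",
        m,
    )

row = conn.execute("SELECT * FROM matters").fetchone()
print(row)  # (7, 'Acme v. Foo', 'open')
```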


Qualifications:

- Bachelor’s degree in Computer Science, Information Technology, or a related field.

- Minimum of 2 years’ experience in an integration engineering role with hands-on experience in an iPaaS tool, preferably Boomi.

- Proficiency in SQL and experience with database management and data integration patterns.

- Strong understanding of integration patterns and solutions, API design, and cloud-based technologies.

- Good understanding of RESTful APIs and integration.

- Excellent problem-solving and analytical skills.

- Strong communication and interpersonal skills, with the ability to work effectively in a team environment.

- Experience with various integration protocols (REST, SOAP, FTP, etc.) and data formats (JSON, XML, etc.).


Preferred Skills:

- Boomi (or other iPaaS) certifications

- Experience with Intapp's Integration Builder is highly desirable but not mandatory.

- Strong working knowledge of SQL

- Experience in building E2E integrations and communicating with stakeholders

- Knowledge of Azure Functions, Logic Apps, and other Azure services is highly desirable


What we offer:

- Competitive salary and benefits package.

- Dynamic and innovative work environment.

- Opportunities for professional growth and advancement.

Read more
TalentXO
tabbasum shaikh
Posted by tabbasum shaikh
Gurugram
3 - 6 yrs
₹15L - ₹18L / yr
skill iconElastic Search
OpenSearch
NET
SQL
TypeScript
+1 more

Role & Responsibilities

  • Design, develop, and test new features in the application.
  • Regular communication and collaboration with team members throughout the development process.
  • Implement, test, and fix bugs in application features.
  • Participate in fully agile Scrum deliveries as an active team member.
  • Design, build, and maintain efficient and reliable C# and Angular code.

Ideal Candidate

  • Strong full stack software engineer profile
  • Mandatory (Experience): Must have 3+ years of experience as a Fullstack developer
  • Mandatory (Backend): Must have strong backend development experience in C#, .NET and building RESTful APIs
  • Mandatory (Frontend): Must have hands-on frontend development experience in Angular 14+ and TypeScript/JavaScript
  • Mandatory (Core Skill): Must have working experience in Elasticsearch/OpenSearch (Non-negotiable)
  • Mandatory (DB): Exposure to SQL (Relational DBs)
  • Mandatory (Caching): Must have experience in caching mechanisms (in-memory/shared cache) and database scaling techniques like sharding & replication
  • Mandatory (Authentication): Familiarity with IdentityServer4 and Git
  • Mandatory (Engineering Practices): Must have experience writing unit tests and working in Agile/Scrum environments
  • Mandatory (Architecture Exposure): Candidates should have experience working on microservices architectures, event-driven systems, or distributed systems
  • Mandatory (Company): Product companies
  • Mandatory (Note 2): Please ensure the candidate's resume details hands-on experience with the above skill set
  • Preferred (Skill): Familiarity with deployment processes and packaging libraries for NPM
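The caching requirement above refers to the read-through pattern: check a cache first, fall back to the source of truth on a miss, and expire entries after a TTL. A minimal sketch follows — in Python for brevity, though this role's stack is C#/.NET, and a production system would typically use a shared cache such as Redis rather than an in-process dictionary.

```python
import time

class TTLCache:
    """Minimal in-memory read-through cache with per-entry time-to-live."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key, loader):
        now = time.monotonic()
        entry = self._store.get(key)
        if entry and entry[1] > now:
            return entry[0]                    # cache hit, still fresh
        value = loader(key)                    # miss: load from the source of truth
        self._store[key] = (value, now + self.ttl)
        return value

calls = []
def load_from_db(key):
    calls.append(key)                          # stand-in for an expensive SQL query
    return key.upper()

cache = TTLCache(ttl_seconds=60)
cache.get("user:1", load_from_db)
cache.get("user:1", load_from_db)              # served from cache; loader not called again
print(len(calls))  # 1
```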


Read more
TalentXO
Remote only
6 - 10 yrs
₹30L - ₹40L / yr
Agentic AI
Data Product Designer
AI/ML
UX
skill iconFigma
+4 more

Role & Responsibilities

Own the user experience for Dentsu's AI-powered agentic tools and client-facing data products. This is a senior design role responsible for making complex multi-agent systems, Genie spaces, and automated workflows feel simple and intuitive for media teams and clients who are not technical. You will work at the intersection of AI capability and human usability, designing the interfaces that turn powerful backend intelligence into tools people actually want to use.

Key Responsibilities-

  • Lead end-to-end design for agentic AI products: from discovery and user research through wireframes, prototypes, and production-ready specs
  • Design intuitive interfaces for multi-agent systems that serve media planners, analysts, and clients with varying levels of technical sophistication
  • Create UX flows for Genie spaces, conversational data exploration, and automated reporting dashboards that surface insights without requiring SQL or code
  • Develop and maintain a design system for the Decisioning practice's AI product suite, ensuring visual and interaction consistency across all tools
  • Conduct user research with internal media teams and client stakeholders to identify pain points, map workflows, and validate design decisions
  • Design transparency and trust patterns for AI-driven experiences: how users understand what the system did, why, and how to correct it
  • Prototype and test interaction models for agent-to-human handoff, error recovery, and multi-step automated workflows
  • Collaborate closely with AI engineers and data scientists to ensure designs are technically feasible and ship at high fidelity
  • Design onboarding flows and training materials that accelerate adoption of new AI tools across agencies
  • Create client-facing presentation materials, demos, and visual assets that communicate tool capabilities and business value

Ideal Candidate

  • Strong Agentic AI & Data Product Designer Profile
  • Mandatory (Experience 1): Must have 6+ years of total experience in design, with 5+ years in Product Design for data-heavy or complex digital products — enterprise dashboards, analytics tools, workflow platforms, or similar complex environments — with shipped work at scale.
  • Mandatory (Experience 2): Must have 6+ months of experience designing for AI/ML-powered products, such as GenAI features, agentic AI capabilities, or AI automation tools
  • Mandatory (Skill 1): Must have demonstrated expertise in complex workflow design, data visualization, and enterprise UX at scale — designing interfaces that surface insights and enable non-technical users to navigate powerful backend systems
  • Mandatory (Skill 2): Must have strong understanding of design systems and component-based design methodology, with experience building, contributing to, or maintaining systems that ensure visual and interaction consistency across a product suite
  • Mandatory (Skill 3 ): Must have the ability to design transparency and trust patterns for AI-driven experiences — including how users understand what the system did, why, and how to correct it; plus interaction models for agent-to-human handoff, error recovery, and multi-step automated workflows
  • Mandatory (Tools): Must have deep proficiency in Figma, including component libraries, auto-layout, and interactive prototyping
  • Mandatory (Stakeholder Mgmt & Communication): Must have excellent communication skills for presenting design rationale to engineering, product, and business stakeholders
  • Mandatory (Portfolio): Must have a strong portfolio demonstrating complex workflow design, data visualization work, and ideally AI/agentic or conversational interface projects.
  • Preferred (AI Interaction Design): Experience specifically designing chatbot, copilot, or agent-based interaction patterns
  • Preferred (Industry): Experience in media, advertising, or marketing technology industries


Read more
Quantiphi

at Quantiphi

3 candid answers
1 video
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 12 yrs
Best in industry
skill iconPython
SQL
ETL
Google Cloud Platform (GCP)
Windows Azure
+1 more

We are seeking a skilled Data Engineer to join the AI Platform Capabilities team supporting the UDP Uplift program.

In this role, you will design, build, and test standardized data and AI platform capabilities across a multi-cloud environment (Azure & GCP).

You will collaborate closely with AI use case teams to develop:

  • Scalable data pipelines
  • Reusable data products
  • Foundational data infrastructure

Your work will support advanced AI solutions such as:

  • GenAI
  • RAG (Retrieval-Augmented Generation)
  • Document Intelligence

Key Responsibilities

  • Design and develop scalable ETL/ELT pipelines for AI workloads
  • Build and optimize data pipelines for structured & unstructured data
  • Enable context processing & vector store integrations
  • Support streaming data workflows and batch processing
  • Ensure adherence to enterprise data models, governance, and security standards
  • Collaborate with DataOps, MLOps, Security, and business teams (LBUs)
  • Contribute to data lifecycle management for AI platforms
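To make the "structured & unstructured data" responsibility concrete: a common shape of this work is an extract-transform-load pipeline that parses semi-structured events, filters malformed records, and lands clean rows in an analytics table. The sketch below is illustrative only (stdlib Python with SQLite standing in for a warehouse; the event schema is invented), whereas the actual role uses GCP/Azure services and PySpark.

```python
import json
import sqlite3

def extract(raw_lines):
    """Extract: parse newline-delimited JSON events, e.g. landed from object storage."""
    return (json.loads(line) for line in raw_lines if line.strip())

def transform(events):
    """Transform: keep well-formed events and normalise fields for analytics."""
    for e in events:
        if "user_id" in e and "ts" in e:
            yield (e["user_id"], e["ts"], e.get("action", "unknown"))

def load(conn, rows):
    """Load: write the cleaned rows into an analytics-ready table."""
    conn.execute("CREATE TABLE IF NOT EXISTS events (user_id TEXT, ts TEXT, action TEXT)")
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

raw = [
    '{"user_id": "u1", "ts": "2024-01-01T00:00:00Z", "action": "login"}',
    '{"malformed": true}',
    '{"user_id": "u2", "ts": "2024-01-01T00:05:00Z"}',
]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 2
```

The same extract/transform/load separation carries over directly to PySpark, where each stage becomes a DataFrame transformation.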

Required Skills

  • 5–7 years of hands-on experience in Data Engineering
  • Strong expertise in Python and advanced SQL
  • Experience with GCP and/or Azure cloud-native data services
  • Hands-on experience with PySpark / Spark SQL
  • Experience building data pipelines for ML/AI workloads
  • Understanding of CI/CD, Git, and Agile methodologies
  • Knowledge of data quality, governance, and security practices
  • Strong collaboration and stakeholder management skills

Nice-to-Have Skills

  • Experience with Vector Databases / Vector Stores (for RAG pipelines)
  • Familiarity with MLOps / GenAIOps concepts (feature stores, model registries, prompt management)
  • Exposure to Knowledge Graphs / Context Stores / Document Intelligence workflows
  • Experience with DBT (Data Build Tool)
  • Knowledge of Infrastructure-as-Code (Terraform)
  • Experience in multi-cloud deployments (Azure + GCP)
  • Familiarity with event-driven systems (Kafka, Pub/Sub) & API integrations

Ideal Candidate Profile

  • Strong data engineering foundation with AI/ML exposure
  • Experience working in multi-cloud environments
  • Ability to build production-grade, scalable data systems
  • Comfortable working in cross-functional, fast-paced environments
Read more
Bengaluru (Bangalore)
2 - 5 yrs
₹20.4L - ₹24L / yr
skill iconPython
API
SQL
Systems design
Software deployment

Location: Bangalore

Experience: 2–5 years

Type: Full-time | On-site

Open Roles: 2

Start: Immediate

Why this role exists

Most systems work at a low scale.

Very few survive real production load, complex workflows, and enterprise edge cases.

We are building a platform that must:

  • Scale from 500K → 20M+ interactions/month
  • Handle complex insurance workflows reliably
  • Become easier to deploy as it grows, not harder

This role exists to build the backend foundation that makes this possible.

What you’ll do

You will not just write services.

You will design and own core platform systems.

1. Scale the platform without breaking architecture

  • Scale from 50K → 2M+ interactions/month
  • Ensure:
  • High availability
  • Low latency
  • Fault tolerance
  • Avoid large rewrites — build systems that evolve cleanly

2. Build the workflow automation (WA) engine

  • Design a flexible system with:
  • States
  • Stages
  • Cohorts
  • Dynamic workflows
  • Ensure workflows:
  • Handle edge cases reliably
  • Can be configured easily
  • Move from:
  • Hardcoded flows → configurable execution engine
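The move from hardcoded flows to a configurable execution engine is usually modelled as a state machine whose allowed transitions are data rather than code. A minimal sketch, with an entirely hypothetical insurance-claim workflow as the config:

```python
# Transitions are configuration, so new workflows launch without code changes.
WORKFLOW_CONFIG = {
    "claim": {  # hypothetical workflow; real states/events would come from the product
        "initial": "submitted",
        "transitions": {
            ("submitted", "approve"): "approved",
            ("submitted", "reject"): "rejected",
            ("approved", "pay"): "settled",
        },
    }
}

class WorkflowEngine:
    def __init__(self, config):
        self.config = config

    def start(self, workflow):
        return self.config[workflow]["initial"]

    def advance(self, workflow, state, event):
        transitions = self.config[workflow]["transitions"]
        try:
            return transitions[(state, event)]
        except KeyError:
            # Edge case handling: invalid events fail loudly instead of corrupting state.
            raise ValueError(f"event {event!r} not allowed from state {state!r}")

engine = WorkflowEngine(WORKFLOW_CONFIG)
state = engine.start("claim")                      # "submitted"
state = engine.advance("claim", state, "approve")  # "approved"
state = engine.advance("claim", state, "pay")
print(state)  # settled
```

A production engine adds persistence, cohorts, and stage timers on top, but the core idea — transitions as config — is what makes each new deployment cheaper than the last.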

3. Build the insurance-specific data layer

  • Design data models for:
  • Policy states
  • Claim workflows
  • Consent tracking
  • Ensure the system works across:
  • Multiple insurers
  • Multiple use cases
  • Build a platform-first data layer, not use-case-specific hacks

4. Make deployment and setup simple

  • Ensure workflows and data models are:
  • Easy to configure
  • Easy to launch
  • Reduce friction for:
  • Product teams
  • Deployment teams

5. Create a compounding data advantage

  • Build a data layer that:
  • Improves with every deployment
  • Captures structured signals
  • Ensure data becomes a long-term edge, not just storage

6. Own production reliability

  • Participate in on-call rotation across 3 engineers
  • Ensure:
  • Incidents are handled quickly
  • Root causes are fixed permanently
  • Build systems where reliability is shared, not individual

What success looks like

  • Platform scales to 2M+ interactions/month smoothly
  • Workflow engine supports complex, dynamic use cases
  • Data layer enables fast deployment across accounts
  • Edge cases are handled without constant firefighting
  • System becomes easier to use as it grows
  • Production issues are rare and predictable

Who you are

  • You have 2-5 years of backend engineering experience
  • You have built:
  • Scalable systems
  • Distributed services
  • You think in:
  • Systems
  • Data models
  • Trade-offs
  • You are comfortable owning:
  • Architecture
  • Production systems

What will make you stand out

  • Experience building:
  • Workflow engines
  • State machines
  • Data-heavy platforms
  • Strong understanding of:
  • System design
  • Distributed systems
  • Failure handling
  • Experience working in:
  • High-scale production environments

Why join

  • You will build the core backend of an AI platform
  • Your work directly impacts:
  • Scale
  • Reliability
  • Product capability
  • You will design systems that move from:
  • Use-case specific → platform-level infrastructure

What this role is not

  • Not just API development
  • Not limited to feature-level work
  • Not disconnected from production realities

What this role is

  • A system architect
  • A builder of scalable platforms
  • A driver of long-term technical advantage

One question to self-evaluate

Can you design backend systems that scale, handle edge cases, and become easier to use as they grow?


Read more
LearnTube.ai

at LearnTube.ai

2 candid answers
Vinayak Sharan
Posted by Vinayak Sharan
Remote, Mumbai
3 - 6 yrs
₹14L - ₹32L / yr
skill iconPython
FastAPI
skill iconDocker
skill iconAmazon Web Services (AWS)
SQL
+3 more

Role Overview:


As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.


Skip the wait and get noticed faster by completing our AI-powered screening. Click this link to start your quick interview. It only takes a few minutes and could be your shortcut to landing the job: https://bit.ly/LT_Python


What You'll Do:


At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:

  • Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95).
  • Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
  • Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
  • Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
  • Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
  • Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
  • Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
  • Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.
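Since the latency targets above are stated as p95, a quick refresher for candidates: p95 is the latency that 95% of requests complete at or under. The nearest-rank computation is a one-liner; note that monitoring stacks like Prometheus typically estimate percentiles from histogram buckets rather than computing them exactly as below.

```python
import math

def p95(latencies_ms):
    """Nearest-rank 95th percentile of a list of request latencies (ms)."""
    ordered = sorted(latencies_ms)
    rank = math.ceil(0.95 * len(ordered))  # nearest-rank method
    return ordered[rank - 1]

# 100 samples of 1..100 ms: the 95th-ranked value is 95 ms.
samples = list(range(1, 101))
print(p95(samples))  # 95
```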


What makes you a great fit?


Must-Haves:

  • 3+ yrs Python back-end experience (FastAPI)
  • Strong with Docker & container orchestration
  • Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
  • SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals

Nice-to-Haves

  • k8s at scale, Terraform,
  • Experience with AI/ML inference services (LLMs, vector DBs)
  • Go / Rust for high-perf services
  • Observability: Prometheus, Grafana, OpenTelemetry


About Us: 


At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.


Meet the Founders: 


LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? 


At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.
Read more
SDS softwares

at SDS softwares

2 candid answers
1 recruiter
Tanavee Sharma
Posted by Tanavee Sharma
Remote only
0.6 - 0.8 yrs
₹0.8L - ₹0.9L / yr
Business Analysis
PowerBI
BRD
Tableau
MS-Excel
+5 more

Job Title: Business Analyst (BA)

Job Type: Full-Time | Remote | 5 Days Working

Salary: ₹7,000 – ₹8,000 per month

Experience Required: 6 months to 1 year (Freshers with internship experience can apply)

Joining: Immediate Joiners Only

About the Role:

We are looking for freshers who have strong foundational skills in both business analysis and manual testing. In this position, you will be responsible for manually handling tasks across both functions.

Key Responsibilities:

  • Gather and analyze business requirements from stakeholders
  • Create documentation such as BRDs, FRDs, user stories, and process flows
  • Perform manual testing of software applications
  • Prepare test cases, test plans, and report bugs clearly
  • Collaborate with development and business teams to ensure product quality and requirement clarity
  • Provide timely updates and reports on progress and findings

Requirements:

  • Must have skills and knowledge in Business Analysis
  • Must be able to manage both roles manually and independently
  • Proficiency in tools related to BA
  • Excellent communication skills in English (spoken and written)
  • Must have a personal laptop and a stable internet connection
  • Must be available to join immediately

Who Should Apply:

  • Freshers with 6 months to 1 year of experience in relevant roles
  • Candidates who are confident in handling both BA and manual testing responsibilities
  • Individuals looking to build a strong foundation in both domains in a remote, full-time role


Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Mumbai, Bengaluru (Bangalore)
4 - 6 yrs
₹3L - ₹11L / yr
skill icon.NET
ASP.NET
skill iconC#
skill iconDocker
Microservices
+1 more

🚀 Hiring: .NET Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Mumbai and Bangalore

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)



We are looking for a skilled .NET Developer to design and develop scalable microservices and enterprise-grade applications. The role involves building secure REST APIs, writing clean and testable code, working with Docker-based deployments, and collaborating with cross-functional teams.


Key Responsibilities:

  • Develop .NET Core microservices
  • Build and secure REST APIs
  • Write unit & integration tests
  • Deploy applications using Docker
  • Ensure performance optimization and code quality


3 Mandatory Skills

  1. .NET Core / ASP.NET Core Web API
  2. Microservices & Docker
  3. REST API development with Unit Testing





Read more
BigThinkCode Technologies
Kumar AGS
Posted by Kumar AGS
Chennai
4 - 6 yrs
₹1L - ₹13L / yr
SQL
Data modeling
Pipeline management
Apache
Google BigQuery

At BigThinkCode, our technology solves complex problems. We are looking for a talented Data Engineer to join our Data team in Chennai.

 

Our ideal candidate will have expert knowledge of software development processes, programming, and problem-solving skills. This is an opportunity to join a growing team and make a substantial impact at BigThinkCode Technologies.

 

Please see below our job description, if interested apply / reply sharing your profile to connect and discuss.

 

Company: BigThinkCode Technologies

URL: https://www.bigthinkcode.com/

Work location: Chennai (work from office)

Experience required: 4 - 6 years

Joining time: Immediate – 4 weeks

Work Mode: Work from office (Hybrid)

 

Job Overview:

We are seeking a skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. You will play a pivotal role in optimizing data flow, ensuring scalability, and enabling seamless access to structured/unstructured data across the organization. The ideal candidate will design, build, and optimize scalable data pipelines with strong SQL proficiency, data modelling expertise.

Key Responsibilities:

  • Design, develop, and maintain scalable pipelines to process structured and unstructured data.
  • Optimize and manage SQL queries for performance and efficiency in large-scale datasets.
  • Experience working with data warehouse solutions (e.g., Redshift, BigQuery, Snowflake) for analytics and reporting.
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
  • Experience in Implementing solutions for streaming data (e.g., Apache Kafka, AWS Kinesis) is preferred but not mandatory.
  • Ensure data quality, governance, and security across pipelines and storage systems.
  • Document architectures, processes, and workflows for clarity and reproducibility.

Required Technical Skills:

  • 4 or more years of experience in the Data Engineering field.
  • Expertise in SQL (complex queries, optimization, and database design).
  • Write optimized, production-grade SQL scripts for transformations and data validation.
  • Solid understanding of, and hands-on experience in, creating data pipelines and patterns.
  • Proficiency in a programming language such as Python or R for scripting, automation, and pipeline development.
  • Hands-on experience with Google Bigquery and Apache Airflow.
  • Experience working on any cloud-based platforms like AWS or GCP or Azure.
  • Experience working with structured data (RDBMS) and unstructured data (JSON, Parquet, Avro).
  • Familiarity with cloud-based data warehouses (Redshift, BigQuery, Snowflake).
  • Knowledge of version control systems (e.g., Git) and CI/CD practices.
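As a concrete example of the "complex queries, optimization" expectation: deduplicating late-arriving records with a `ROW_NUMBER()` window function is a staple of production data-validation SQL. The sketch below runs the pattern through Python's built-in `sqlite3` (table and data are invented for illustration); the same SQL works on BigQuery, Redshift, or Snowflake.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INT, customer TEXT, amount REAL, updated_at TEXT);
INSERT INTO orders VALUES
  (1, 'acme', 100.0, '2024-01-01'),
  (1, 'acme', 120.0, '2024-01-02'),
  (2, 'zen',   80.0, '2024-01-01');
""")

# Keep only the most recent version of each order_id.
latest = conn.execute("""
    SELECT order_id, customer, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM orders
    ) AS ranked
    WHERE rn = 1
    ORDER BY order_id
""").fetchall()
print(latest)  # [(1, 'acme', 120.0), (2, 'zen', 80.0)]
```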

Why Join Us:

·      Collaborative work environment.

·      Exposure to modern tools and scalable application architectures.

·      Medical cover for employee and eligible dependents.

·      Tax beneficial salary structure.

·      Comprehensive leave policy

·      Competency development training programs.

 

 

Read more
Remote, Noida, Gurugram, Pune, Nagpur, Jaipur, Gandhinagar
8 - 14 yrs
₹12L - ₹18L / yr
skill iconPython
SQL
PySpark
databricks
Snow flake schema
+6 more

Senior Data Engineer (Databricks, BigQuery, Snowflake)

Experience: 8+ Years in Data Engineering

Location: Remote | Onsite (Noida, Gurgaon, Pune, Nagpur, Jaipur, Gandhinagar)

Budget: Open / Competitive


Job Summary:

We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data solutions that support advanced analytics and machine learning initiatives. You will lead the development of reliable, high-performance data systems and collaborate closely with data scientists to enable data-driven decision-making.

In this role, we expect a forward-thinking professional who utilizes AI-augmented development tools (such as Cursor, Windsurf, or GitHub Copilot) to increase engineering velocity and maintain high code standards in a modern enterprise environment.


Key Responsibilities:

  • Scalable Pipelines: Design, develop, and optimize end-to-end data pipelines using SQL, Python, and PySpark.
  • ETL/ELT Workflows: Build and maintain workflows to transform raw data into structured, analytics-ready datasets.
  • ML Integration: Partner with data scientists to deploy and integrate machine learning models into production environments.
  • Cloud Infrastructure: Manage and scale data infrastructure within AWS and Azure ecosystems.
  • Data Warehousing: Utilize Databricks and Snowflake for big data processing and enterprise warehousing.
  • Automation & IaC: Implement workflow orchestration using Apache Airflow and manage infrastructure as code via Terraform.
  • Performance Tuning: Optimize data storage, retrieval, and system performance across data warehouse platforms.
  • Governance & Compliance: Ensure data quality and security using tools like Unity Catalog or Hive Metastore.
  • AI-Augmented Development: Integrate AI tools and LLM APIs into data pipelines and use AI IDEs to streamline debugging and documentation.


Technical Requirements:

  • Experience: 8+ years of core Data Engineering experience in large-scale enterprise or consulting environments.
  • Languages: Expert proficiency in SQL and Python for complex data processing.
  • Big Data: Hands-on experience with PySpark and large-scale distributed computing.
  • Architecture: Strong understanding of ETL frameworks, data pipeline architecture, and data warehousing best practices.
  • Cloud Platforms: Deep working knowledge of AWS and Azure.
  • Modern Tooling: Proven experience with Databricks, Snowflake, and Apache Airflow.
  • Infrastructure: Experience with Terraform or similar IaC tools for scalable deployments.
  • AI Competency: Proficiency in using AI IDEs (Cursor/Windsurf) and integrating AI/ML models into production data flows.


Preferred Qualifications:

  • Exposure to data governance and cataloging tools (e.g., Unity Catalog).
  • Knowledge of performance tuning for massive-scale big data systems.
  • Familiarity with real-time data processing frameworks.
  • Experience in digital transformation and sustainability-focused data projects.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Meghana Shinde
Posted by Meghana Shinde
Pune
8 - 12 yrs
Best in industry
Business Analysis
Risk Management
BRD
FRD
SQL

Job Description: Business Analyst (Capital Markets / Investment Management)

Position Summary

We are seeking an experienced Business Analyst with strong techno-functional expertise in Capital Markets, Investment Banking, Asset Management, and Risk Management. The ideal candidate will have hands-on experience across the full trade lifecycle, UAT leadership, risk frameworks, the FIX protocol, and digital transformation initiatives. This role requires close collaboration with front, middle, and back-office stakeholders, IT teams, and external vendors to deliver critical business and regulatory solutions.

Key Responsibilities

Business Analysis & Requirements Management

· Lead requirements gathering, documentation (BRD, FRD, User Stories), workflow mapping, and gap analysis.

· Conduct JAD sessions with Trading Desks, Portfolio Managers, Risk, Compliance, and Technology teams.

· Translate business requirements into detailed functional specifications and acceptance criteria.

· Manage and prioritize product backlogs using Agile/Scrum methodologies.

Trade Lifecycle & Capital Markets Expertise

· Support end-to-end trade flows across Equities, Derivatives, Fixed Income, Forex, Options, ETFs, Private Equity, and Structured Products.

· Validate front-to-back trade processes including order placement, execution, allocations, settlement, reconciliation, and reporting.

· Work with OMS/EMS platforms, market connectivity, and brokerage systems.

Risk Management (Market, Model, Liquidity, Credit)

· Analyze VaR, stress testing, scenario analysis, exposure calculations, and liquidity metrics (LCR/NSFR).

· Contribute to market risk policy formulation, governance, and regulatory compliance.

· Identify risk hotspots, process gaps, and control weaknesses with actionable remediation plans.

· Support regulatory reporting including Mark-to-Market and Notional Change requirements.

UAT, QA & Testing Leadership

· Lead end-to-end UAT cycles for trading, risk, and investment applications.

· Create test plans, test cases, and defect logs; track issues through JIRA until closure.

· Perform regression, functional, and production validation testing.

· Coordinate with QA, development teams, and Front Office for seamless deployment.

FIX Protocol & System Integrations

· Gather and validate FIX requirements for OMS/EMS integration.

· Support FIX message mapping, configuration, certification, and UAT.

· Collaborate with brokers, exchanges, and internal development teams for connectivity and workflow enhancements.

Client Management & Onboarding (Buy-side/Sell-side)

· Manage onboarding for clients such as Hedge Funds, Family Offices, Asset Managers, and Prime Brokers.

· Conduct requirement workshops, product demos, trainings, and post-implementation support.

· Serve as the primary point of contact for issue resolution, escalations, and enhancement discussions.

Project & Stakeholder Management

· Drive project plans, milestones, and sprint activities (Planning, Grooming, Stand-ups, Retrospectives).

· Ensure alignment between business needs and technology delivery.

· Prepare executive-level dashboards, presentations, and risk summaries for senior stakeholders.

Skills & Competencies

Technical Skills

· Tools & Platforms: Bloomberg, Refinitiv, FactSet, BlackRock Aladdin, Robinhood, IRIS, Falcon

· Databases: SQL, Excel (advanced), data reconciliation tools

· Project Tools: JIRA, Monday.com, Confluence, MS Visio, Axure

· Risk Systems: VAR models, stress testing tools, exposure monitoring systems

Core Competencies

· Strong stakeholder management & communication

· Business rules analysis & functional documentation

· UI/UX requirement mapping

· Data migration & system integration

· Analytical thinking & problem-solving

· Cross-functional collaboration

Qualifications

· 9+ years of experience in Capital Markets, Investment Management, and Trading/Risk Systems.

· MBA Finance (preferred) / BBA Finance.

· Certifications:

o NISM – Equity, Derivatives, Options Strategies

o CFI – Fixed Income Fundamentals

o Microsoft – Career Essentials in Business Analysis

o FRM (GARP) – Pursuing

Preferred Experience

· Working on end-to-end trading platform implementations.

· Exposure to Hedge Funds, PMS, AIF, Private Equity, and Wealth Management workflows.

· Knowledge of regulatory frameworks (Basel II–IV, SEBI, Risk Governance).

· Experience authoring policies, SOPs, and process documentation.

Soft Skills

· Excellent verbal and written communication.

· Strong analytical and quantitative capabilities.

· Ability to translate technical concepts to business stakeholders.

· High ownership, deadline orientation, and team collaboration skills.

 

 

Read more
Bengaluru (Bangalore)
5 - 10 yrs
₹1L - ₹10L / yr
databricks
PySpark
Apache Spark
ETL
CI/CD
+10 more

Profile - Databricks Developer

Experience- 5+ years

Location- Bangalore (On site)

PF & BGV is Mandatory


Job Description:

* Design, build, and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark (PySpark).

* Develop scalable, high-performance data solutions using Spark distributed processing.

* Lead engineering initiatives focused on automation, performance tuning, and platform modernization.

* Implement and manage CI/CD pipelines using Git-based workflows and tools such as GitHub Actions or Jenkins.

* Collaborate with cross-functional teams to translate business needs into technical solutions.

* Ensure data quality, governance, and security across all processes.

* Troubleshoot and optimize Spark jobs, Databricks clusters, and workflows.

* Participate in code reviews and develop reusable engineering frameworks.

* Should have knowledge of utilizing AI tools to improve productivity and support daily engineering activities.

* Strong knowledge and hands-on experience in Databricks Genie, including prompt engineering, workspace usage, and automation.

Required Skills & Experience:

* 5+ years of experience in Data Engineering or related fields.

* Strong hands-on expertise in Databricks (notebooks, Delta Lake, job orchestration).

* Deep knowledge of Apache Spark (PySpark, Spark SQL, optimization techniques).

* Strong proficiency in Python for data processing, automation, and framework development.

* Strong proficiency in SQL, including complex queries, performance tuning, and analytical functions.

* Strong knowledge of Databricks Genie and leveraging it for engineering workflows.

* Strong experience with CI/CD and Git-based development workflows.

* Proficiency in data modeling and ETL/ELT pipeline design.

* Experience with automation frameworks and scheduling tools.

* Solid understanding of distributed systems and big data concepts.
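The pipeline work described above follows a common transform-then-validate pattern. Below is a minimal, framework-free Python sketch of that pattern; the field names are purely illustrative, and in Databricks the same logic would run over Spark DataFrames rather than plain lists.

```python
# Illustrative ETL transform plus a data-quality gate; on Databricks this
# logic would operate on Spark DataFrames, but the shape is the same.
def transform(rows):
    """Normalize raw records: trim/upper-case strings, coerce amounts to float."""
    out = []
    for r in rows:
        out.append({
            "id": r["id"],
            "country": r["country"].strip().upper(),
            "amount": float(r["amount"]),
        })
    return out

def quality_check(rows):
    """Fail fast if primary keys are duplicated or amounts are negative."""
    ids = [r["id"] for r in rows]
    assert len(ids) == len(set(ids)), "duplicate primary keys"
    assert all(r["amount"] >= 0 for r in rows), "negative amounts"
    return rows

raw = [{"id": 1, "country": " in ", "amount": "10.5"},
       {"id": 2, "country": "us", "amount": "3"}]
clean = quality_check(transform(raw))
```

Running the quality gate inside the pipeline, rather than downstream, is what lets a failed batch be caught before it pollutes curated tables.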

Read more
Gradera AI Technologies
Sirisha Jonnada
Posted by Sirisha Jonnada
Hyderabad
4 - 7 yrs
₹20L - ₹50L / yr
skill iconPython
SQL
databricks

Role & Responsibilities

 

·      Collect, clean, and analyze large structured and unstructured datasets from multiple internal and external sources

·      Conduct thorough exploratory data analysis (EDA) to understand data distributions, relationships, outliers, and missing value patterns

·      Profile and audit datasets to assess data quality, completeness, consistency, and fitness for modeling

·      Investigate and document data lineage — understanding where data originates, how it flows, and how it transforms across systems

·      Identify and resolve data anomalies, inconsistencies, and integrity issues in collaboration with data engineering teams

·      Develop a deep understanding of the business domain and the underlying data that represents it — including what each field means, how it is captured, and what its limitations are

·      Translate raw, messy, real-world data into clean, well-understood analytical datasets ready for modeling and reporting

·      Apply statistical techniques such as correlation analysis, hypothesis testing, variance analysis, and distribution fitting to extract meaningful signals from noise

·      Build and deploy machine learning models including regression, classification, clustering, NLP, and time-series analysis

·      Design, evaluate, and analyze A/B experiments and controlled tests using causal inference techniques

·      Develop data-driven recommendations backed by rigorous statistical reasoning

·      Write clean, production-ready code in Python or R

·      Collaborate with data engineers to build reliable data pipelines and feature stores

·      Deploy and monitor ML models using MLOps best practices on cloud infrastructure

·      Build dashboards and self-serve analytics tools to support stakeholder decision-making

 

Data Understanding & Analysis Skills

 

·      Strong ability to interrogate unfamiliar datasets and quickly develop a working understanding of their structure, semantics, and quirks

·      Experience working with messy, incomplete, or poorly documented real-world data

·      Skilled in identifying hidden patterns, trends, seasonality, and anomalies through visual and statistical exploration

·      Ability to ask the right questions about data — challenging assumptions, validating sources, and understanding the context in which data was collected

·      Proficiency in data profiling, descriptive statistics, and summary reporting to communicate the shape and health of a dataset

·      Experience creating data dictionaries, documentation, and data quality reports to support team-wide data understanding

·      Comfort working across structured (relational tables), semi-structured (JSON, XML), and unstructured (text, logs, sensor streams) data formats
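As a rough illustration of the profiling and descriptive-statistics work listed above, here is a small stdlib-only Python sketch that reports per-field completeness and basic stats; the dataset and field names are invented for the example.

```python
import statistics

def profile(records, fields):
    """Report per-field missing rate, plus min/max/mean for numeric fields."""
    report = {}
    n = len(records)
    for f in fields:
        values = [r[f] for r in records if r.get(f) is not None]
        entry = {"missing_rate": 1 - len(values) / n}
        if values and all(isinstance(v, (int, float)) for v in values):
            entry.update(min=min(values), max=max(values),
                         mean=statistics.mean(values))
        report[f] = entry
    return report

data = [{"age": 34, "city": "Pune"},
        {"age": None, "city": "Mumbai"},
        {"age": 40, "city": None}]
report = profile(data, ["age", "city"])
```

A report like this is the starting point for the data dictionaries and data-quality reports the role mentions.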

 

Technical Skills Required

 

·      Proficiency in Python (pandas, NumPy, scikit-learn, PyTorch or TensorFlow) and/or R

·      Strong SQL skills with hands-on experience in DB2 and SQL Server

·      Experience with Databricks for large-scale data processing, feature engineering, and model training

·      Familiarity with cloud platforms: Azure or AWS

·      Experience with data warehouses and big data platforms (Databricks, Snowflake, or Redshift)

·      Knowledge of MLOps tools such as MLflow, Kubeflow, or Airflow

·      Experience with streaming data technologies such as Kafka or Spark

·      Solid foundation in probability, statistics, linear algebra, and experimental design

 

Nice to Have

 

·      Experience with deep learning, NLP, computer vision, or Bayesian methods

·      Familiarity with real-time or streaming data pipelines

·      Open-source contributions or published research

Read more
Global MNC serving 40+ Fortune 500 Companies

Agency job
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹26L / yr
Generative AI
Retrieval Augmented Generation (RAG)
skill iconMachine Learning (ML)
LangGraph
langchain
+11 more

Want to work on exciting GenAI projects for Fortune 500 companies across multiple sectors? Then read on..


About Company:

CSG is a multi-national company with a presence in 20 countries and 1600+ engineers. The company works with more than 40 Fortune 500 customers such as Sony, Samsung, ABB, Thyssenkrupp, Toyota, and Mitsubishi.


Job Description:

We are looking for a talented Generative AI Developer to join our dynamic AI/ML team. This position offers an exciting opportunity to leverage cutting-edge Generative AI (GenAI) technologies to drive innovation and solve real-world problems. You will be responsible for developing and optimizing GenAI-based applications, implementing advanced techniques such as Retrieval-Augmented Generation (RAG), Retrieval-Interleaved Generation (RIG), agentic frameworks, and vector databases. This is a collaborative role where you will work directly with customers and cross-functional teams to design, implement, and optimize AI-driven solutions. Exposure to cloud-native AI platforms such as Amazon Bedrock and Microsoft Azure OpenAI is highly desirable.


Key Responsibilities

Generative AI Application Development:

Design, develop, and deploy GenAI-driven applications to address complex industrial challenges.

Implement Retrieval-Augmented Generation (RAG) and Agentic frameworks


Data Management & Optimization:

Design and optimize document chunking strategies tailored to specific datasets and use cases.

Build, manage, and optimize data embeddings for high-performance similarity searches across vector databases.
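The chunking work above usually starts from a simple sliding-window strategy. The sketch below shows the simplest character-based variant, purely for illustration; production RAG systems typically chunk on token or sentence boundaries and attach metadata before computing embeddings.

```python
def chunk(text, size=200, overlap=50):
    """Split text into fixed-size windows with overlap, so context that
    spans a boundary appears in both neighbouring chunks."""
    assert 0 <= overlap < size
    step = size - overlap
    return [text[i:i + size]
            for i in range(0, max(len(text) - overlap, 1), step)]
```

For example, `chunk("abcdefghij", size=4, overlap=2)` yields four overlapping windows covering the whole string; tuning `size` and `overlap` per dataset is exactly the "chunking strategy" design task this role describes.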


Collaboration & Integration:

Work closely with data engineers and scientists to integrate AI solutions into existing pipelines.

Collaborate with cross-functional teams to ensure seamless AI implementation.


Cloud & AI Platform Utilization:

Explore and implement best practices for utilizing cloud-native AI platforms, such as Amazon Bedrock and Azure OpenAI, to enhance solution delivery.

Continuous Learning & Innovation:

Stay updated with the latest trends and emerging technologies in the GenAI and AI/ML fields, ensuring our solutions remain cutting-edge.


Requirements:

The ideal candidate will have strong experience in Generative AI technologies, particularly in the areas of RAG, document chunking, and vector database management. They will be able to quickly adapt to evolving AI frameworks and leverage cloud-native platforms to create efficient, scalable solutions. You will be working in a fast-paced and collaborative environment, where innovation and the ability to learn and grow are key to success.

- 3 to 5 years of overall experience in software development, with 3 years focused on AI/ML.

- Minimum 2 years of experience specifically working with Generative AI (GenAI) technologies.

- Python, PySpark, and SQL knowledge is necessary for day-to-day tasks

- Proven ability to work in a collaborative, fast-paced, and innovative environment.


Technical Skills:

- Generative AI Frameworks & Technologies:

- Expertise in Generative AI frameworks, including prompt engineering, fine-tuning, and few-shot learning.

- Familiarity with frameworks such as T5 (Text-to-Text Transfer Transformer), LangChain, and LangGraph, and with open-source stacks such as Ollama, Mistral, and DeepSeek.

- Strong knowledge of Retrieval-Augmented Generation (RAG) for combining LLMs with external data retrieval systems.


Data Management:

- Experience in designing chunking strategies for different datasets.

- Expertise in data embedding techniques and experience with vector databases such as Pinecone and ChromaDB

- Programming & AI/ML Libraries:

- Strong programming skills in Python.

- Experience with AI/ML libraries such as TensorFlow, PyTorch, and Hugging Face Transformers.


Cloud Platforms & Integration:

- Familiarity with cloud services for AI/ML workloads (AWS, Azure).

- Experience with API integration for AI services and building scalable applications.

- Certifications (Optional but Desirable):

- Certification in AI/ML (e.g., TensorFlow, AWS Certified Machine Learning Specialty).

- Certification or coursework in Generative AI or related technologies.

Read more
Thingularity

Agency job
via Thomasmount Consulting by Shirin Shahana
Bengaluru (Bangalore)
4 - 8 yrs
₹18L - ₹20L / yr
skill iconPython
SQL
ETL

Job Summary

We are seeking a skilled Data Engineer with 4+ years of experience in building scalable data pipelines and working with modern data platforms. The ideal candidate should have strong expertise in Python, SQL, and cloud-based data solutions, with hands-on experience in ETL/ELT processes and data warehousing.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Python
  • Develop and optimize ETL/ELT workflows for data ingestion and transformation
  • Work with structured and unstructured data from multiple sources
  • Build and manage data warehouses/data lakes
  • Perform data validation, cleansing, and quality checks
  • Optimize SQL queries and improve data processing performance
  • Collaborate with data analysts, data scientists, and business teams
  • Implement data governance, security, and best practices
  • Monitor pipelines and troubleshoot production issues

Required Skills

  • Strong programming experience in Python (Pandas, NumPy, PySpark preferred)
  • Excellent SQL skills (joins, window functions, performance tuning)
  • Experience with ETL tools like Informatica, Talend, or DBT
  • Hands-on experience with cloud platforms (Azure / AWS / GCP)
  • Experience in data warehousing solutions like Snowflake, Redshift, BigQuery
  • Knowledge of workflow orchestration tools like Apache Airflow
  • Familiarity with version control tools like Git
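As one concrete example of the window-function skill listed above, the snippet below uses Python's built-in sqlite3 (modern SQLite supports window functions) to pick the latest order per customer; the schema and data are invented for illustration.

```python
import sqlite3

# Window-function example: latest order per customer via ROW_NUMBER().
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_day INTEGER, amount REAL);
INSERT INTO orders VALUES
  ('alice', 1, 10.0), ('alice', 3, 25.0),
  ('bob',   2,  7.5), ('bob',   5, 12.0);
""")
latest = conn.execute("""
SELECT customer, order_day, amount FROM (
  SELECT *, ROW_NUMBER() OVER (
             PARTITION BY customer ORDER BY order_day DESC) AS rn
  FROM orders
) WHERE rn = 1 ORDER BY customer
""").fetchall()
```

The same PARTITION BY / ORDER BY pattern is a staple of deduplication and "latest record" logic in warehouse pipelines.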

Preferred Skills

  • Experience with Big Data technologies (Spark, Hadoop)
  • Knowledge of streaming tools like Kafka
  • Exposure to CI/CD pipelines and DevOps practices
  • Experience in data modeling (Star/Snowflake schema)
  • Understanding of APIs and data integration


Read more
Mumbai, thane, Navi Mumbai
3 - 10 yrs
₹1L - ₹8L / yr
PLC
PLC Scada
SCADA
HMI
Pharmaceutics
+5 more

Engineer – Senior Level

Location: Ghatkopar

Department: Automation / Programming


About the Opportunity

We are hiring Automation Engineers to work on end-to-end industrial automation projects in the pharma and food processing industries, involving PLC, HMI, and SCADA systems from design to commissioning.

Qualification

Degree or Diploma in:

 Mechanical Engineering

 Electronics Engineering

 Instrumentation Engineering

 Electrical Engineering

Required Skills & Competencies

 Hands-on experience in PLC, HMI, and SCADA programming

 Knowledge of industrial automation in pharma/process industries

 Basic understanding of electrical & instrumentation wiring

 Ability to read and interpret technical drawings and schematics

 Experience in programming languages such as .NET, VB/VB.Net, SQL/T-SQL (preferred)

 Familiarity with AutoCAD Electrical, EPLAN, or similar tools (added advantage)

 Strong problem-solving and analytical skills

 Good communication and interpersonal skills

 Ability to work independently and within a team

 Flexible to travel and work extended hours when required

Key Responsibilities

 Program, test, and commission industrial control systems

 Select appropriate PLC, HMI, and SCADA systems based on customer URS

 Develop I/O lists as per P&ID and project requirements

 Design and implement control logic for automation projects

 Manage project timelines and ensure timely execution

 Coordinate with project managers on scope changes and updates

 Support FAT (Factory Acceptance Testing) and commissioning activities

 Interpret electrical schematics, wiring diagrams, and P&ID drawings

 Assist in troubleshooting electrical and instrumentation systems

 Ensure smooth project execution through effective coordination

Read more
Source One
Deepali Khandelwal
Posted by Deepali Khandelwal
Pune
0 - 1 yrs
₹25000 - ₹35000 / mo
Python
Java
Jasmine (Javascript Testing Framework)
SQL
Agile testing
+4 more

About the Role

We are looking for curious, technically strong, and product-minded Product Engineer Interns to join our team. This internship offers a unique opportunity to work at the intersection of product thinking and software development, giving you hands-on exposure to the full product lifecycle.

As a Product Engineer Intern, you will collaborate with product and engineering teams to understand customer needs, contribute to feature development, and help deliver impactful product solutions. This role is ideal for students or recent graduates who want to build both technical expertise and product understanding in a fast-paced environment.


Key Responsibilities

Product Understanding (Why & What)

  • Assist in conducting customer and market research to understand user pain points and industry trends
  • Support in translating business needs into user stories and functional requirements
  • Help maintain product documentation and feature requirements
  • Assist in tracking product performance metrics and gathering feedback for improvements
  • Participate in brainstorming sessions for product enhancements

Software Development (How)

  • Support development of product features across web, backend, or internal tools
  • Write clean, maintainable, and efficient code under guidance from senior engineers
  • Participate in testing, debugging, and resolving technical issues
  • Contribute to code reviews and technical discussions
  • Help monitor product performance and support issue resolution


Qualifications & Skills

Required

  • Pursuing or recently completed a Bachelor’s degree in Computer Science, IT, Software Engineering, or related field
  • Strong understanding of programming fundamentals, data structures, and algorithms
  • Knowledge of at least one programming language such as Python, Java, JavaScript, or Go
  • Strong problem-solving and analytical thinking skills
  • Good verbal and written communication skills
  • Eagerness to learn in a fast-paced environment
  • Interest in building products that solve real customer problems

Preferred

  • Familiarity with Git/version control
  • Basic understanding of SQL/NoSQL databases
  • Exposure to cloud platforms like Amazon Web Services, Microsoft Azure, or Google Cloud
  • Understanding of Agile/Scrum methodology
  • Personal, academic, or internship projects demonstrating product thinking


Why Join Us?

  • Hands-on Learning: Work on real product features from day one
  • Mentorship: Learn directly from experienced product and engineering leaders
  • Growth: Build skills in both product management and software engineering
  • Impact: Contribute to solutions that directly improve customer experience
  • Collaborative Culture: Work in a learning-focused, innovative environment


Read more
IDEA ELAN

RaginiNaidu Kamineni
Posted by RaginiNaidu Kamineni
Remote only
4.5 - 7.5 yrs
₹15L - ₹20L / yr
ASP.NET
SQL
NOSQL Databases
API
Team Management
+2 more

Backend Developer (4.5 – 7.5 Years Experience)


Company Description:

Idea Elan LLC is a product-based company that provides comprehensive software solutions for research facilities in universities and institutions worldwide.

Please visit www.IdeaElan.com for more information.


Key Responsibilities:

● Design and develop high-performance, scalable, and secure backend APIs and services using .NET Core.

● Work with relational (MS-SQL) and NoSQL (Cosmos DB, MongoDB) databases to create optimized data models and ensure data consistency and performance.

● Participate in code reviews and provide constructive feedback.

● Collaborate with front-end developers and other teams to deliver high-quality software.

● Write clean, maintainable, and efficient code while ensuring quality standards.

● Troubleshoot and debug complex issues, optimizing code for maximum performance and scalability.

● Stay updated with the latest trends in backend development and cloud technologies to drive innovation.

● Optimize database performance and ensure data integrity.


Required Experience:

● 4.5–7.5 years of experience in backend development.

● Strong experience with .NET Core and building RESTful APIs.

● Proficient with MS-SQL and experience working with NoSQL databases like Cosmos DB and MongoDB.

● Hands-on experience with Azure Cloud services (e.g., Azure Functions, Azure Storage, API Management, Azure SQL Database, etc.).

● Understanding of software development principles such as object-oriented programming (OOP), design patterns, and SOLID principles.

● Experience with version control systems such as Git.

● Strong knowledge of asynchronous programming, microservices architecture, and cloud-native application design.

● Familiarity with CI/CD pipelines, containerization (Docker), and deployment automation is a plus.

● Excellent problem-solving and debugging skills.

● Ability to work in an Agile development environment and collaborate with cross-functional teams.

● Good communication and collaboration skills.

Read more
Remote only
3 - 15 yrs
₹8L - ₹12L / yr
FastAPI
skill iconPython
RESTful APIs
SQL
NOSQL Databases
+5 more

Summary:

We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.

Job Type:

Full-Time Contractor (12 months)

Location:

Remote

Experience:

3+ years in backend development

Key Responsibilities:

  • Design, develop, and maintain robust backend services using Python and FastAPI.
  • Implement and manage Prisma ORM for database operations.
  • Build scalable APIs and integrate with SQL databases and third-party services.
  • Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
  • Collaborate with front-end developers and other team members to deliver high-quality web applications.
  • Ensure application performance, security, and reliability.
  • Participate in code reviews, testing, and deployment processes.

Required Skills:

  • Expertise in Python backend development with strong experience in FastAPI.
  • Solid understanding of RESTful API design and implementation.
  • Proficiency in SQL databases and ORM tools (preferably Prisma)
  • Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
  • Familiarity with CI/CD pipelines and containerization (Docker).
  • Knowledge of cloud architecture best practices.

Added Advantage:

  • Front-end development knowledge (React, Angular, or similar frameworks).
  • Exposure to AWS/GCP cloud platforms.
  • Experience with NoSQL databases.

Eligibility:

  • Minimum 3 years of professional experience in backend development.
  • Available for full-time engagement.
  • Please do not apply if you are currently engaged in other projects; we require dedicated availability.
Read more
ONEPOS RETAIL SOLUTIONS PVT LTD
Durga Prasad C
Posted by Durga Prasad C
Bengaluru (Bangalore), Tirupati
2 - 5 yrs
₹3L - ₹6L / yr
skill iconAndroid Development
Model-View-View-Model (MVVM)
skill iconKotlin
skill iconXML
jetpack
+4 more

About the Role

We are looking for a passionate Native Android Developer with strong expertise in Kotlin and modern Android development. The ideal candidate must have hands-on experience with Coroutines & Flow, along with Jetpack components, XML UI, and Room Database.

What You’ll Do

  • Develop and maintain Android applications using Kotlin
  • Build responsive UI using XML layouts
  • Implement modern architecture using Jetpack components (ViewModel, StateFlow/Flow, Navigation)
  • Design and manage local databases using Room
  • Build reactive and scalable apps using Coroutines & Flow
  • Work on offline-first architecture and sync strategies
  • Integrate REST APIs and collaborate with backend teams
  • Optimize app performance for low-memory devices
  • Debug, test, and improve application stability

Must-Have Skills

  • Strong experience in Kotlin & Android SDK
  • Mandatory: Coroutines & Flow (StateFlow / SharedFlow)
  • Hands-on experience with:
  • Jetpack (ViewModel, Navigation, WorkManager)
  • XML UI Design
  • Room Database (Entity, DAO, Migration)
  • Solid understanding of MVVM architecture
  • Experience with REST APIs & Git

Good to Have

  • Experience with Jetpack Compose
  • Knowledge of Firebase (Crashlytics, Analytics)
  • Experience in POS / enterprise applications

Qualification

  • Any Graduate.
  • 2–4 years of experience in Android development.


Regards,

Durga Prasad

Read more
ZakApps software pvt ltd
SindhuPriyaa Arun
Posted by SindhuPriyaa Arun
Chennai
4 - 8 yrs
₹5L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Microservices
Infrastructure management
skill iconKubernetes
+1 more
  • Bachelor’s degree in Computer Science, Web Development, or a related field (or equivalent practical experience).
  • Minimum of 4 to 8 years of professional experience in Java development.
  • Strong proficiency in Java and object-oriented programming.
  • Minimum of 4 years of experience in building microservices with Spring Boot.
  • Solid understanding of RESTful APIs and experience with API design and integration.
  • Strong problem-solving skills and the ability to think critically.
Read more
NeoGenCode Technologies Pvt Ltd
Gurugram, Vadodara
4 - 10 yrs
₹6L - ₹16L / yr
skill iconNodeJS (Node.js)
skill iconPython
skill iconReact.js
skill iconNextJs (Next.js)
RESTful APIs
+10 more

Job Title : Full Stack Developer (Crypto Exchange)

Experience : 4+ Years

Location : Gurugram & Vadodara (On-site)


Role Overview :

We are looking for a Full Stack Developer with strong expertise in both backend and frontend development, along with exposure to crypto exchange systems or fintech platforms.

In this role, you will work on building high-performance, real-time trading applications, contributing to core systems like order execution, pricing engines, and wallet integrations.


Key Responsibilities :

  • Design, develop, and maintain scalable backend services and APIs.
  • Build and optimize responsive frontend applications for trading interfaces.
  • Work on real-time systems such as order books, pricing engines, and trade execution.
  • Integrate with blockchain networks, wallets, and third-party APIs.
  • Ensure platform security, performance, and reliability.
  • Collaborate with product, design, and DevOps teams for end-to-end delivery.
  • Participate in system design, architecture discussions, and code reviews.
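To make the order-book responsibility above concrete, here is a toy Python sketch of price-priority matching; real matching engines add order IDs, time priority at equal prices, partial-fill events, and persistence, so treat this purely as an illustration.

```python
import heapq

class OrderBook:
    """Toy price-priority book: match an incoming buy against resting asks."""
    def __init__(self):
        self._asks = []  # min-heap of (price, quantity)

    def add_ask(self, price, qty):
        heapq.heappush(self._asks, (price, qty))

    def market_buy(self, qty):
        """Consume the cheapest asks first; return the total cost of the fill."""
        cost = 0.0
        while qty > 0 and self._asks:
            price, avail = heapq.heappop(self._asks)
            take = min(qty, avail)
            cost += take * price
            qty -= take
            if avail > take:  # push back the unfilled remainder
                heapq.heappush(self._asks, (price, avail - take))
        return cost

book = OrderBook()
book.add_ask(101.0, 5)
book.add_ask(100.0, 3)
cost = book.market_buy(4)  # fills 3 @ 100, then 1 @ 101
```

A min-heap keyed on price gives O(log n) insertion and best-price retrieval, which is why heap-like structures are a common starting point for matching-engine designs.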


Required Skills & Qualifications :

  • 4+ years of experience in Full Stack Development.
  • Strong expertise in :
  • Backend : Node.js and/or Python
  • Frontend : React.js and/or Next.js
  • Experience with REST APIs and microservices architecture.
  • Strong understanding of databases (MongoDB, PostgreSQL, MySQL, etc.).
  • Hands-on experience with Docker and cloud platforms (AWS preferred).
  • Solid understanding of system design, scalability, and performance optimization.


Preferred (Good to Have) :

  • Experience working with a crypto exchange or trading platform.
  • Understanding of blockchain fundamentals (Ethereum, Bitcoin, etc.).
  • Experience with wallet integrations and on-chain transactions.
  • Familiarity with WebSockets and real-time data streaming.
  • Knowledge of security best practices in fintech/crypto systems.

Why Join Us?

  • Opportunity to work on a high-impact, real-world crypto exchange.
  • Build and scale systems from early-stage to production.
  • Work in a fast-paced, ownership-driven environment.
  • Exposure to cutting-edge blockchain and trading technologies.
Read more
Global Digital Transformation Solutions Provider


Agency job
via Peak Hire Solutions by Dharati Thakkar
Pune
5 - 10 yrs
₹21L - ₹30L / yr
skill iconPython
skill iconMachine Learning (ML)
Generative AI (GenAI)
SQL
skill iconDeep Learning
+11 more

JOB DETAILS:

- Job Title: Lead I - Data Science - Python, Machine Learning, Spark 

- Industry: Global Digital Transformation Solutions Provider

- Experience: 5-10 years

- Job Location: Pune

- CTC Range: Best in Industry

 

JD for Data Scientist

Hands-on experience with data analysis tools:

Proficient in using tools such as Python and R for data manipulation, querying, and analysis.

Skilled in utilizing libraries like Pandas, NumPy, and Scikit-Learn to perform in-depth data analysis and modeling.

 

Skilled in machine learning and predictive analytics:

Expertise in building, training, and deploying machine learning models using frameworks such as TensorFlow and PyTorch.

Capable of performing tasks like regression, classification, clustering, and recommendation, leading to data-driven predictions and insights.

 

Expertise in big data technologies:

Proficient in handling large datasets using big data tools such as Spark.

Skilled in employing distributed computing and parallel processing techniques to ensure efficient data processing, storage, and analysis, enabling enterprise-level solutions and informed decision-making.

 

Skills: Python, SQL, Machine Learning, and Deep Learning, with mandatory expertise in Generative AI.

 

Must-Haves

5–9 years of relevant experience in Python, SQL, Machine Learning, and Deep Learning, with mandatory expertise in Generative AI

 

******

NP - Immediate joiners only

Location-Pune 

Read more
Bell Techlogix
Pemmraju VenkatVandita
Posted by Pemmraju VenkatVandita
Hyderabad
5 - 10 yrs
₹15L - ₹20L / yr
Generative AI
Microsoft Windows Azure
skill iconPython
SQL
Windows Azure
+1 more

The AI Data Engineer will be responsible for designing, building, and operating scalable data pipelines and curated data assets that power machine learning, generative AI, and intelligent automation solutions in an SLA-driven managed services environment. This role focuses on data ingestion, transformation, governance, and operational reliability across cloud and hybrid environments, enabling use cases such as knowledge retrieval (RAG), conversational AI, predictive analytics, and AI-assisted service management. The ideal candidate combines strong data engineering fundamentals with an understanding of AI workload requirements, including quality, lineage, privacy, and performance.

 

Key Responsibilities 

•Design, build, and operate production-grade data pipelines that support AI/ML and generative AI workloads in managed services environments 

•Develop curated, analytics-ready datasets and data products to enable model training, grounding, feature generation, and AI search/retrieval 

•Implement data ingestion patterns for structured and unstructured sources (APIs, databases, files, event streams, documents) 

•Build and maintain transformation workflows with strong testing and validation 

•Enable Retrieval-Augmented Generation (RAG) by preparing document corpora, chunking strategies, metadata enrichment, and vector indexing patterns 

•Integrate data pipelines with application services 

•Support ITSM and enterprise workflow data needs, including ServiceNow data integration, CMDB/incident data quality improvements, and automation enablement 

•Implement observability for data pipelines (monitoring, alerting, SLAs/SLOs) and perform root cause analysis for pipeline failures or data quality incidents 

•Apply data governance and security best practices 

•Collaborate with ML Engineers, DevOps/SRE, and solution architects to operationalize end-to-end AI solutions 

•Contribute to reusable patterns, templates, and standards within the Bell Techlogix AI Center of Excellence 

 

Required Qualifications 

•Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent practical experience 

•5+ years of experience in data engineering, analytics engineering, or platform data operations 

•Strong proficiency in SQL and Python; experience with data modeling and dimensional concepts 

•Hands-on experience with Azure data services (e.g., Data Factory, Synapse, Databricks, Storage, Key Vault) or equivalent cloud tooling 

•Experience building reliable pipelines with scheduling, dependency management, and automated testing/validation 

•Experience supporting production data platforms with incident management, troubleshooting, and root cause analysis 

•Understanding of data security, privacy, and governance principles in enterprise environments 

 

Preferred Qualifications 

•Experience enabling AI/ML workloads: feature engineering, training data preparation, and integration with Azure Machine Learning 

•Experience with unstructured data processing for generative AI 

•Familiarity with vector databases or vector search and RAG patterns 

•Experience with event streaming and messaging 

•Familiarity with ServiceNow data model and integration patterns (Table API, export, CMDB/ITSM reporting) 

•Relevant certifications (Microsoft Azure Data Engineer, Azure AI Engineer, Databricks) 

Read more
Remote only
4 - 8 yrs
Best in industry
skill iconPHP
skill iconJavascript
Artificial Intelligence (AI)
Architecture
SQL

Team - Support Operations — Technical Solutions

Level - IC3 (4–7 years of relevant experience)

Location - India (Remote), IST time zone, with overlap with US East/Central teams

Reports To - Technical Manager

Manages - Not a people-manager role, but a lead role with real technical authority

Employment Type - Full-time

 

ABOUT DELTEK 

Deltek is the leading global provider of software and solutions for project-based businesses — serving government contractors, professional services firms, and architecture & engineering companies. Our products help customers manage the full project lifecycle, from winning work and planning resources to executing delivery and getting paid. 

The Support Operations Technical Solutions team sits inside Deltek's Customer Success organization. We build and maintain the internal tooling, integrations, and AI-powered workflows that allow Deltek's support and customer success teams to operate at scale — intelligent case routing, knowledge-base agents, data pipelines between Salesforce, Gainsight, and Oracle Service Cloud, and automation that removes manual work from high-volume support processes. 

 

THE ROLE 

We are looking for a Senior System Engineer to take technical ownership of our most complex solutions. This is not a management role — it is a senior individual contributor role with real architectural authority and a multiplier effect on the team around you. 

You own problems end-to-end. You design the solution before writing the first line, consider downstream impacts before committing to an approach, and hold the technical bar for the work your team delivers. You are the person a junior engineer turns to when they're stuck, and the person a business stakeholder trusts to tell them whether an idea is feasible and what it will cost to maintain. 

In your first year, you can expect to: 

  • Own the end-to-end design and delivery of major integrations and AI-enabled components from architecture through deployment and post-launch stability 
  • Lead solution design for the team's most complex problems using PHP, JavaScript, Workato, APIs, and Web Services 
  • Evaluate technology and platform tradeoffs and make defensible, documented recommendations that balance short-term delivery with long-term maintainability 
  • Apply AI, automation, and agentic architectures to business problems at production scale — not as experiments, but as shipped systems 
  • Anticipate performance, operational, and security risks before they reach production; design with those constraints in mind from day one 
  • Set engineering standards and review the work of IC1/IC2 engineers, making them better through structured feedback and clear design expectations 
  • Partner directly with CS operations leadership and cross-functional stakeholders to translate ambiguous business needs into concrete technical strategies 

 

This role suits an engineer who is past proving they can build things, and is now focused on building the right things in the right way — and helping others do the same. 

 

WHAT WE'RE LOOKING FOR 

Must-Have Technical Skills 

  • PHP and JavaScript (production depth): You have designed and shipped non-trivial systems in these languages. You understand performance characteristics, know where the footguns are, and write code you'd be comfortable having reviewed by a senior peer. 
  • Integration architecture: You have designed system-to-system integrations — not just consumed APIs. You understand data flow, transformation logic, error handling, retry strategies, and idempotency. 
  • AI / LLM applied experience: You have built or led the build of AI-assisted workflows, LLM-based tools, or agentic systems in an operational or product context. You know the difference between a demo and a production-grade AI system. 
  • Relational databases (query and schema design): You write optimized SQL, design schemas with long-term maintainability in mind, and understand when a query will cause production problems before it does. 
  • Full-stack troubleshooting at depth: You can diagnose complex, multi-layer issues — across front-end, API, back-end, and database — and trace the root cause without being handed a reproduction case. 
  • Technical tradeoff analysis: When evaluating tools, platforms, or approaches, you can articulate the tradeoffs clearly — not just pick what you know best — and document the rationale in a way that holds up six months later. 
  • Agile technical leadership: You have led technical workstreams in a sprint-based environment: broken down epics, written meaningful acceptance criteria, and been accountable for team delivery quality. 
  • Documentation and design artifacts: You produce architecture diagrams, solution designs, and technical decision records that others can act on — not just notes for yourself. 
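The integration-architecture bullet above calls out retry strategies and idempotency. As a rough illustration of what that means in practice, here is a minimal Python sketch of an idempotent delivery wrapper with exponential backoff; all names are hypothetical, and the in-memory set stands in for a durable dedup store:

```python
import time

def deliver_with_retry(send, payload, *, key, seen, retries=3, base_delay=0.01):
    """Deliver `payload` at most once per idempotency `key`, retrying on failure.

    `send` is the (hypothetical) downstream call; `seen` is a set of keys
    already delivered, standing in for a durable dedup store.
    """
    if key in seen:                  # idempotency: a replayed key is a no-op
        return "duplicate"
    for attempt in range(retries):
        try:
            send(payload)
            seen.add(key)            # record success so later replays dedup
            return "delivered"
        except ConnectionError:
            if attempt == retries - 1:
                raise                # out of retries: surface the failure
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff
```

A caller that replays the same event key after a success gets "duplicate" back instead of a second delivery, which is the property that makes at-least-once transports safe.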

Must-Have Leadership & Soft Skills 

  • Technical mentorship: You actively make the engineers around you better. Code reviews are teaching opportunities, not gatekeeping. Design reviews are conversations, not approvals. 
  • Stakeholder communication: You can translate a technical constraint into a business impact, and a business requirement into a technical specification. You don't hide behind jargon or over-simplify to avoid hard conversations. 
  • Ownership under ambiguity: When a problem is poorly defined, you ask the right questions to define it — then own the answer. You don't wait for complete requirements before starting to think. 
  • Proactive risk management: You raise issues before they become incidents. You've learned from production failures and carry those lessons into design decisions. 
  • Business context awareness: You understand how the systems you build affect end users and business operations. You've made engineering decisions informed by that context, not just by technical preference. 

Nice-to-Have Skills 

Prioritized by relevance to this team's current and near-term roadmap: 

Oracle Service Cloud 

Workato / iPaaS 

Salesforce 

Gainsight 

Agentic AI / LLM Ops 

Snowflake 

Microsoft Power BI 

Microsoft Power Apps 

Cloud-native development 

 

Experience designing agentic AI systems — not just integrating LLM APIs — is highly relevant to where this team is going. Candidates who have shipped multi-step agent architectures with tool-calling, memory, and guardrails will stand out. 

 

RESPONSIBILITIES 

Design & Architecture 

  • Own end-to-end technical solution design — from requirements through architecture, implementation, and post-launch stability — for the team's most complex initiatives 
  • Lead solution design using PHP, JavaScript, Workato, APIs, and Web Services; ensure solutions are scalable, maintainable, and aligned with established governance standards 
  • Evaluate tradeoffs across tools, platforms, and architectural patterns; produce documented recommendations that account for both short-term delivery needs and long-term operational cost 
  • Anticipate downstream impacts, performance bottlenecks, and operational risk during the design phase — not as an afterthought 
  • Author and maintain Architecture Decision Records (ADRs) and technical design documents for all major solution components 

AI, Automation & Integration 

  • Apply AI, automation, and agentic architectures to complex business problems at production scale — designing for reliability, observability, and graceful failure 
  • Lead the integration of AI-enabled components (LLM workflows, intelligent routing, agentic tools) into the team's operational platform 
  • Design and oversee integrations between Deltek's CS platforms (Oracle Service Cloud, Salesforce, Gainsight) and internal data systems, ensuring data integrity, performance, and auditability 
  • Evaluate new AI frameworks, LLM providers, and automation platforms; provide grounded, implementation-level recommendations rather than theoretical assessments 

Technical Leadership & Mentoring 

  • Serve as the primary technical reviewer for IC1/IC2 engineers — conducting structured code and design reviews that build capability, not just ship code 
  • Break down complex initiatives into well-scoped workstreams that junior engineers can execute with confidence and appropriate independence 
  • Establish and enforce engineering standards: code quality, documentation, testing coverage, deployment practices, and incident response 
  • Identify skill gaps in the team and work with the manager to address them through pairing, documentation, or structured learning 

Stakeholder & Cross-functional Engagement 

  • Translate ambiguous business and operational requirements from CS leadership into concrete technical strategies with clear milestones and measurable outcomes 
  • Engage directly with senior stakeholders — CS operations leads, product owners, IT — to align on priorities, surface risks, and manage technical expectations 
  • Represent the technical perspective of the team in cross-functional planning and architecture discussions 

Operate & Improve 

  • Own post-launch stability of solutions you design: monitor, respond to incidents, and drive root-cause resolution — not just mitigation 
  • Drive continuous improvement of the team's delivery practices: identify process friction, propose solutions, and follow through on implementation 
  • Stay current on AI, automation, and integration technology evolution; bring relevant advances back to the team with a concrete point of view on applicability 

 

QUALIFICATIONS 

  • Education: Bachelor's degree in Computer Science, Electrical or Electronics Engineering, or a related technical discipline. Equivalent demonstrated experience considered. 
  • Experience: 4–7 years of hands-on experience in software engineering, systems integration, or closely related work, with at least 2 years at a level where you have owned technical design decisions — not just implemented them. 
  • Coding evidence: A portfolio, GitHub profile, architecture document, or production system you can speak to in depth. At IC3, we expect you to be able to walk through a non-trivial design decision you made and defend the tradeoffs. 
  • AI / ML: Practical, production-level experience with LLMs or AI tooling — not just prompt engineering or personal experimentation. Familiarity with frameworks such as LangChain, OpenAI APIs, or similar platforms is a strong plus. 
  • Collaboration model: Comfortable working as a technical authority in a distributed team. The role requires regular IST overlap with US East/Central stakeholders (approximately 6:30 PM – 10:30 PM IST for at least part of the week). 
  • Language: Strong written and spoken English. At IC3, much of your influence operates through written design documents, async reviews, and stakeholder communications. Precision in writing matters. 

 

 

WHAT TO EXPECT WORKING HERE 

  • Technical authority with real impact — your design decisions ship to production and affect how thousands of Deltek customers experience support 
  • Exposure to production AI/agentic systems and direct involvement in shaping where the team's AI roadmap goes next 
  • A team where senior engineers are trusted to lead, not managed step-by-step — you will have autonomy commensurate with your accountability 
  • Structured growth path: IC3 engineers who demonstrate architectural leadership and cross-functional influence have a clear track toward Staff or Associate Director scope 
  • Regular 1:1s, design review forums, and a manager who will invest in your growth rather than just your output 


Remote only
1 - 4 yrs
Best in industry
PHP
JavaScript
AI Coding Tools
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
+1 more

 

Team - Support Operations — Technical Solutions 

Level - IC2 (1–3 years of relevant experience) 

Location - India (Remote) — IST time zone, with overlap with US East/Central teams 

Reports To - Tech Manager 

Employment Type - Full-time 

 

ABOUT DELTEK 

Deltek is the leading global provider of software and solutions for project-based businesses, serving government contractors, professional services firms, and architecture & engineering companies. Our products help customers manage the full project lifecycle — from winning work and planning resources to executing delivery and getting paid. 

The Support Operations Technical Solutions team sits inside Deltek's Customer Success organization. We build and maintain the internal tooling, integrations, and AI-powered workflows that enable Deltek's support and customer success teams to operate at scale — think intelligent case routing, knowledge-base agents, data pipelines between Salesforce, Gainsight, and Oracle Service Cloud, and automation that removes manual work from high-volume support processes. 

THE ROLE 

We are looking for a System Engineer (IC2) to join our Technical Solutions team based in India. This is a hands-on engineering role: you will build, integrate, and support the systems that power our customer-facing and internal support operations. 

In your first year, you can expect to: 

  • Build and maintain integrations between support platforms (Oracle Service Cloud, Salesforce, Gainsight) using PHP, JavaScript, and Workato 
  • Contribute to AI-assisted workflow automation — including LLM-based tools and intelligent routing solutions already in production 
  • Write and optimize SQL queries against our operational data stores to power dashboards, reports, and automated triggers 
  • Troubleshoot issues across the full stack (front-end, API layer, back-end logic, and database) and document root-cause findings 
  • Work in a sprint-based environment alongside engineers, CS operations leads, and product stakeholders across the US and India  

This role is well-suited for someone who is early in their career but already has real project or production experience. You will work with guidance from senior engineers while taking genuine ownership of defined workstreams. The expectation is not that you know everything on day one — it is that you are technically curious, structured in your thinking, and driven to ship things that work. 

 

WHAT WE'RE LOOKING FOR 

Must-Have Technical Skills 

  • PHP and JavaScript: Hands-on experience building or maintaining web applications, APIs, or internal tools. You have written code that went somewhere beyond your laptop. 
  • REST/SOAP APIs and Web Services: You understand how system-to-system data flows work and have built or consumed integrations in a real context. 
  • Relational databases and SQL: You can write optimized queries, understand joins and indexes, and are comfortable reading a schema you didn't design. 
  • Full-stack troubleshooting: When something breaks, you know how to methodically trace the issue across front-end, back-end, and database layers — not just escalate it. 
  • Documentation: You can translate what you built into clear written artifacts — requirements, workflow diagrams, solution designs — that a non-engineer can follow. 
  • Agile/sprint delivery: You have worked in a structured sprint environment and are comfortable with ceremonies, tickets, and incremental delivery. 

Must-Have Soft Skills 

  • Root-cause orientation: You don't patch symptoms and move on. You want to understand why something broke before deciding how to fix it. 
  • Self-driven with good judgment: You can manage your own time on a defined problem, identify when you're stuck and need input, and flag risks before they become blockers. 
  • Clear communicator across audiences: You can explain a technical problem to a non-technical stakeholder and a design decision to a senior engineer — in writing and in a call. 
  • Collaborative: You work well with people you've never met in person, across time zones, and with stakeholders who don't share your technical background. 

Nice-to-Have Skills 

The following are not required for the role, but candidates with depth in any of these areas will stand out. Listed in rough order of relevance to this team's current work: 

 

Oracle Service Cloud 

Workato / iPaaS 

Salesforce 

Gainsight 

AI / LLM integration 

Snowflake 

Microsoft Power BI 

Microsoft Power Apps 

Cloud-native development 

 

Experience with AI tools (GitHub Copilot, LLM APIs, automation agents) used in an operational or product context — not just personal experimentation — is a genuine plus for this team.

 

RESPONSIBILITIES 

At the IC2 level, you will primarily execute within defined frameworks and grow your independent scope over time. The following reflects what you will own and contribute to: 

 

Build & Integrate 

  • Build and maintain AI-enabled workflows, platform integrations, and internal tools using PHP, JavaScript, Workato, and Web Services 
  • Develop prototypes and proofs of concept; contribute to production deployments under senior guidance 
  • Implement and test integrations between Deltek's support platforms and internal data systems 

Analyse & Solve 

  • Break down defined problems into actionable tasks; identify risks, dependencies, and edge cases before they surface in production 
  • Troubleshoot complex issues across the full stack and document root cause findings clearly 
  • Investigate stakeholder-reported issues to identify whether the problem is technical, process-related, or both 

 

Operate & Improve 

  • Follow established governance, architecture, and deployment processes; raise improvement suggestions through proper channels 
  • Write and maintain documentation for systems, workflows, business rules, and solution designs 
  • Participate actively in sprint ceremonies; manage your own tasks and flag blockers early 
  • Demonstrate continuous learning in AI, automation, and integration technologies — this space moves fast and curiosity is part of the job 

 

QUALIFICATIONS 

  • Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related technical discipline. Equivalent practical experience considered. 
  • Experience: 1–3 years of hands-on experience in software engineering, systems integration, or a closely related field. Internship and co-op experience counts if it involved real production systems. 
  • Coding: Demonstrable PHP and/or JavaScript experience — a portfolio, GitHub profile, or code sample you can speak to will strengthen your application. 
  • Collaboration model: Comfortable working remotely with distributed teams. The role requires regular overlap with US East/Central time zones (approximately 6:30 PM – 10:30 PM IST for at least part of the week). 
  • Language: Strong written and spoken English is essential — much of the collaboration with stakeholders and senior engineers happens asynchronously in writing. 

 

WHAT TO EXPECT WORKING HERE 

  • A small, technically-focused team where your work is visible and your contributions are directly tied to outcomes customers feel 
  • Exposure to production AI/LLM systems, not just theoretical discussions about AI 
  • A culture that values root-cause thinking and good documentation over heroics and quick fixes 
  • Growth path: engineers who demonstrate technical depth and ownership at IC2 have a clear track toward IC3 (mid-level) scope within 18–24 months 
  • Regular 1:1s and structured feedback — this team invests in making you better, not just keeping you busy 


ACTOSOFT
priyanka sharma
Posted by priyanka sharma
Surat
0 - 1 yrs
₹1.2L - ₹2.5L / yr
DotNetNuke
SQL
ASP.NET MVC

Actosoft is a software development company that offers complete IT solutions.

We are a collective of focused, energetic, talented, and hardworking professionals who believe in getting things done at the highest level. Our team aims to innovate, be authentic and grow in everything that we do.

The ideal candidate should be familiar with the complete software design life cycle. In addition, they should have experience in designing, coding, testing and consistently managing applications. They should be comfortable coding in multiple languages and be able to test codes to maintain high-quality coding.

Job details:

 

Job Location: Actosoft, Gajera Rd, beside Avalon Business Hub, Katargam, Surat, Gujarat 395004

Experience: 0 to 1 year

Salary: ₹10,000 to ₹20,000 per month

Job Type: Full-time – Work from Office

Working Schedule:

 

9:00 am to 6:00 pm (Monday to Friday)

Alternate Saturdays Off.

Job Responsibilities:

 

●       Design, code, test, and manage various applications

●       Collaborate with the engineering team and product team to establish the best products

●       Follow outlined standards of quality related to coding and systems

 

●       Develop automated tests and conduct performance tuning

●       Ability to create and support documentation for all new applications

 

●       Willing to work as a team member

Qualifications:

 

●       Bachelor's degree in Computer Science or relevant field, like MCA, BCA, or BE

●       Experience developing web-based applications in C#, HTML, VBScript/ASP, and .NET

●       Experience working with MS SQL Server and MySQL

●       Knowledge of practices and procedures for the full software design life cycle

●       Experience working in an agile development environment

 

Required Skills:

●       .NET Framework

●       C#

●       Microsoft SQL Server

●       JavaScript

●       jQuery

●       ASP.NET MVC

●       ASP.NET Web API

●       HTML

●       WCF Services

●       PL/SQL

●       Angular

●       Entity Framework

●       CSS

●       Ajax

●       XML

 

Perks and Benefits:

1. Evaluation for Bonus and Promotion every year.

2. Incredible opportunity to diversify your technical skills by working with experts on unique projects.

 

Website:

http://www.actosoft.in/

Industry

  • Computer Software

Employment Type

Full-time


 

Read more
Applix

Ariba Khan
Posted by Ariba Khan
Hyderabad
4 - 7 yrs
Up to ₹18L / yr (varies)
SQL
Relational Database (RDBMS)
Database Design
Troubleshooting

Job Summary

We are looking for a strong SQL Developer to support a US-based client from our Applix offshore delivery center. This role requires a self-sufficient engineer who can independently manage SQL development, database troubleshooting, data fixes, query optimization, backend support for application changes, and support customizations and implementations tied to business needs.

The ideal candidate should be comfortable working closely with application teams and business stakeholders to understand data flows, support development needs, and resolve production issues with minimal supervision.


Shift:

  • Second shift / US overlap
  • Regular working hours will extend up to 11:30 PM IST on certain business days.


Required Skills

  • Strong hands-on experience in SQL development
  • Strong experience with stored procedures, views, functions, joins, indexing, and performance tuning
  • Good experience in data analysis, troubleshooting, and backend support
  • Ability to write efficient, scalable, and maintainable SQL code
  • Experience supporting production issues and implementing fixes independently
  • Good understanding of database design principles and data integrity
  • Ability to work with application teams on customization and implementation needs
  • Strong communication and problem-solving skills
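The indexing and performance-tuning skills above can be made concrete with a small sketch. The example below uses SQLite (via Python's standard library) purely for portability; the table and index names are invented, but the before/after query-plan comparison is the same exercise you would run with an execution plan on a production RDBMS:

```python
import sqlite3

# Build a small table to tune against (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); join the detail text.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(q)   # full table scan: every row is read
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(q)    # index search: only matching rows are touched
print(before)      # e.g. "SCAN orders"
print(after)       # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)"
```

Reading the plan before and after a schema change is the habit behind "understand when a query will cause production problems before it does."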


Preferred Skills

  • Experience supporting ERP applications, preferably manufacturing-related systems
  • Experience with data migration, ETL, reporting, or interface support
  • Exposure to QAD or similar ERP environments
  • Experience in a client-facing offshore support model

 

Key Traits

  • Self-sufficient and dependable
  • Strong analytical mindset
  • Able to independently own issues from analysis to closure
  • Comfortable working in extended overlap with US teams
  • Able to manage priorities with minimal supervision
  • Excellent verbal and written communication skills, with the ability to clearly document findings, explain data/database issues, and provide timely updates to US-based client teams
  • Strong ownership and proactive follow-through, with the ability to independently analyze, troubleshoot, optimize, and close SQL/data-related issues without constant direction


Key Responsibilities

  • Develop, maintain, and optimize SQL queries, stored procedures, functions, views, and backend database objects
  • Support application customizations and implementations through database development and data-level troubleshooting
  • Analyze and resolve production issues related to data, performance, and SQL logic
  • Perform query tuning and performance optimization for existing and new database objects
  • Support data extraction, transformation, validation, and migration activities
  • Work closely with QAD/application teams to support enhancements, integrations, and issue resolution
  • Assist in deployment, testing, and stabilization of new changes
  • Perform root cause analysis for database and data-related issues
  • Maintain technical documentation for database changes, fixes, and support activities
  • Provide reliable offshore support during second shift with timely communication and status updates
Applix

Ariba Khan
Posted by Ariba Khan
Bangalore, India
3 - 9 yrs
₹20L - ₹37L / yr
Time series
SQL
Neural networks
Snowflake schema
ETL

Responsibilities:

Use quantitative methods such as business simulations, data mining, modeling, and advanced statistical techniques to solve problems. The Data Scientist contributes by serving as a technical lead for analytics initiatives of low‑to‑medium complexity or business impact and supporting high‑profile, enterprise initiatives such as the Engineered Value Chain. 


In this role, you will act as an individual contributor on analytic teams, partnering on cross‑functional projects, and guiding technical delivery. You will also mentor procurement professionals on the technical approaches used to solve problems presented by business units, service organizations, dealers, or customers. 

 

Job duties/Responsibilities include but not limited to:

  • Lead and deliver analytics initiatives by defining analytical approaches, building models, and translating insights into business actions for procurement and enterprise stakeholders. 
  • Develop, train, validate, and monitor predictive models using a broad set of machine learning/statistical methods to support targeted business outcomes. 
  • Design and implement ETL/data pipelines and integrate data sources to create safe, trusted datasets for reporting and analytics (including Snowflake and SQL-based workflows). 
  • Build executive-ready dashboards and decision tools (e.g., Power BI) that enable data‑driven leadership decisions. 
  • Apply data modelling best practices (conceptual, logical, physical models) and support integration/transformation patterns to analytics environments and warehouses. 
  • Partner cross‑functionally (Procurement, Digital/IT, Finance, operations stakeholders) to deploy analytics solutions into production and ensure adoption. 
  • Operate with strong data governance and operational rigor, including troubleshooting data issues, managing access/user needs, and supporting reliable analytics operations. 
  • Use modern engineering practices (e.g., GitLab/DevOps toolchains) to improve repeatability, scalability, and maintainability of analytics solutions. 

 

Must Have Skills

  • Strong AI/ML background across model development and validation, including methods such as time series, clustering, tree-based algorithms, generalized linear models, or neural networks.                           
  • Strong SQL + Snowflake proficiency for ETL, transformation, and analytics-ready datasets. 
  • Experience with cloud solutions, solution integration, IT operations, and data governance. 
  • Proficiency in Python programming language.
  • Proficiency in Prompt Engineering.
  • Experience with front-end technologies such as HTML, CSS, and JavaScript.
  • Experience with back-end technologies such as Django, Flask, or Node.js.
  • Solid grasp of database technologies such as MySQL, PostgreSQL, or MongoDB.
  • Creation of CI/CD pipelines for ML training and prediction workflows
  • Proficiency in Machine Learning Operations.
  • Work with business teams and data scientists to ensure value realization and ROI from operationalized models
  • Strong understanding of statistical analysis and machine learning algorithms.
  • Containerization, packaging of ML models
  • Experience with data visualization tools such as Power BI.
  • Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
  • Experience with Large Language Models (LLMs) and Natural Language Processing (NLP) technologies.
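The data-munging bullet above (cleaning, transformation, normalization) can be sketched in a few lines. This is a toy example using only the standard library; the column names and rules are made up for illustration:

```python
# Hypothetical raw records with the usual problems: stray whitespace,
# inconsistent casing, formatted numbers stored as strings, missing measures.
raw = [
    {"region": " North ", "sales": "1,200"},
    {"region": "south",   "sales": "800"},
    {"region": "North",   "sales": None},
]

def clean(rows):
    out = []
    for r in rows:
        if r["sales"] is None:   # drop records with a missing measure
            continue
        out.append({
            "region": r["region"].strip().lower(),        # normalize labels
            "sales": float(r["sales"].replace(",", "")),  # coerce to numeric
        })
    return out

print(clean(raw))  # [{'region': 'north', 'sales': 1200.0}, {'region': 'south', 'sales': 800.0}]
```

The same steps (filter, normalize, coerce) are what keep downstream aggregates trustworthy, whatever tool actually runs them.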

 

Good To Have Skills:

  • Experience with cloud-first and agile methodologies.

 

 

Required Qualification: 

  • Degree in Computer Science, Business, Mathematics, Economics, Statistics, Engineering, or related field.  
Hashone Career
Madhavan I
Posted by Madhavan I
Bengaluru (Bangalore)
3 - 8 yrs
₹20L - ₹28L / yr
SQL
Python
AtScale

Summary:

Data Engineer/Analytics Engineer with experience in semantic layer modeling using AtScale, building scalable data pipelines, and delivering high-performance analytics solutions on cloud platforms.




 Responsibilities

• Build and maintain ETL/ELT pipelines for large-scale data

• Develop semantic models, cubes, and metrics in AtScale

• Optimize query performance and BI dashboards

• Integrate data platforms (Snowflake, Databricks, BigQuery)

• Collaborate with analysts and business teams




 Skills

• SQL, Python/Scala

• Data modeling (star schema, OLAP)

• AtScale (semantic layer)

• Spark, dbt, Airflow

• BI tools (Tableau, Power BI, Looker)

• AWS / GCP / Azure
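The star-schema modeling called out above boils down to fact tables joined to conformed dimensions. A minimal sketch (SQLite via Python for portability; table names and data are invented) of the kind of rollup a semantic layer such as AtScale would expose as a metric:

```python
import sqlite3

# One fact table keyed to one dimension: the smallest possible star schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (product_id INT REFERENCES dim_product, amount REAL);
INSERT INTO dim_product VALUES (1, 'hardware'), (2, 'software');
INSERT INTO fact_sales  VALUES (1, 100.0), (1, 50.0), (2, 200.0);
""")

# Roll the additive measure up by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('hardware', 150.0), ('software', 200.0)]
```

A semantic model essentially names this join path and measure once so every BI tool aggregates it the same way.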



 Experience

• 3–8+ years in data/analytics engineering

• Experience with enterprise data platforms and BI systems

ARDEM Incorporated
Remote only
8 - 12 yrs
₹9L - ₹12L / yr
Project delivery
Software Development
Project Management
Team Management
.NET
+10 more

Senior Project Owner / Project Manager - Technology


Department - Technology / Software Development

Work Mode - Work From Home (WFH), Full Time

Experience - Minimum 10 Years (Development Background)

Time Zone - Candidate should be comfortable working in US time zone overlap and attending client calls accordingly.


ROLE SUMMARY

We are looking for a seasoned Senior Project Owner / Project Manager with a strong development foundation to lead our technology initiatives. This role bridges client management and technical execution: you will own end-to-end delivery of multiple concurrent projects while supporting a high-performing remote team.


KEY RESPONSIBILITIES

Project & Delivery Management

  • Own and manage multiple concurrent technology projects from initiation to production release
  • Define project scope, timelines, milestones, and resource allocation plans
  • Distribute tasks effectively across a team of developers, QA, and support engineers
  • Track assigned work daily, follow up on progress, and proactively remove blockers
  • Ensure all projects meet deadlines and quality benchmarks without compromise
  • Participate actively in production activities and take full accountability for live deployments


US Client Management

  • Serve as the Technology single point of contact for all assigned US clients
  • Attend and lead client calls focused on ARDEM technical solutions, including discussions related to prospective or existing clients (US time zone overlap required)
  • Resolve client queries, manage escalations, and ensure high client satisfaction
  • Showcase company-developed applications and software demos confidently to clients
  • Translate complex client requirements into clear technical deliverables for the team


Team Leadership

  • Lead, mentor, and performance-manage a distributed remote team of technical members
  • Foster accountability, ownership, and a high-delivery culture within the team
  • Conduct sprint planning, stand-ups, retrospectives, and performance reviews
  • Identify skill gaps and work with HR/training teams to bridge them


Process & Operations

  • Deeply understand ARDEM's internal processes and align project execution accordingly
  • Ensure development standards and best practices are followed across all projects
  • Manage crisis situations with composure, identify root causes and drive swift resolution
  • Coordinate with cross-functional teams including HR, Operations, Training, and QA
  • Maintain project documentation, status reports, and risk registers


REQUIRED EXPERIENCE

  • 10+ years of total experience in software development and project management
  • 5–7 years of hands-on coding experience in one or more technologies listed below
  • 2–3 years in a team management or tech lead role overseeing 5+ members
  • Proven experience managing multiple simultaneous projects in a remote/WFH environment
  • Prior experience working with US-based clients, with a strong understanding of US work culture and expectations


TECHNICAL SKILLS

  • Python: scripting, automation, data processing, backend services
  • JavaScript / Node.js: server-side development, REST APIs, async workflows
  • .NET Core: enterprise application development and service integration
  • SQL Databases: query optimization, schema design, stored procedures
  • Familiarity with CI/CD pipelines, Git workflows, and deployment processes
  • Ability to review code, understand architectural decisions, and guide the team technically


SKILLS & COMPETENCIES

  • Exceptional verbal and written communication skills in English; client-facing confidence is a must
  • Strong crisis management and conflict resolution ability under tight deadlines
  • Highly organized with a structured approach to planning, prioritization, and execution
  • Self-driven and accountable, capable of operating independently in a remote environment
  • Strong presentation skills: able to demo software to non-technical stakeholders
  • Empathetic leadership style with the ability to motivate and align diverse team members


QUALIFICATIONS

  • Bachelor's or Master's degree in Computer Science
  • PMP Certification: Preferred (candidates without PMP must demonstrate equivalent project management rigor)
  • Agile / Scrum certifications (CSM, PMI-ACP) are an added advantage


LOCATION PREFERENCE

  • Candidates must be based in a Tier-1 city: Mumbai, Delhi NCR, Bengaluru, Hyderabad, Chennai, Pune, or Kolkata
  • This is a full-time Work From Home role: reliable internet, a dedicated workspace, and availability during US business hours are mandatory


ABOUT ARDEM

ARDEM Incorporated is a leading Business Process Outsourcing (BPO) and Automation company serving US-based clients across diverse industries. Our Technology Team builds and maintains in-house applications that power data processing pipelines, automation workflows, internal platforms, and domain-specific training modules, all engineered to deliver operational excellence at scale. To our clients, we provide cloud-based platforms that assist in their day-to-day business analytics. Our cloud services focus on finance, logistics, and utility management.

Read more
BigThinkCode Technologies
Divya Mohandass
Posted by Divya Mohandass
Chennai
4 - 6 yrs
₹7L - ₹16L / yr
SQL
Data engineering
Google BigQuery
Google Cloud Platform (GCP)
Data modeling
+1 more

About the role:

We are looking for a skilled Data Engineer with hands-on expertise in Dagster orchestration, or in GCP with BigQuery and Apache Airflow, along with modern data pipeline development and architecture implementation. The ideal candidate will design, build, and optimize scalable data pipelines, backed by strong SQL proficiency and data modelling expertise.


Key Responsibilities

• Design, develop, and maintain scalable data pipelines using Dagster.

• Build and manage Dagster components such as:

o  Ops / Assets

o  Schedules

o  Sensors

o  Jobs

o  Resource definitions

• Implement and maintain Medallion Architecture (Bronze, Silver, Gold layers).

• Write optimized and production-grade SQL scripts for transformations and data validation.

• Expertise in GCP, BigQuery, and Apache Airflow is a must if you are not familiar with Dagster orchestration.
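The Bronze/Silver/Gold layering above can be illustrated with a minimal sketch in plain Python. All record fields and source names here are hypothetical; in practice these steps would run as Dagster assets or BigQuery SQL, not hand-rolled functions.

```python
# Conceptual sketch of Medallion (Bronze/Silver/Gold) layering.
# Field names and the source tag are hypothetical illustrations.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagging lineage metadata."""
    return [dict(r, _source="orders_api") for r in raw_rows]

def silver_clean(bronze_rows):
    """Silver: validate, deduplicate, and standardize types."""
    seen, out = set(), []
    for r in bronze_rows:
        if r.get("order_id") is None:
            continue                      # drop invalid records
        if r["order_id"] in seen:
            continue                      # deduplicate on business key
        seen.add(r["order_id"])
        out.append({**r, "amount": float(r["amount"])})
    return out

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate ready for consumption."""
    totals = {}
    for r in silver_rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

raw = [
    {"order_id": 1, "customer": "acme", "amount": "10.5"},
    {"order_id": 1, "customer": "acme", "amount": "10.5"},   # duplicate
    {"order_id": None, "customer": "x", "amount": "3"},      # invalid
    {"order_id": 2, "customer": "acme", "amount": "4.5"},
]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'acme': 15.0}
```

Each layer only reads from the one below it, which is the property that makes the architecture easy to validate and replay.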


Must Have

• 4+ years of experience in Data Engineering.

• Strong hands-on experience with workflow orchestration (Dagster experience optional).

• Strong hands-on experience with GCP, BigQuery, and Apache Airflow.

• Solid understanding of data pipeline design patterns.

• Experience implementing Medallion Architecture.

• Advanced SQL skills (complex joins, CTEs, performance tuning).

• Experience working with GCP cloud data platform.


Why Join Us:

• Collaborative work environment.

• Exposure to modern tools and scalable application architectures.

• Medical cover for employee and eligible dependents.

• Tax beneficial salary structure.

• Comprehensive leave policy.

• Competency development training programs.

Read more
Searce Inc

at Searce Inc

3 recruiters
Karthika Senthilkumar
Posted by Karthika Senthilkumar
Coimbatore
7 - 10 yrs
Best in industry
Data engineering
skill iconPython
SQL
Google Cloud Platform (GCP)

Who are we ?


Searce means ‘a fine sieve’ & indicates ‘to refine, to analyze, to improve’. It signifies our way of working: To improve to the finest degree of excellence, ‘solving for better’ every time. Searcians are passionate improvers & solvers who love to question the status quo.


The primary purpose of all of us, at Searce, is driving intelligent, impactful & futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, everyday.


Tech Superpowers


End-to-End Ecosystem Thinker: You build modular, reusable data products across ingestion, transformation (ETL/ELT), and consumption layers. You ensure the entire data lifecycle is governed, scalable, and optimized for high-velocity delivery.


The MDS Architect: You reimagine business with the Modern Data Stack (MDS) to deliver Data Mesh implementations and real value. You treat every dataset as a measurable "Data Product" with a clear focus on ROI and time-to-insight.


Distributed Compute & Scale Savant: You craft resilient architectures that survive petabyte-scale volume and data skew without "breaking the bank." You prove your designs with cost-performance benchmarks, not just slideware.


AI-Ready Orchestrator: You engineer the bridge between structured data and Unstructured/Vector stores. By mastering pipelines for RAG models and GenAI, you turn raw data into the fuel for intelligent, automated workflows.


The Quality Craftsman (Builder @ Heart): You are an outcome-focused leader who lives in the code. From embedding GDPR/PII privacy-by-design to optimizing SQL, Python, and Spark daily, you ensure integrity is baked into every table.


Experience & Relevance


Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.


Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.


AI-Native Workflow: You don't just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.


Architectural Portfolio: Evidence of leading 2-3 large-scale transformations, including platform migrations, data lakehouse builds, or real-time analytics architectures.


Mentor & Elevate the Squad: Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling, and ensure the squad adopts modern engineering standards like CI/CD for data.


Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO's business vision into a Lead Engineer's technical specification without losing anything in translation.


The "Solver" Mindset: A track record of solving 'impossible' data problems, whether it's fixing massive data skew, optimizing spiraling cloud costs, or architecting 99.9% available data services.



Read more
Searce Inc

at Searce Inc

3 recruiters
Vaivashhya VN
Posted by Vaivashhya VN
Coimbatore
7 - 10 yrs
Best in industry
Data engineering
Data migration
Datawarehousing
ETL
SQL
+6 more

Who are we ?


Searce means ‘a fine sieve’ & indicates ‘to refine, to analyze, to improve’. It signifies our way of working: To improve to the finest degree of excellence, ‘solving for better’ every time. Searcians are passionate improvers & solvers who love to question the status quo.


The primary purpose of all of us, at Searce, is driving intelligent, impactful & futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, everyday.


Tech Superpowers


End-to-End Ecosystem Thinker: You build modular, reusable data products across ingestion, transformation (ETL/ELT), and consumption layers. You ensure the entire data lifecycle is governed, scalable, and optimized for high-velocity delivery.


The MDS Architect: You reimagine business with the Modern Data Stack (MDS) to deliver Data Mesh implementations and real value. You treat every dataset as a measurable "Data Product" with a clear focus on ROI and time-to-insight.


Distributed Compute & Scale Savant: You craft resilient architectures that survive petabyte-scale volume and data skew without "breaking the bank." You prove your designs with cost-performance benchmarks, not just slideware.


AI-Ready Orchestrator: You engineer the bridge between structured data and Unstructured/Vector stores. By mastering pipelines for RAG models and GenAI, you turn raw data into the fuel for intelligent, automated workflows.


The Quality Craftsman (Builder @ Heart): You are an outcome-focused leader who lives in the code. From embedding GDPR/PII privacy-by-design to optimizing SQL, Python, and Spark daily, you ensure integrity is baked into every table.


Experience & Relevance


Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.


Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.


AI-Native Workflow: You don't just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.


Architectural Portfolio: Evidence of leading 2-3 large-scale transformations, including platform migrations, data lakehouse builds, or real-time analytics architectures.


Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO's business vision into a Lead Engineer's technical specification without losing anything in translation.


The "Solver" Mindset: A track record of solving 'impossible' data problems, whether it's fixing massive data skew, optimizing spiraling cloud costs, or architecting 99.9% available data services.

Read more
Arcis India
Sarita Jena
Posted by Sarita Jena
Mumbai
6 - 8 yrs
₹12L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Quarkus
Microservices
Webservices
+17 more

6+ years of hands-on development experience and in-depth knowledge of Java, Spring, Spring Boot, and Quarkus; front-end technologies like Angular and React JS are nice to have

● Excellent Engineering skills in designing and implementing scalable solutions

● Good knowledge of CI/CD Pipeline with strong focus on TDD

● Strong communication skills and ownership

● Exposure to Cloud, Kubernetes, Docker, Microservices is highly desired.

● Experience in working on public cloud environments like AWS, Azure, GCP w.r.t. solutions development, deployment & adoption of cloud-based technology components like IaaS / PaaS offerings

● Proficiency in PL/SQL and Database development.

● Strong in J2EE & OOP design patterns.

Read more
NovacisDigital
Chennai
3 - 8 yrs
₹5L - ₹16L / yr
Relational Database (RDBMS)
Microsoft SQL Server
SQL
dynamic SQL
Stored Procedures
+2 more

Senior Software Engineer – SQL Server / T-SQL

Chennai | IIT Madras Research Park | Full-Time

 

About Novacis Digital

Novacis Digital is a product-first technology company building AI-driven platforms and large-scale data systems. Our products process complex, high-volume data to power real-time analytics and GenAI-driven experiences.

We don’t see SQL as “just a database layer” - we treat it as a core compute engine. If you love writing efficient SQL and solving performance problems, this is the role for you.

 

What You Will Do

·      Design and build complex T-SQL stored procedures involving Dynamic SQL, along with views, functions, and triggers

·      Implement flexible, metadata-driven query frameworks using sp_executesql and parameterized Dynamic SQL

·      Engineer high-performance, set-based queries using CTEs, window functions, temp tables and table variables

·      Optimize queries using execution plans, statistics and DMVs

·      Refactor inefficient queries and redesign schemas for performance and scalability

·      Solve real-world challenges related to locks, blocking, deadlocks and transaction isolation

·      Collaborate with application engineers to build reliable, high-performance data access layers
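A rough sketch of the parameterized dynamic-SQL principle described above (sp_executesql on SQL Server): bind values as parameters, whitelist identifiers. The sketch uses Python's stdlib sqlite3 purely for illustration; the table and column names are hypothetical, and a real implementation would live in a T-SQL stored procedure.

```python
import sqlite3

# Parameterized dynamic-SQL sketch (sp_executesql analogue).
# Identifiers cannot be bound as parameters, so they are whitelisted;
# values are always passed as bind parameters, never concatenated.
ALLOWED_SORT_COLUMNS = {"created_at", "amount"}   # hypothetical schema

def fetch_orders(conn, status, sort_by):
    if sort_by not in ALLOWED_SORT_COLUMNS:
        raise ValueError(f"unsupported sort column: {sort_by}")
    sql = f"SELECT id, amount FROM orders WHERE status = ? ORDER BY {sort_by}"
    return conn.execute(sql, (status,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, amount REAL, created_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)",
                 [(1, "open", 9.0, "2024-01-02"),
                  (2, "open", 5.0, "2024-01-01"),
                  (3, "closed", 7.0, "2024-01-03")])
print(fetch_orders(conn, "open", "amount"))   # [(2, 5.0), (1, 9.0)]
```

The whitelist-plus-bind split is the same discipline sp_executesql enforces: the dynamic part of the statement is restricted to known identifiers, so user input never reaches the SQL text.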

 

What We’re Looking For

We’re looking for true SQL engineers — people who think in execution flow, logic and data behavior rather than just syntax.

 

You should have:

·      4+ years of deep hands-on experience with Microsoft SQL Server & T-SQL

·      Strong expertise in:

o  Stored Procedures (with Dynamic SQL)

o  Views

o  Functions

o  Triggers

·      Strong experience with:

o  Dynamic SQL best practices and secure execution patterns

o  Indexing strategies and query plan optimization

o  Handling parameter sniffing and plan instability

·      Strong knowledge of:

o  Temp tables vs table variables

o  Cardinality estimation

o  Cost-based optimization concepts


Nice to Have

·      Exposure to GenAI data pipelines or analytical architectures

·      Exposure to Graph, Vector, and NoSQL databases

 

How We Work

·      We write production-grade T-SQL

·      We value performance, clarity, and correctness

·      We invest heavily in query readability and maintainability

·      Engineering quality is non-negotiable

 

Apply Now

If you enjoy designing complex Dynamic SQL-powered stored procedures and tuning systems at scale, we’d like to talk.

Read more
Remote only
3 - 6 yrs
₹10L - ₹28L / yr
Python
Selenium
AWS
TestNG
SQL
+2 more

Location: PAN India

💼 Employment Type: Full-Time / Contract

👨‍💻 Experience: 3–6 Years


🔍 Job Overview


We are looking for a talented Automation Test Engineer with strong expertise in Python-based automation, Selenium, and API testing. The ideal candidate will be responsible for building scalable automation frameworks and ensuring high-quality delivery across applications and cloud environments.


🔑 Key Responsibilities

Develop and maintain automation scripts using Python, Selenium, TestNG / Pytest

Perform API testing for RESTful services

Work with AWS services like S3 & API Gateway (basic level)

Conduct database validations using SQL & NoSQL

Integrate automation with CI/CD pipelines (Jenkins, Docker)

Write and maintain test cases, reports, and documentation

Collaborate with cross-functional teams in Agile environments

Debug and resolve automation issues and defects
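A REST API test of the kind listed above typically asserts on the status code and the payload contract. Below is a minimal Pytest-style sketch with the HTTP call stubbed so it runs offline; the endpoint behavior and payload shape are hypothetical, and a real suite would call the service with requests or httpx.

```python
import json

# Stub of a REST client response; a real test would hit a live endpoint.
# The payload shape here is a hypothetical illustration.
def get_user(user_id):
    body = json.dumps({"id": user_id, "name": "asha", "active": True})
    return {"status_code": 200, "body": body}

def test_get_user_contract():
    resp = get_user(42)
    assert resp["status_code"] == 200
    payload = json.loads(resp["body"])
    assert payload["id"] == 42                        # echoes requested id
    assert set(payload) == {"id", "name", "active"}   # schema contract

test_get_user_contract()   # pytest would collect and run this automatically
```

Keeping the assertions on the contract (status, keys, echoed inputs) rather than exact values makes the same test reusable across environments.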

🛠 Required Skills

Strong experience in Selenium, TestNG / Pytest (Intermediate–Expert)

Proficiency in Python scripting

Experience in RESTful API testing

Knowledge of SQL & NoSQL databases

Hands-on experience with Git (Basic–Intermediate)

Experience with CI/CD tools (Jenkins, Docker)

Basic understanding of AWS (S3, API Gateway)

Scripting knowledge in Shell / Groovy

⭐ Good to Have

Experience in automation framework design

Exposure to cloud-based testing environments



Read more
Pune
3 - 10 yrs
₹1L - ₹10L / yr
skill iconJava
J2EE
API
Java Developer
agile
+15 more

We have an immediate requirement for a Java Developer role in the Pune location. Please find the details below:

Role: Java Developer

Experience: 3–4 Years (Mandatory)

Location: Pune

Joining: Immediate joiners only


Key Responsibilities:

  • Develop and maintain scalable and robust J2EE applications
  • Follow and implement coding standards within the project
  • Integrate with third-party APIs and services
  • Work in an Agile environment to design and implement new features
  • Support team members in resolving technical issues
  • Debug and resolve production issues (code/infrastructure)
  • Communicate effectively with team members and product management

Mandatory Skills:

  • Strong knowledge of Java and JEE internals (Class Loading, Memory Management, Transaction Management, etc.)
  • Expertise in OOPs/OOAD concepts and design patterns
  • Hands-on experience with Spring Framework and Web Services
  • Basic knowledge of JavaScript, jQuery, AJAX, and DOM
  • Good understanding of SQL, relational databases, and ORM (Hibernate/DAO)
  • Strong problem-solving skills and communication abilities

Important Note:

  • Interview is scheduled for Monday
  • Selected candidates are expected to join by Tuesday or Wednesday
Read more
Searce Inc

at Searce Inc

3 recruiters
Srishti Dani
Posted by Srishti Dani
Mumbai, Pune, Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Data migration
Datawarehousing
ETL
SQL
Google Cloud Platform (GCP)
+7 more

Lead Data Engineer


What are we looking for

real solver?

Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.


Your Responsibilities

What you will wake up to solve.

  • Lead Technical Design & Data Architecture: Architect and lead the end-to-end development of scalable, cloud-native data platforms. You’ll guide the squad on critical architectural decisions—choosing between Batch vs. Streaming or ETL vs. ELT—while remaining 100% hands-on, contributing high-quality, production-grade code.
  • Build High-Velocity Data Pipelines: Drive the implementation of robust data transports and ingestion frameworks using Python, SQL, and Spark. You will build integration layers that connect heterogeneous sources (SaaS, RDBMS, NoSQL) into unified, high-availability environments like BigQuery, Snowflake, or Redshift.
  • Mentor & Elevate the Squad: Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling (Star/Snowflake schema), and ensure the squad adopts modern engineering standards like CI/CD for data.
  • Drive AI-Ready Data Strategy: Be the expert in designing data foundations optimized for AI and Machine Learning. You will champion the use of GCP (Dataflow, Pub/Sub, BigQuery) and AWS (Lambda, Glue, EMR) to create "clean room" environments that fuel advanced analytics and generative AI models.
  • Partner with Clients as a Technical DRI: Act as the Directly Responsible Individual for client success. Translate ambiguous business questions into elegant data services, manage project deliverables using Agile methodologies, and ensure that the data provided is accurate, consistent, and mission-critical.
  • Troubleshoot & Optimize for Scale: Own the reliability of the reporting layer. You will proactively monitor pipelines, troubleshoot complex transformation bottlenecks, and propose ways to improve platform performance and cost-efficiency.
  • Innovate and Build Reusable IP: Spearhead the creation of reusable data frameworks, custom operators, and transformation libraries that accelerate future projects and establish Searce’s unique technical advantage in the market.
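As a rough sketch of the streaming side of the Batch vs. Streaming decision above, here is a tumbling-window aggregation in plain Python. Event fields are hypothetical; a production pipeline would express this in Dataflow or Spark Structured Streaming rather than by hand.

```python
from collections import defaultdict

# Tumbling-window count: each event lands in exactly one fixed-size
# window, keyed by the window's start timestamp (seconds).
def tumbling_window_counts(events, window_seconds=60):
    windows = defaultdict(int)
    for e in events:
        window_start = e["ts"] - (e["ts"] % window_seconds)
        windows[window_start] += 1
    return dict(windows)

events = [{"ts": 5}, {"ts": 59}, {"ts": 61}, {"ts": 130}]
print(tumbling_window_counts(events))   # {0: 2, 60: 1, 120: 1}
```

The batch equivalent would compute the same aggregate over a bounded table; the streaming version must also decide watermarking and late-arrival policy, which is where the architectural trade-off actually lives.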


Welcome to Searce


The AI-Native tech consultancy that's rewriting the rules.

Searce is an AI-native, engineering-led, modern tech consultancy that empowers clients to futurify their business by delivering intelligent, impactful, real business outcomes. Searce solvers co-innovate with clients as their trusted transformational partners ensuring sustained competitive advantage. Searce clients realize smarter, faster, better business outcomes delivered by AI-native Searce solver squads. 


Functional Skills 

the solver personas.

  • The Data Architect: This persona deconstructs ambiguous business goals into scalable, elegant data blueprints. They don't just move data; they design the foundation—from schema design to partitioning strategies—that allows data scientists and analysts to thrive, foreseeing technical bottlenecks and making pragmatic trade-offs.
  • The Player-Coach: As a hands-on leader, this persona leads from the front by writing exemplary, production-grade SQL and Python while simultaneously mentoring and elevating the skills of the squad. Their success is measured by the team's ability to deliver high-quality, maintainable code and their growth as engineers.
  • The Pragmatic Innovator: This individual balances a passion for modern data tech (like Generative AI and Real-time Streaming) with a sharp focus on business outcomes. They champion new tools where they add real value but are disciplined enough to choose stable, cost-effective solutions to meet deadlines and deliver robust products.
  • The Client-Facing Technologist: This persona acts as the crucial technical bridge between the data squad and the client. They build trust by listening actively, explaining complex data concepts (like data latency or idempotency) in simple terms, and demonstrating how engineering decisions align with the client’s strategic goals.
  • The Quality Craftsman: This individual possesses an unwavering commitment to data integrity and treats data engineering as a craft. They are the guardian of the reporting layer, advocating for robust testing, data validation frameworks, and clean, modular code to ensure the long-term reliability of the data platform.


Experience & Relevance 

  • Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.
  • Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.
  • AI-Native Workflow: You don’t just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.
  • Architectural Portfolio: Evidence of leading 2-3 large-scale transformations—including platform migrations, data lakehouse builds, or real-time analytics architectures.
  • Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO’s business vision into a Lead Engineer’s technical specification without losing anything in translation.


Join the ‘real solvers’

ready to futurify?

If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’. Don’t Just Send a Resume. Send a Statement.


Read more