
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

learners point.org

Posted by Partha Sarathy
Bengaluru (Bangalore)
1 - 8 yrs
₹4L - ₹9.6L / yr
Power BI Desktop
DAX
Time Intelligence
Data Modelling
Power Query

Power BI Analyst – EdTech (UAE Market)

📍 Location: Bangalore (Onsite)

🕔 Working Days: 5 Days

🏢 Industry: EdTech – Professional Training & Certification Programs

🌍 Market Focus: UAE


About Us – Learners Point


Learners Point Academy is a leading professional training institute in the UAE, empowering working professionals and organizations through globally recognised certification programs such as CMA, PMP, ACCA, CIA, and other corporate training solutions.


With a strong presence in the UAE market, we specialise in career-focused education, enterprise workforce development, and high-impact learning solutions designed to drive measurable professional growth.

As we expand our analytics capabilities, we are looking for a skilled Power BI Analyst to support business intelligence and data-driven decision-making across our Professional Training Programs.


Role Overview


The Power BI Analyst will be responsible for transforming business, learner, and sales data into actionable dashboards and reports that enhance performance tracking, learner engagement, and revenue optimisation.


Key Responsibilities


  • Design, develop, and maintain interactive dashboards using Microsoft Power BI
  • Develop advanced reports using DAX, data modelling, and Power Query
  • Analyze training program performance (enrollments, retention, completion rates, revenue)
  • Build KPI dashboards for:
    • Sales & Reactivation Team
    • Academic & Training Team
    • Leadership & Management
  • Extract and manage data using SQL from databases and CRM systems
  • Automate reporting processes and ensure data accuracy
  • Translate business requirements into technical BI solutions
  • Present insights through clear and compelling data storytelling


Required Technical Skills


  • Strong experience in Power BI (Desktop & Service)
  • Proficiency in:
    • DAX (Measures, Time Intelligence)
    • Data Modeling (Star & Snowflake Schema)
    • Power Query (ETL)
  • Good knowledge of SQL
  • Advanced Excel (Pivot Tables, Power Pivot, Lookup Functions)
  • Experience integrating data from CRM, LMS, or ERP systems
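For a sense of what the time-intelligence requirement involves, here is a rough stdlib-Python sketch of the idea behind a DAX year-to-date measure such as TOTALYTD (the fact table and figures are invented for illustration):

```python
from datetime import date

# Toy fact table of (date, revenue) rows. A DAX measure like
# TOTALYTD(SUM(Sales[Revenue]), 'Date'[Date]) returns, for a given
# date, the cumulative total since the start of that date's year.
facts = [
    (date(2024, 1, 15), 100), (date(2024, 2, 10), 250),
    (date(2024, 3, 5), 150),  (date(2025, 1, 20), 300),
]

def ytd(asof):
    """Year-to-date total as of `asof` (inclusive)."""
    return sum(v for d, v in facts if d.year == asof.year and d <= asof)

print(ytd(date(2024, 2, 28)))  # 100 + 250
```

In Power BI the filter context does the date slicing automatically; the sketch only shows the arithmetic the measure performs.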


Industry-Specific Requirements (EdTech Focus)


  • Understanding of:
    • Learner engagement metrics
    • Course completion & drop-off analysis
    • Revenue per program
    • Student retention analytics
  • Experience working with Professional Certification Programs is an added advantage
  • Familiarity with UAE market reporting standards preferred


Preferred Skills


  • Exposure to Azure Data Services
  • Dashboard design best practices
  • Ability to manage large datasets
  • Strong analytical mindset with business understanding


Soft Skills


  • Strong communication & stakeholder management skills
  • Business-oriented thinking
  • Problem-solving mindset
  • Attention to detail


Experience & Qualification


  • Bachelor’s Degree in Computer Science, Data Analytics, Statistics, or related field
  • 2–8 years of experience as a Power BI / Data Analyst (EdTech preferred)
  • UAE or GCC market exposure is a plus


WINIT
Hyderabad
2 - 5 yrs
₹4L - ₹10L / yr
.NET
React.js
Angular
SQL
SAP

About the Role:

We are seeking a highly skilled Integration Specialist / Full Stack Developer with strong experience in .NET Core, API integrations, and modern front-end development. The ideal candidate will build and integrate scalable web and mobile applications, manage end-to-end delivery, and ensure smooth data exchange across platforms.


Key Responsibilities:

  • Design, develop, and maintain backend APIs using .NET Core / C#.
  • Build and integrate REST and SOAP-based services (JSON, XML, OAuth2, JWT, API Key).
  • Implement file-based integrations (Flat file, CSV, Excel, XML, JSON) and manage FTP/SFTP transfers.
  • Work with databases such as MSSQL, PostgreSQL, Oracle, and SQLite — including writing queries, stored procedures, and using ADO.NET.
  • Handle data serialization/deserialization using Newtonsoft.Json or System.Text.Json.
  • Implement robust error handling and logging with Serilog, NLog, or log4net.
  • Automate and schedule processes using Quartz.NET, Hangfire, or Windows Task Scheduler.
  • Manage version control and CI/CD pipelines via Git and Azure DevOps.
  • Develop front-end interfaces with React and React Native ensuring responsive, modular UI.
  • Implement offline-first functionality for mobile apps (sync logic, caching, etc.).
  • Collaborate with cross-functional teams or independently handle full project ownership.
  • Utilize AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) to enhance productivity.
  • Apply integration best practices including middleware, API gateways, and optionally message queues (MSMQ, RabbitMQ).
  • Ensure scalability, security, and performance in all deliverables.
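To make the serialization and flat-file responsibilities concrete, a minimal Python sketch of the same patterns (the role itself would use Newtonsoft.Json or System.Text.Json on .NET; names and payloads here are illustrative):

```python
import csv
import io
import json

# Serialize/deserialize a payload, as an API integration would.
order = {"id": 42, "customer": "Acme", "items": ["widget", "gear"]}
payload = json.dumps(order)       # object -> JSON string
restored = json.loads(payload)    # JSON string -> object
assert restored == order

# Parse a flat-file (CSV) extract into records, the usual first step
# of a file-based integration before validation and loading.
flat_file = "id,customer\n1,Acme\n2,Globex\n"
rows = list(csv.DictReader(io.StringIO(flat_file)))
print(rows[1]["customer"])
```

Note that CSV fields arrive as strings, so a real integration adds a typed-validation step before loading into MSSQL, PostgreSQL, or Oracle.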


Key Skills & Technologies:

  • Backend: .NET Core, C#, REST/SOAP APIs, WCF, ADO.NET
  • Frontend: React, React Native
  • Databases: MSSQL, PostgreSQL, Oracle, SQLite
  • Tools: Git, Azure DevOps, Hangfire, Quartz.NET, Serilog/NLog
  • Integration: JSON, XML, CSV, FTP/SFTP, OAuth2, JWT
  • DevOps: CI/CD automation, deployment pipelines
  • Optional: Middleware, API Gateway, Message Queues (MSMQ, RabbitMQ)


Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Minimum 5 years of hands-on experience in software development and integration.
  • Proven expertise in designing and implementing scalable applications.
  • Strong analytical and problem-solving skills with a proactive approach.


Nice to Have:

  • Experience with cloud services (Azure, AWS, GCP).
  • Knowledge of containerization tools like Docker or Kubernetes.
  • Familiarity with mobile deployment workflows and app store publishing.
Kanerika Software

Posted by Ariba Khan
Hyderabad, Indore, Ahmedabad
7 - 10 yrs
Up to ₹35L / yr (varies)
Snowflake schema
Python
SQL
Databricks
PySpark

About Kanerika:

Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.


We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.


Awards and Recognitions:

Kanerika has won several awards over the years, including:

1. Best Place to Work 2023 by Great Place to Work®

2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today

3. NASSCOM Emerge 50 Award in 2014

4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture

5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.


Working for us:

Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.


Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.


Role Responsibilities: 

Your responsibilities will include, but are not limited to, the following:

  • Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
  • Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
  • Enable business analytics and self-service reporting through Power BI and other visualization tools.
  • Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
  • Implement and enforce best practices for data governance, data quality, and security.
  • Mentor and guide junior data engineers; establish coding and design standards.
  • Evaluate emerging technologies and tools to continuously improve the data ecosystem.


Required Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • 7–10 years of experience in data engineering or data platform development.
  • Strong hands-on experience in SQL, Snowflake, Python, and Airflow.
  • Solid understanding of data modeling, data governance, security, and CI/CD practices.

Preferred Qualifications:

  • Experience in leading a team
  • Familiarity with data modeling techniques and practices for Power BI.
  • Knowledge of Azure Databricks or other data processing frameworks.
  • Knowledge of Microsoft Fabric or other Cloud Platforms.


What we need?

· B.Tech in Computer Science or equivalent.


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience on real-world data and AI projects.
  • Opportunity to learn from experienced professionals and enhance your skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive compensation and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Employee Benefits:

1. Culture:

  • Open Door Policy: Encourages open communication and accessibility to management.
  • Open Office Floor Plan: Fosters a collaborative and interactive work environment.
  • Flexible Working Hours: Allows employees to have flexibility in their work schedules.
  • Employee Referral Bonus: Rewards employees for referring qualified candidates.
  • Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.


2. Inclusivity and Diversity:

  • Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
  • Mandatory POSH training: Promotes a safe and respectful work environment.


3. Health Insurance and Wellness Benefits:

  • GMC and Term Insurance: Offers medical coverage and financial protection.
  • Health Insurance: Provides coverage for medical expenses.
  • Disability Insurance: Offers financial support in case of disability.


4. Child Care & Parental Leave Benefits:

  • Company-sponsored family events: Creates opportunities for employees and their families to bond.
  • Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
  • Family Medical Leave: Offers leave for employees to take care of family members' medical needs.


5. Perks and Time-Off Benefits:

  • Company-sponsored outings: Organizes recreational activities for employees.
  • Gratuity: Provides a monetary benefit as a token of appreciation.
  • Provident Fund: Helps employees save for retirement.
  • Generous PTO: Offers more than the industry standard for paid time off.
  • Paid sick days: Allows employees to take paid time off when they are unwell.
  • Paid holidays: Gives employees paid time off for designated holidays.
  • Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.


6. Professional Development Benefits:

  • L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
  • Mentorship Program: Offers guidance and support from experienced professionals.
  • Job Training: Provides training to enhance job-related skills.
  • Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
  • Promote from Within: Encourages internal growth and advancement opportunities.
Wissen Technology

Posted by Janane Mohanasankaran
Pune, Mumbai, Bengaluru (Bangalore)
3 - 12 yrs
Best in industry
Python
pandas
Object Oriented Programming (OOPs)
SQL

JOB DESCRIPTION:


Location: Pune, Mumbai, Bangalore

Mode of Work: 3 days from office


* Python: Strong expertise in data workflows and automation

* Pandas: For detailed data analysis and validation

* SQL: Querying and performing operations on Delta tables

* AWS Cloud: Compute and storage services

* OOP concepts

Bengaluru (Bangalore)
5 - 8 yrs
₹30L - ₹40L / yr
Backend development
Python
Java
SQL

Strong Senior Backend Engineer profiles

Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems

Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks such as FastAPI or Django (Python), Spring (Java), or Express (Node.js).

Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework

Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization

Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices

Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)

Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D

Mandatory (Education) – Candidates from Tier-1 engineering institutes (IITs, BITS) are highly preferred

Remote only
0 - 1 yrs
₹1L - ₹1.5L / yr
PHP
SQL
Databases
Amazon Web Services (AWS)
Relational Database (RDBMS)

Qualification- BTech-CS (2025 graduate only)

Joining: Immediate Joiner

Job Type: Trainee

Work Mode: Remote

Working Days: Monday to Friday

Shift (Rotational – based on project need):

·      5:00 PM – 2:00 AM IST

·      6:00 PM – 3:00 AM IST

 

Job Summary

ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience building scalable web applications while maintaining a healthy work-life balance through remote work opportunities.

 

Eligibility & Qualifications

  • Education:
    • B.Tech (Computer Science) / M.Tech (Computer Science)
    • Tier 1 colleges preferred
  • Experience Level: Fresher
  • Communication: Excellent English communication skills (verbal & written)

Skills Required

Technical & Development Skills:

·       Basic understanding of AI / Machine Learning concepts

·       Exposure to AWS (deployment or cloud fundamentals)

·       PHP development

·       WordPress development and customization

·       JavaScript (ES5 / ES6+)

·       jQuery

·       AJAX calls and asynchronous handling

·       Event handling

·       HTML5 & CSS3

·       Client-side form validation

 

Work Environment & Tools

  • Comfortable working in a remote setup
  • Familiarity with collaboration and remote access tools

 

Additional Requirements (Work-from-Home Setup)

This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:

  • System: Laptop or Desktop (Windows-based)
  • Operating System: Windows
  • Screen Size: Minimum 14 inches
  • Screen Resolution: Full HD (1920 × 1080)
  • Processor: Intel i5 or higher
  • RAM: Minimum 8 GB (Mandatory)
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher

 

About ARDEM

 

ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.

 



Hyderabad
4 - 8 yrs
₹20L - ₹30L / yr
Generative AI
Artificial Intelligence (AI)
Machine Learning (ML)
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)

We are seeking a talented AI/ML Engineer with strong hands-on experience in Generative AI and Large Language Models (LLMs) to join our Business Intelligence team. The role involves designing, developing, and deploying advanced AI/ML and GenAI-driven solutions to unlock business insights and enhance data-driven decision-making.


Key Responsibilities:

• Collaborate with business analysts and stakeholders to identify AI/ML and Generative AI use cases.

• Design and implement ML models for predictive analytics, segmentation, anomaly detection, and forecasting.

• Develop and deploy Generative AI solutions using LLMs (GPT, LLaMA, Mistral, etc.).

• Build and maintain Retrieval-Augmented Generation (RAG) pipelines and semantic search systems.

• Work with vector databases (FAISS, Pinecone, ChromaDB) for embedding storage and retrieval.

• Develop end-to-end AI/ML pipelines from data preprocessing to deployment.

• Integrate AI/ML and GenAI solutions into BI dashboards and reporting tools.

• Optimize models for performance, scalability, and reliability.

• Maintain documentation and promote knowledge sharing within the team.
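The retrieval step of the RAG pipelines mentioned above can be sketched in a few lines. This toy version ranks stored chunks by cosine similarity; the vectors are made up, and a real pipeline would use model embeddings and a vector store such as FAISS, Pinecone, or ChromaDB:

```python
import math

# Toy retrieval step of a RAG pipeline: rank stored chunks by cosine
# similarity to a query embedding, then hand the best chunk to the LLM
# prompt. Embeddings here are invented 3-d vectors for illustration.
chunks = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
}
query = [0.85, 0.15, 0.05]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

best = max(chunks, key=lambda c: cosine(query, chunks[c]))
print(best)  # chunk to inject into the prompt context
```

Production systems add chunking, embedding models, approximate-nearest-neighbour indexes, and prompt assembly on top of this core ranking idea.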


Mandatory Requirements:

• 4+ years of relevant experience as an AI/ML Engineer.

• Hands-on experience in Generative AI and Large Language Models (LLMs) – Mandatory.

• Experience implementing RAG pipelines and prompt engineering techniques.

• Strong programming skills in Python.

• Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn).

• Experience with vector databases (FAISS, Pinecone, ChromaDB).

• Strong understanding of SQL and database systems.

• Experience integrating AI solutions into BI tools (Power BI, Tableau).

• Strong analytical, problem-solving, and communication skills.

Good to Have:

• Experience with cloud platforms (AWS, Azure, GCP).

• Experience with Docker or Kubernetes.

• Exposure to NLP, computer vision, or deep learning use cases.

• Experience in MLOps and CI/CD pipelines

Mirorin

Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 10 yrs
₹6L - ₹15L / yr
Data Analytics
Data Science
SQL
Tableau
OpenAI

Job Description: Data Analyst

 

About Miror

Miror is India’s leading FemTech platform transforming how women experience peri-menopause and menopause. In just a year, we’ve built India’s largest menopause-focused WhatsApp community, partnered with the National Health Mission and the Indian Menopause Society, and launched category-defining nutraceutical products and digital health services. Our app blends science and technology—offering personalized care pathways, symptom tracking, diagnostic links, games, AI-powered chat, expert consultations, and more. We're proud recipients of the Innovation in Menopause Care award at the Global Women’s Health Innovation Conference 2024 and are rapidly scaling toward our $1B+ vision. Learn more: miror.in

 

Role Overview

We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.

You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.

 

Key Responsibilities

·       Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).

·       Build dashboards and reports to track KPIs, user behaviour, and marketing performance.

·       Collaborate with product, marketing, and customer teams to uncover actionable insights.

·       Support experiments, A/B testing, and cohort analysis to drive growth and retention.

·       Assist in documentation and communication of findings to technical and non-technical teams.

·       Work with the data team to enhance personalization and AI features (optional).
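As a sketch of the A/B-testing support mentioned above, a two-proportion z-test using only the standard library (the conversion counts are invented for illustration):

```python
from statistics import NormalDist

# Illustrative A/B experiment: did variant B's conversion rate beat A's?
conv_a, n_a = 120, 2400   # control: conversions, users
conv_b, n_b = 156, 2400   # variant

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)             # pooled rate under H0
se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))         # two-sided test

print(f"z = {z:.2f}, p = {p_value:.4f}")
```

In practice you would also check sample-size requirements and segment the result by cohort before acting on it.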

 

Required Qualifications

·       Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.

·       2 – 4 years of experience in data analysis or business intelligence.

·       Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).

·       Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.)

·       Ability to translate complex data into simple visual stories and clear recommendations.

·       Strong attention to detail and a mindset for experimentation.

 

Preferred (Not Mandatory)

·       Exposure to GenAI, LLMs (e.g., OpenAI, HuggingFace), or NLP concepts.

·       Experience working with healthcare, wellness, or e-commerce datasets.

·       Familiarity with REST APIs, JSON structures, or chatbot systems.

·       Interest in building tools that impact women’s health and wellness.

 

Why Join Us?

·       Be part of a high-growth startup tackling a real need in women’s healthcare.

·       Work with a passionate, purpose-driven team.

·       Opportunity to grow into GenAI/ML-focused roles as we scale.

·       Competitive salary and career progression

 

 

Best Regards,

Indrani Dutta

MIROR THERAPEUTICS PRIVATE LIMITED

Connect with me here or on my LinkedIn page.

Optimo Capital

Posted by Shantanu Palwe
Bengaluru (Bangalore)
0 - 3 yrs
₹20000 - ₹30000 / mo
MS-Excel
SQL
Python
pandas
Data Analytics

Job Description: Data Analyst Intern


Location: On-site, Bangalore

Duration: 6 months (Full-time)


About us:


  • Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).
  • Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making). As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.


What we offer:


  • Join our dynamic startup team and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit features analytics, collections, and portfolio management.
  • The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.
  • This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment.
  • We believe that the freedom and accountability to make decisions in analytics and technology brings out the best in you and helps us build the best for the company.
  • This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.


What we look for:


  • We are looking for individuals with a strong analytical mindset, high levels of initiative / ownership, ability to drive tasks independently, clear communication and comfort working across teams.
  • We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team.
  • We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems.


Key Responsibilities:

  • Conduct analytical deep-dives such as funnel analysis, cohort tracking, branch-wise performance reviews, TAT analysis, portfolio diagnostics, and credit risk analytics that lead to clear actions.
  • Work closely with stakeholders to convert business questions into measurable analyses and clearly communicated outputs.
  • Support digital underwriting initiatives, including assisting in the development and analysis of underwriting APIs that enable decisioning on borrower eligibility (“whom to lend”) and exposure sizing (“how much to lend”).
  • Develop and maintain periodic MIS and KPI reporting for key business functions (e.g., pipeline, disbursals, TAT, conversion, collections performance, portfolio trends).
  • Use Python (pandas, numpy) to clean, transform, and analyse datasets; automate recurring reports and data workflows.
  • Perform basic scripting to support data validation, extraction, and lightweight automation.


Required Skills and Qualifications:

  • Strong proficiency in Excel, including pivots, lookup functions, data cleaning, and structured analysis.
  • Strong working knowledge of SQL, including joins, aggregations, CTEs, and window functions.
  • Proficiency in Python for data analysis (pandas, numpy); ability to write clean, maintainable scripts/notebooks.
  • Strong logical reasoning and attention to detail, including the ability to identify errors and validate results rigorously.
  • Ability to work with ambiguous requirements and imperfect datasets while maintaining output quality.
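As an illustration of the SQL expectations above (joins, aggregations, CTEs, window functions), a self-contained sqlite3 example over a hypothetical loans table (branch names and amounts are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE loans (id INTEGER, branch TEXT, amount INTEGER);
    INSERT INTO loans VALUES
        (1, 'Indiranagar', 500000), (2, 'Indiranagar', 300000),
        (3, 'Whitefield', 200000),  (4, 'Whitefield', 700000);
""")

# CTE (branch totals), join back to the fact rows, and a window
# function ranking loans by size within each branch.
rows = con.execute("""
    WITH branch_totals AS (
        SELECT branch, SUM(amount) AS total FROM loans GROUP BY branch
    )
    SELECT l.id, l.branch,
           ROUND(1.0 * l.amount / t.total, 2) AS share,
           RANK() OVER (PARTITION BY l.branch ORDER BY l.amount DESC) AS rnk
    FROM loans l JOIN branch_totals t ON l.branch = t.branch
    ORDER BY l.id
""").fetchall()
print(rows)
```

Window functions require SQLite 3.25 or later; the same query shape carries over to PostgreSQL or any warehouse SQL dialect.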



Preferred (Good to Have):

  • REST APIs: A fundamental understanding of APIs and previous experience or projects related to API development/integrations.
  • Familiarity with basic AWS tools/services: S3, Lambda, EC2, Glue jobs.
  • Experience with Git and basic engineering practices.
  • Any experience with the lending/finance industry.
Quantiphi

Posted by Nikita Sinha
Bengaluru (Bangalore)
7 - 10 yrs
Up to ₹40L / yr (varies)
Amazon Web Services (AWS)
PySpark
SQL

We are hiring an Associate Technical Architect with strong expertise in AWS-based Data Platforms to design scalable data lakes, warehouses, and enterprise data pipelines while working with global teams.


Key Responsibilities

  • Design and implement scalable data warehouse, data lake, and lakehouse architectures on AWS
  • Build resilient and modular data pipelines using native AWS services
  • Architect cloud-based data platforms and evaluate service trade-offs
  • Optimize large-scale data processing and query performance
  • Collaborate with global cross-functional teams (Engineering, QA, PMs, Stakeholders)
  • Communicate technical roadmap, risks, and mitigation strategies

Must-Have Skills

  • 8+ years of experience in AWS Data Engineering / Data Architecture
  • Hands-on experience with AWS services:
    • Amazon S3
    • AWS Glue
    • AWS Lambda
    • Amazon EMR
    • AWS Kinesis (Streams & Firehose)
    • AWS Step Functions / MWAA
    • Amazon Redshift (Spectrum & Serverless)
    • Amazon Athena
    • Amazon RDS
    • AWS Lake Formation
    • AWS DMS, EventBridge, SNS, SQS
  • Strong programming skills in Python & PySpark
  • Advanced SQL with query optimization & performance tuning
  • Deep understanding of:
    • MPP databases
    • Partitioning & indexing strategies
    • Data modeling (Dimensional, Normalized, Lakehouse)
  • Experience building resilient ETL/data pipelines
  • Knowledge of AWS fundamentals:
    • Security
    • Networking
    • Disaster Recovery
    • Scalability & resilience
  • Experience with on-prem → AWS migrations
  • AWS Certification (Solution Architect Associate / Data Engineer Associate)

Good-to-Have Skills

  • Domain experience: FSI / Retail / CPG
  • Data governance & virtualization tools:
    • Collibra
    • Denodo
    • QuickSight / Power BI / Tableau
  • Exposure to:
    • Terraform (IaC)
    • CI/CD pipelines
    • SSIS
    • Apache NiFi, Hive, HDFS, Sqoop
    • Data Mesh architecture
  • Experience with NoSQL databases:
    • DynamoDB
    • MongoDB
    • DocumentDB

Soft Skills

  • Strong problem-solving and analytical mindset
  • Excellent communication and stakeholder management skills
  • Ability to translate technical concepts into business outcomes
  • Experience working with distributed/global teams
Quantiphi

Posted by Nikita Sinha
Bengaluru (Bangalore)
3 - 6 yrs
Up to ₹28L / yr (varies)
Amazon Web Services (AWS)
Data engineering
PySpark
SQL
Data migration

As a Senior Data Engineer, you will be responsible for building and delivering a Lakehouse-based data pipeline. This is a hands-on role focused on implementing real-time and batch data ingestion, processing, and delivery workflows, while ensuring strong monitoring, observability, and data quality across the entire pipeline.

Must-Have Skills

  • 3+ years of hands-on experience building large-scale data pipelines
  • Strong experience with Spark Streaming, AWS Glue, and EMR for real-time and batch processing
  • Proficiency in PySpark/Python, including building Kafka producers for data ingestion
  • Experience working with Confluent Kafka and Spark Streaming for ingestion from on-premise sources
  • Solid understanding of AWS services including:
  1. S3
  2. Redshift
  3. Glue
  4. CloudWatch
  5. Secrets Manager
  • Experience working with Medallion Architecture and hybrid data destinations (e.g., Redshift + on-prem Oracle)
  • Ability to implement monitoring dashboards and observability using tools like CloudWatch or Datadog
  • Strong SQL skills for data validation and job-level metrics development
  • Experience building alerting mechanisms for pipeline failures and performance issues
  • Strong collaboration and communication skills
  • Proven ownership mindset — driving deliverables from design to deployment
  • Experience mentoring junior engineers, conducting code reviews, and guiding best practices
  • AWS Certified Data Engineer – Associate (preferred/required)
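The data-validation and job-level-metrics requirement above might look like the following minimal sketch. The orders table, its columns, and the 10% threshold are all invented for illustration; SQLite stands in for Redshift or on-prem Oracle.

```python
import sqlite3

# Hypothetical validation step for a pipeline job. The orders table, its
# columns, and the 10% threshold are invented; SQLite stands in for the
# actual warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, 100), (2, 25.5, 101), (3, None, 102), (4, 40.0, None)],
)

# Job-level metrics in one pass: row count, nulls per critical column,
# and duplicate business keys.
row_count, null_amounts, null_customers, duplicate_keys = conn.execute("""
    SELECT COUNT(*),
           SUM(CASE WHEN amount      IS NULL THEN 1 ELSE 0 END),
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END),
           COUNT(*) - COUNT(DISTINCT order_id)
    FROM orders
""").fetchone()

# A simple alerting hook: flag the job when a threshold is breached.
alerts = []
if null_amounts / row_count > 0.10:
    alerts.append("amount null-rate above 10%")
print(row_count, null_amounts, null_customers, duplicate_keys, alerts)
```

In a real pipeline the alert list would feed CloudWatch or Datadog rather than a Python list, but the metric shapes are the same.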

Good-to-Have Skills

  • Experience with orchestration tools such as Apache Airflow or AWS Step Functions
  • Exposure to Big Data ecosystem tools:
  1. Sqoop
  2. HDFS
  3. Hive
  4. NiFi
  • Exposure to Terraform for infrastructure automation
  • Familiarity with CI/CD pipelines for data workflows
Foyforyou
Mumbai
1 - 3 yrs
₹2L - ₹15L / yr
SQL
MS-Excel
Microsoft Excel
Operations management
Data Analytics

Product Manager (Data & Operations)

Experience: 2+ years

Must-Have: Candidate must have prior experience in a product-based company

Role Summary

We are looking for a highly analytical Product Manager to drive business growth through data analysis, operational efficiency, and structured experimentation.

This role will focus on identifying growth opportunities, reducing operational inefficiencies, improving unit economics, and building strong reporting systems across the ecommerce and AI-led product ecosystem.

You will work closely with Engineering, Marketing, Catalog, Operations, Finance, and Data teams to ensure decisions are backed by data and execution is operationally strong.

Key Responsibilities

Data Analysis & Business Insights

  • Own end-to-end funnel analysis (Impressions → CTR → ATC → Checkout → Purchase → Repeat)
  • Identify drop-offs, leakages, and revenue gaps using SQL, GA, Clevertap
  • Perform cohort analysis (new vs repeat, prepaid vs COD, personalized vs non-personalized users)
  • Track and improve core metrics:
  1. Conversion Rate
  2. GMV & Revenue
  3. AOV
  4. Repeat Rate
  5. Cancellation & RTO %
  6. Margin contribution
  • Build and maintain weekly/monthly dashboards for leadership visibility
  • Translate raw data into clear, actionable insights
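A funnel drop-off query of the kind described above can be sketched as follows. Every table name, step name, and user in this example is invented; in practice the same query shape would run against GA/Clevertap event exports in a warehouse.

```python
import sqlite3

# Hypothetical funnel data: each row records one user reaching one step.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, step TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)", [
    (1, "impression"), (1, "atc"), (1, "checkout"), (1, "purchase"),
    (2, "impression"), (2, "atc"),
    (3, "impression"),
    (4, "impression"), (4, "atc"), (4, "checkout"),
])

# Distinct users per step; comparing adjacent steps exposes the drop-off.
funnel = dict(conn.execute(
    "SELECT step, COUNT(DISTINCT user_id) FROM events GROUP BY step"
))

order = ["impression", "atc", "checkout", "purchase"]
for prev, nxt in zip(order, order[1:]):
    rate = funnel.get(nxt, 0) / funnel[prev]
    print(f"{prev} -> {nxt}: {rate:.0%}")
```

The step-to-step rates printed here are exactly the Impressions → ATC → Checkout → Purchase conversion numbers a dashboard would track.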

Operational Excellence

  • Identify operational bottlenecks impacting conversion and fulfillment
  • Analyze cancellation drivers & reduce COD RTO risk
  • Improve payment success rates and checkout efficiency
  • Work with logistics teams to optimize delivery timelines
  • Collaborate with catalog & brand teams to improve SKU performance
  • Monitor inventory health, sell-through rate, and stock rotation
  • Drive pricing and margin optimization initiatives

Experimentation & Performance Improvement

  • Run structured A/B tests to improve funnel performance
  • Define clear hypotheses, success metrics, and impact measurement
  • Analyze experiment results and recommend rollouts
  • Build scalable processes for experimentation cadence
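A structured A/B test of the kind listed above usually ends in a significance check. The sketch below uses a standard two-proportion z-test with invented experiment numbers; it is an illustration of the idea, not a prescribed methodology.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B conversion experiment.

    Returns the z statistic; |z| > 1.96 is significant at the 5% level
    (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented experiment: control converts 200/5000 (4.0%),
# variant converts 260/5000 (5.2%).
z = two_proportion_z(200, 5000, 260, 5000)
significant = abs(z) > 1.96
print(round(z, 2), significant)
```

Defining the hypothesis, sample size, and threshold before the test starts is what makes the rollout decision defensible.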

Cross-Functional Execution

  • Convert insights into PRDs and operational roadmaps
  • Partner with engineering for sprint-based delivery
  • Align marketing, catalog, and operations on metric ownership
  • Ensure every feature launch has measurable business KPIs

Must-Have Skills

  • Strong analytical mindset and comfort with large datasets
  • Advanced Excel / Google Sheets
  • Strong SQL proficiency (mandatory)
  • Experience with GA, Clevertap, Mixpanel or similar tools
  • Experience working on ecommerce funnels
  • Understanding of unit economics (GMV, margins, CAC, LTV)
  • Strong problem-solving and structured thinking
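The unit-economics vocabulary above (GMV, margins, CAC, LTV) reduces to simple arithmetic. Every figure below is invented for illustration, and a simple multiplicative LTV model is assumed.

```python
# Hypothetical unit-economics check; every figure below is invented and a
# simple multiplicative LTV model is assumed.
aov = 1200.0           # average order value (₹)
gross_margin = 0.35    # contribution margin after COGS and shipping
orders_per_year = 2.5  # average orders per customer per year
years_retained = 2.0   # average customer lifetime in years
cac = 900.0            # blended customer acquisition cost (₹)

# LTV here = margin contributed per order x orders over the lifetime.
ltv = aov * gross_margin * orders_per_year * years_retained
ltv_to_cac = ltv / cac
print(f"LTV ₹{ltv:.0f}, LTV:CAC {ltv_to_cac:.2f}")
# A common rule of thumb looks for an LTV:CAC ratio of roughly 3 or better.
```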

Bonus Skills

  • Experience in ecommerce marketplace or D2C
  • Experience working with logistics, payments, or inventory systems
  • Exposure to AI-led recommendation systems
  • Experience building business dashboards

What Success Looks Like (First 6 Months)

  • Clear dashboard visibility across all core business metrics
  • 10–15% improvement in funnel conversion
  • Reduction in cancellation & RTO rates
  • Improved operational turnaround time
  • Data-backed roadmap prioritization


Ekloud INC
ashwini rathod
Posted by ashwini rathod
INDIA
5 - 20 yrs
₹5L - ₹28L / yr
Automation
m365
power platform
SQL
SAP
+6 more

Job Title: Power Automate


Experience: 5 to 10 years

Location: Bangalore, Pune, Kochi (hybrid mode, 3 days WFO)

Experience: 9+ Years 

JD For Power Automate 

∙ We are seeking an Intelligent Automation Developer specialized in the M365 Power Platform and Power Automate to join our team of consultants focused on designing, building, and deploying digital workforces that support our clients' desire to transform their business and evolve their capabilities. As a member of our Intelligent Automation team you will:

∙ Work collaboratively with our team and customers to understand their vision and objectives, and identify and document requirements.

∙ Design, develop, implement, and support process automation and agentic AI solutions that conform to business requirements.

∙ For Power Automate-specialized positions: design, develop, optimize, and maintain Power Apps, Power Automate Flows, Power Pages, and Power BI solutions. Copilot and Copilot Studio experience is preferred, and the desire and ability to learn AI-led automation tools after joining is valued.

∙ Extend Power Automate solutions using Power Platform connectors and plugins.

∙ Integrate Power Automate solutions with existing systems and data sources such as ServiceNow, SQL, SharePoint, Snowflake, SAP, Dataverse, Azure environments, and other similar systems.

∙ Utilize HTML, JavaScript, and CSS to customize the user interface and functionality of Power Apps and Power Pages for UI and interactive-layer solutions in an end-to-end automation product.

∙ Ensure responsive design and Section 508 compliance for all deliverables.

∙ Participate in Agile / Agile Scrum development.

∙ Collaborate with cross-functional teams within Marlabs, the client, and potentially third-party partners and vendors.

∙ Communicate effectively with team members, business stakeholders, and end-users to gather requirements, create and run solution demonstrations, and provide project updates.

∙ Participate in configuration and code reviews.

∙ Create and maintain end-user how-to and help guides, checklists, and technical documentation.

∙ Integrate Power Platform with the ServiceNow platform. Using ServiceNow, support key ITIL processes and ServiceNow applications by developing and modifying ServiceNow functionality.

∙ Create user documents, training, and other relevant materials that support end-users.

∙ Function as first-line support for Power Automate and related M365 and Power Platform solution end-users and client administrators.

∙ Perform fixes and enhancements as needed.

∙ Other duties and responsibilities as assigned.

∙ Required Experience - Power Automate project focus:

∙ Power Platform Development

∙ Power Apps Development

∙ Power Automate Development

∙ Power Pages Development

Ekloud INC
ashwini rathod
Posted by ashwini rathod
india
1 - 15 yrs
₹3L - ₹24L / yr
salesforce
Salesforce development
Javascript
LWC
Salesforce Apex
+11 more

Salesforce Developer


Location: Onsite (Mumbai and Bangalore)


Resources should have banking domain experience.


1. Salesforce development Engineer (1 - 3 Years) 

2. Salesforce development Engineer (3 - 5 Years) 

3. Salesforce development Engineer (5 - 8 Years) 


Job description. 


----------------------------------------------------------------------------


Technical Skills:


Strong hands-on frontend development using JavaScript and LWC

Expertise in backend development using Apex, Flows, Async Apex

Understanding of Database concepts: SOQL, SOSL and SQL

Hands-on experience in API integration using SOAP, REST APIs, and GraphQL

Experience with ETL tools , Data migration, and Data governance

Experience with Apex Design Patterns, Integration Patterns and Apex testing framework

Follow an agile, iterative execution model using CI/CD tools like Azure DevOps, GitLab, and Bitbucket

Should have worked with at least one programming language (Java, Python, or C++) and have a good understanding of data structures

Preferred qualifications


Graduate degree in engineering

Experience developing with India stack

Experience in fintech or banking domain

----------------------------------------------------------------------------

 Skill details. 


1. Salesforce Fundamentals


Strong understanding of Salesforce core architecture

Objects (Standard vs Custom)

Fields, relationships (Lookup, Master-Detail)

Data model basics and record lifecycle

Awareness of declarative vs programmatic capabilities and when to use each

2. Salesforce Security Model

End-to-end understanding of Salesforce security layers, especially:

Record visibility when a record is created

Org-Wide Defaults (OWD) and their impact

Role Hierarchy and how it enables upward data access

Difference between Profiles, Permission Sets, and Sharing Rules

Ability to explain how Salesforce ensures that records are not visible to unauthorized users by default and how access is extended

3. Apex Triggers

Clear distinction between:

Before Triggers (before insert, before update)

Use cases such as validation and field updates

After Triggers (after insert, after update)

Use cases such as related record updates or integrations

Understanding of trigger context variables and best practices (bulkification, avoiding recursion)

4. Platform Events / Event-Driven Architecture

Knowledge of Platform Events and their use in decoupled, event-driven solutions

Understanding of real-time or near real-time notification use cases (e.g., UI alerts, pop-up style notifications)

Ability to position Platform Events versus alternatives (Streaming API, Change Data Capture)

5. Lightning Data Access (Wire Method)

Understanding of the @wire mechanism in Lightning Web Components (LWC)

Discussion point:

Whether records (e.g., AppX records) can be updated using the wire method

Awareness that @wire is primarily read/reactive and updates typically require imperative Apex calls

Clear articulation of reactive vs imperative data handling

6. Integrations Experience

Ability to articulate hands-on integration experience, including:

REST/SOAP API integrations

Inbound vs outbound integrations

Authentication mechanisms (OAuth, Named Credentials)

Use of Apex callouts, Platform Events, or middleware

Clarity on integration patterns and error handling approaches

Ekloud INC
ashwini rathod
Posted by ashwini rathod
INDIA
6 - 25 yrs
₹10L - ₹25L / yr
.NET
dot net
Fullstack Developer
ASP.NET
C#
+12 more

.Net Full Stack Developer


Experience: 6-8 years of experience with a bachelor's degree or equivalent

Overview:

Seeking an experienced Full Stack Developer with strong engineering practices, problem-solving

abilities, and excellent communication skills.

Required Skills:

● .NET Core, C#, SQL, Unit Testing, Design Patterns

● Message Queues (RabbitMQ or similar)

● Experience working with SQL Server

● Jenkins, Git, testing frameworks

● SCRUM/Agile methodologies

● Time management across multiple projects

Preferred Skills:

● AWS services (S3, API Gateway, SNS, SQS, RDS, CloudWatch)

● Docker containers, Kubernetes

● IT infrastructure understanding

● MongoDB/NoSQL databases

● Frontend frameworks (Stencil JS, Angular, React)

● Microservices architecture

Key Responsibilities:

● Deliver high-quality solutions

● Work independently and collaboratively

Chennai, Hyderabad
2 - 3 yrs
₹5L - ₹10L / yr
Java
Spring Boot
Angular (2+)
Production support
Troubleshooting
+5 more

Role Overview

We are seeking a technically strong Java Support Engineer who combines solid development knowledge with a passion for support and operational excellence. The ideal candidate should have hands-on experience in Java, Spring Boot, and Angular, along with a strong understanding of application engineering concepts, and must be comfortable working in a production support environment handling incidents, troubleshooting, monitoring, and system stability.


Key Responsibilities

  • Provide L2/L3 production support for enterprise applications.
  • Troubleshoot, debug, and resolve application issues within defined SLAs.
  • Analyze logs, identify root causes, and implement fixes or workarounds.
  • Collaborate with development teams for permanent issue resolution.
  • Monitor application health, performance, and availability.
  • Support deployments, releases, and environment validations.
  • Perform minor code fixes and enhancements when required.
  • Document issues, solutions, and support procedures.
  • Participate in on-call rotations and handle incident management.


Required Skills & Qualifications

  • Strong hands-on experience in Java and Spring Boot.
  • Working knowledge of Angular for frontend understanding.
  • Good understanding of application architecture, APIs, microservices, and debugging techniques.
  • Experience with log analysis tools, monitoring tools, and ticketing systems.
  • Knowledge of SQL databases and query troubleshooting.
  • Familiarity with Linux/Unix environments.
  • Understanding of CI/CD, release processes, and version control (Git).
  • Strong analytical, problem-solving, and communication skills.


Sceniuz

Sceniuz

Agency job
via AccioJob by lokit poddar
Mumbai
1 - 4 yrs
₹3L - ₹6L / yr
Python
SQL
Azure

AccioJob is conducting a Walk-In Hiring Drive with Sceniuz IT Pvt. Ltd. for the position of Data Engineer.


To apply, register and select your slot here: https://go.acciojob.com/kzxn79


Required Skills: Python, SQL, Azure


Eligibility:

Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc

Branch: All

Graduation Year: All


CTC: ₹3 LPA to ₹6 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Pune Centre

Further Rounds (for shortlisted candidates only):

Technical Interview 1, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/sUrMKd


👇 FAST SLOT BOOKING 👇

[ 📲 DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/NYDu6B

TalentXO
tabbasum shaikh
Posted by tabbasum shaikh
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Backend Development
Python
Java
SQL


Role & Responsibilities

As a Founding Engineer, you'll join the engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.

This role is ideal for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems require creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

Key Responsibilities

  • Build core platform features: Develop robust APIs, services, and integrations that power billing automation and revenue recognition capabilities.
  • Work across the full stack: Contribute to backend services and frontend interfaces to ensure seamless user experiences.
  • Implement critical integrations: Connect the platform with external systems including CRMs, data warehouses, ERPs, and payment processors.
  • Optimize for scale: Design systems that handle complex pricing models, high-volume usage data, and real-time financial calculations.
  • Drive quality and best practices: Write clean, maintainable code and participate in code reviews and architectural discussions.
  • Solve complex problems: Debug issues across the stack and collaborate with cross-functional teams to address evolving client needs.

The Impact You'll Make

  • Power business growth: Enable fast-growing B2B companies to scale billing and revenue operations efficiently.
  • Build critical financial infrastructure: Contribute to systems handling high-value transactions with accuracy and compliance.
  • Shape product direction: Join during a scaling phase where your contributions directly impact product evolution and customer success.
  • Accelerate your expertise: Gain deep exposure to financial systems, B2B SaaS operations, and enterprise-grade software development.
  • Drive the future of B2B commerce: Help build infrastructure supporting next-generation pricing models, from usage-based to value-based billing.

Ideal Candidate Profile

Experience

  • 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems.
  • Strong backend development experience using one or more frameworks: FastAPI / Django (Python), Spring (Java), or Express (Node.js).
  • Deep understanding of relevant libraries, tools, and best practices within the chosen backend framework.
  • Strong experience with databases (SQL & NoSQL), including efficient data modeling and performance optimization.
  • Proven experience designing, building, and maintaining APIs, services, and backend systems with solid system design and clean code practices.

Domain

  • Experience with financial systems, billing platforms, or fintech applications is highly preferred.

Company Background

  • Experience working in product companies or startups (preferably Series A to Series D).

Education

  • Candidates from Tier 1 engineering institutes (IITs, BITS, etc.) are highly preferred.



Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Thiruvananthapuram
9 - 12 yrs
₹21L - ₹27L / yr
Java
Spring
Apache Kafka
SQL
PostgreSQL
+16 more

JOB DETAILS:

Job Title: Java Lead-Java, MS, Kafka-TVM - Java (Core & Enterprise), Spring/Micronaut, Kafka

Industry: Global Digital Transformation Solutions Provider

Salary: Best in Industry

Experience: 9 to 12 years

Location: Trivandrum, Thiruvananthapuram

 

Job Description

Experience

  • 9+ years of experience in Java-based backend application development
  • Proven experience building and maintaining enterprise-grade, scalable applications
  • Hands-on experience working with microservices and event-driven architectures
  • Experience working in Agile and DevOps-driven development environments

 

Mandatory Skills

  • Advanced proficiency in core Java and enterprise Java concepts
  • Strong hands-on experience with Spring Framework and/or Micronaut for building scalable backend applications
  • Strong expertise in SQL, including database design, query optimization, and performance tuning
  • Hands-on experience with PostgreSQL or other relational database management systems
  • Strong experience with Kafka or similar event-driven messaging and streaming platforms
  • Practical knowledge of CI/CD pipelines using GitLab
  • Experience with Jenkins for build automation and deployment processes
  • Strong understanding of GitLab for source code management and DevOps workflows

 

Responsibilities

  • Design, develop, and maintain robust, scalable, and high-performance backend solutions
  • Develop and deploy microservices using Spring or Micronaut frameworks
  • Implement and integrate event-driven systems using Kafka
  • Optimize SQL queries and manage PostgreSQL databases for performance and reliability
  • Build, implement, and maintain CI/CD pipelines using GitLab and Jenkins
  • Collaborate with cross-functional teams including product, QA, and DevOps to deliver high-quality software solutions
  • Ensure code quality through best practices, reviews, and automated testing

 

Good-to-Have Skills

  • Strong problem-solving and analytical abilities
  • Experience working with Agile development methodologies such as Scrum or Kanban
  • Exposure to cloud platforms such as AWS, Azure, or GCP
  • Familiarity with containerization and orchestration tools such as Docker or Kubernetes

 

Skills: Java, Spring Boot, Kafka development, CI/CD, PostgreSQL, GitLab

 

Must-Haves

Java Backend (9+ years), Spring Framework/Micronaut, SQL/PostgreSQL, Kafka, CI/CD (GitLab/Jenkins)

Advanced proficiency in core Java and enterprise Java concepts

Strong hands-on experience with Spring Framework and/or Micronaut for building scalable backend applications

Strong expertise in SQL, including database design, query optimization, and performance tuning

Hands-on experience with PostgreSQL or other relational database management systems

Strong experience with Kafka or similar event-driven messaging and streaming platforms

Practical knowledge of CI/CD pipelines using GitLab

Experience with Jenkins for build automation and deployment processes

Strong understanding of GitLab for source code management and DevOps workflows

 

 

*******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: only Trivandrum

F2F Interview on 21st Feb 2026

 

Remote only
3 - 6 yrs
₹4L - ₹7L / yr
NodeJS (Node.js)
PHP
React Native
SQL
Javascript
+6 more

Software Developer (Node.js / PHP / React Native)

Experience: 3+ Years

Employment Type: Full-Time


Role Summary


We are looking for a skilled software developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and online examination systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or education/examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent


CNV Labs India Pvt Ltd iCloudEMS
Shital ICloudEMS
Posted by Shital ICloudEMS
Remote only
3 - 5 yrs
₹3L - ₹5L / yr
PHP
SQL
NodeJS (Node.js)
React Native
edtech

Role Summary


We are looking for a skilled Software Developer with 3+ years of experience to work on enterprise platforms in EdTech, HRMS, CRM, and Online Examination Systems. The role involves developing scalable web and mobile applications used by institutions and organizations.


Key Responsibilities

• Develop and maintain backend services using Node.js and PHP.

• Build and enhance mobile applications using React Native.

• Design and integrate REST APIs and third-party services.

• Work with databases (MySQL/PostgreSQL) for performance-driven applications.

• Collaborate with product, QA, and implementation teams for feature delivery.

• Troubleshoot, optimize, and ensure secure, high-performance systems.


Required Skills

• Strong experience in Node.js, PHP, and React Native.

• Good knowledge of JavaScript, API development, and database design.

• Experience with Git, version control, and deployment processes.

• Understanding of SaaS-based applications and modular architecture.


Preferred

• Experience in ERP, HRMS, CRM, or Education/Examination platforms.

• Familiarity with cloud environments and scalable deployments.


Qualification: B.Tech / MCA / BCA / Equivalent

Apply: Share your resume with project details and current CTC.

CNV Labs India Pvt Ltd iCloudEMS
Shital ICloudEMS
Posted by Shital ICloudEMS
Remote only
4 - 8 yrs
₹4L - ₹8L / yr
PHP
NodeJS (Node.js)
React Native
SQL

We are looking for a skilled Node.js Developer with React Native experience to build, enhance, and maintain ERP and EdTech platforms. The role involves developing scalable backend services, integrating ERP modules, and supporting education-focused systems such as LMS, student management, exams, and fee management.


Key Responsibilities


Develop and maintain backend services using Node.js, React Native, and PHP.


Build and integrate ERP modules for EdTech platforms (Admissions, Students, Exams, Attendance, Fees, Reports).


Design and consume RESTful APIs and third-party integrations (payment gateway, SMS, email).


Work with databases (MySQL / MongoDB / PostgreSQL) for high-volume education data.


Optimize application performance, scalability, and security.


Collaborate with frontend, QA, and product teams.


Debug, troubleshoot, and provide production support.


Required Skills


Strong experience in Node.js (Express.js / NestJS).


Working experience in PHP (Core PHP / Laravel / CodeIgniter).


Hands-on experience with ERP systems.


Domain experience in EdTech / Education ERP / LMS.


Strong knowledge of MySQL and database design.


Experience with authentication, role-based access, and reporting.


Familiarity with Git, APIs, and server environments.


Preferred Skills


Experience with online examination systems.


Knowledge of cloud platforms (AWS / Azure).


Understanding of security best practices (CSRF, XSS, SQL Injection).


Exposure to microservices or modular architecture.


Qualification


Bachelor’s degree in Computer Science or equivalent experience.


3–6 years of relevant experience in Node.js & PHP development


Skills: NodeJS (Node.js), PHP, ERP management, EdTech, MySQL, API, and Amazon Web Services (AWS)



CloudThat

at CloudThat

shubhangi shrivastava
Posted by shubhangi shrivastava
Bengaluru (Bangalore)
3 - 6 yrs
₹7L - ₹10L / yr
HTML/CSS
Python
Java
SQL
C++
+2 more

About CloudThat:-

At CloudThat, we are driven by our mission to empower professionals and businesses to harness the full potential of cloud technologies. As a leader in cloud training and consulting services in India, our core values guide every decision we make and every customer interaction we have.


Role Overview:-

We are looking for a passionate and experienced Technical Trainer to join our expert team and help drive knowledge adoption across our customers, partners, and internal teams.


Key Responsibilities:

• Deliver high-quality, engaging technical training sessions both in-person and virtually to customers, partners, and internal teams.

• Design and develop training content, labs, and assessments based on business and technology requirements.

• Collaborate with internal and external SMEs to draft course proposals aligned with customer needs and current market trends.

• Assist in training and onboarding of other trainers and subject matter experts to ensure quality delivery of training programs.

• Create immersive lab-based sessions using diagrams, real-world scenarios, videos, and interactive exercises.

• Develop instructor guides, certification frameworks, learner assessments, and delivery aids to support end-to-end training delivery.

• Integrate hands-on project-based learning into courses to simulate practical environments and deepen understanding.

• Support the interpersonal and facilitation aspects of training, fostering an inclusive, engaging, and productive learning environment.


Skills & Qualifications:

• Experience developing content for professional certifications or enterprise skilling programs.

• Familiarity with emerging technology areas such as cloud computing, AI/ML, DevOps, or data engineering.


Technical Competencies:

  • Expertise in languages like C, C++, Python, Java
  • Understanding of algorithms and data structures 
  • Expertise on SQL 

Or apply directly: https://cloudthat.keka.com/careers/jobdetails/95441


Delhi, Gurugram, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹4L - ₹14L / yr
React.js
NodeJS (Node.js)
TypeScript
Javascript
MySQL
+3 more

Job Overview:

We are looking for a skilled Full Stack Developer with strong experience in Next.js, Node.js, and React.js. The ideal candidate should be capable of building scalable web applications, leading modules, and contributing to both frontend and backend development.

Key Responsibilities:

  • Design, develop, and maintain full-stack applications using Next.js, Node.js and React.js
  • Write clean, maintainable, and scalable code
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Optimize applications for performance, scalability, and security
  • Mentor junior developers and conduct code reviews

Required Skills:

  • 4+ years of experience with Next.js, React.js, and Node.js
  • Strong knowledge of JavaScript, HTML, CSS
  • Experience with REST APIs, MongoDB, or SQL
  • Familiarity with version control (Git) and CI/CD tools


Why Join Us?

  • Career Advancement Opportunities and professional growth.
  • Supportive work environment with learning opportunities


Delhi, Gurugram, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹13L / yr
React.js
NodeJS (Node.js)
NextJs (Next.js)
TypeScript
RESTful APIs
+3 more

Job Overview:

We are looking for a skilled Full Stack Developer with strong experience in Next.js, Node.js, and React.js. The ideal candidate should be capable of building scalable web applications, leading modules, and contributing to both frontend and backend development.

Key Responsibilities:

  • Design, develop, and maintain full-stack applications using Next.js, Node.js and React.js
  • Write clean, maintainable, and scalable code
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Optimize applications for performance, scalability, and security
  • Mentor junior developers and conduct code reviews

Required Skills:

  • 4+ years of experience with Next.js, React.js, and Node.js
  • Strong knowledge of JavaScript, HTML, CSS
  • Experience with REST APIs, MongoDB, or SQL
  • Familiarity with version control (Git) and CI/CD tools


Why Join Us?

  • Career Advancement Opportunities and professional growth.
  • Supportive work environment with learning opportunities
AI Industry

AI Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio
Data engineering
Business Intelligence (BI)
Tableau
PowerBI
+51 more

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
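Reflections, mentioned in the responsibilities above, are Dremio's materialized aggregates that the query planner can transparently substitute for raw scans. As a rough illustration of the underlying idea only (not Dremio's actual API), here is the equivalent precomputation sketched with Python's built-in SQLite; the `sales` table and its columns are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical raw fact table, standing in for a Parquet dataset on S3/ADLS.
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 100.0), ("east", 50.0), ("west", 75.0)])

# A "reflection" in spirit: a precomputed aggregate the engine can answer
# dashboard queries from, instead of rescanning the raw data every time.
cur.execute("""
    CREATE TABLE sales_by_region AS
    SELECT region, SUM(amount) AS total, COUNT(*) AS n
    FROM sales GROUP BY region
""")

totals = dict(cur.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"))
print(totals)  # {'east': 150.0, 'west': 75.0}
```

In Dremio itself, the planner performs this substitution automatically and keeps the aggregate refreshed; the design work in this role is deciding which reflections are worth their maintenance cost.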


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Read more
Mango Sciences
Remote only
5 - 7 yrs
₹10L - ₹15L / yr
skill iconPython
SQL
SQL queries

Database Programmer / Developer (SQL, Python, Healthcare)

Job Summary

We are seeking a skilled and experienced Database Programmer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our database systems, with a strong focus on data integrity, performance, and security. The role requires expertise in SQL, strong programming skills in Python, and prior experience working within the healthcare domain to handle sensitive data and complex regulatory requirements.

Key Responsibilities

  • Design, implement, and maintain scalable and efficient database schemas and systems.
  • Develop and optimize complex SQL queries, stored procedures, and triggers for data manipulation and reporting.
  • Write and maintain Python scripts to automate data pipelines, ETL processes, and database tasks.
  • Collaborate with data analysts, software developers, and other stakeholders to understand data requirements and deliver robust solutions.
  • Ensure data quality, integrity, and security, adhering to industry standards and regulations such as HIPAA.
  • Troubleshoot and resolve database performance issues, including query tuning and indexing.
  • Create and maintain technical documentation for database architecture, processes, and applications.

Required Qualifications

  • Experience:
  • Proven experience as a Database Programmer, SQL Developer, or a similar role.
  • Demonstrable experience working with database systems, including data modeling and design.
  • Strong background in developing and maintaining applications and scripts using Python.
  • Direct experience within the healthcare domain is mandatory, including familiarity with medical data (e.g., patient records, claims data) and related regulatory compliance (e.g., HIPAA).
  • Technical Skills:
  • Expert-level proficiency in Structured Query Language (SQL) and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
  • Solid programming skills in Python, including experience with relevant libraries for data handling (e.g., Pandas, SQLAlchemy).
  • Experience with data warehousing concepts and ETL (Extract, Transform, Load) processes.
  • Familiarity with version control systems, such as Git.
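The query tuning and indexing work described above can be sketched minimally with Python's built-in sqlite3 module; the `visits` table, its columns, and the index name are all hypothetical, and a production system would target SQL Server/PostgreSQL/MySQL instead:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical patient-visit table; names are illustrative only.
cur.execute("CREATE TABLE visits (mrn TEXT, visit_date TEXT, charge REAL)")
cur.executemany("INSERT INTO visits VALUES (?, ?, ?)",
                [("A1", "2024-01-05", 120.0),
                 ("A1", "2024-02-10", 80.0),
                 ("B2", "2024-01-20", 200.0)])

# Index the lookup column, then confirm the planner actually uses it --
# the essence of "query tuning and indexing" called out above.
cur.execute("CREATE INDEX idx_visits_mrn ON visits (mrn)")
plan = " ".join(row[-1] for row in cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(charge) FROM visits WHERE mrn = 'A1'"))
total = cur.execute(
    "SELECT SUM(charge) FROM visits WHERE mrn = 'A1'").fetchone()[0]
print(plan)   # plan text mentions idx_visits_mrn
print(total)  # 200.0
```

The same check-the-plan habit carries over to the big engines (`EXPLAIN` in PostgreSQL/MySQL, execution plans in SQL Server), where it matters far more at healthcare-scale row counts.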

Preferred Qualifications

  • Experience with NoSQL databases (e.g., MongoDB, Cassandra).
  • Knowledge of cloud-based data platforms (e.g., AWS, GCP, Azure).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with other programming languages relevant to data science or application development.

Education

  • Bachelor’s degree in computer science, Information Technology, or a related field.

 

To process your resume for the next process, please fill out the Google form with your updated resume.


https://forms.gle/f7zgYAa632ww5Teb6

Read more
Remote only
2 - 4 yrs
₹3L - ₹4L / yr
skill icon.NET
SQL
skill iconPostgreSQL
RESTful APIs
skill iconGit
+4 more

We are looking for a highly skilled Full Stack Developer to design and scale our real-time vehicle tracking platform. You will be responsible for building high-performance web applications that process live GPS data and visualize it through interactive map interfaces.

Key Responsibilities

Real-Time Data Processing: Develop robust back-end services to ingest and process high-frequency GPS data from IoT devices.

Map Integration: Design and implement interactive map interfaces using tools like Google Maps API or Mapbox for real-time asset visualization.

Geofencing & Alerts: Build server-side logic for complex geospatial features, including geofencing, route optimization, and automated speed/entry alerts.

API Development: Create and maintain scalable RESTful or GraphQL APIs to bridge communication between vehicle hardware, the database, and the user dashboard.

Database Management: Architect and optimize databases (e.g., PostgreSQL with PostGIS) for efficient storage and querying of spatial-temporal data.

Performance Optimization: Ensure high availability and low-latency response times for tracking thousands of simultaneous vehicle connections.
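As a rough sketch of the geofencing logic mentioned above, here is a point-in-polygon check via ray casting in pure Python; production code would lean on PostGIS functions rather than hand-rolled geometry, and the fence coordinates below are invented:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: does (lon, lat) fall inside the polygon?
    polygon is a list of (lon, lat) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightwards from the point;
        # an odd number of crossings means the point is inside.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular geofence around a depot.
fence = [(77.55, 12.90), (77.65, 12.90), (77.65, 12.98), (77.55, 12.98)]
print(point_in_polygon(77.60, 12.95, fence))  # True  -> vehicle inside fence
print(point_in_polygon(77.70, 12.95, fence))  # False -> raise an exit alert
```

Running this per GPS ping is how entry/exit alerts are typically derived; at scale the same predicate is pushed into the database as a spatial-index-backed query.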

Required Technical Skills

Front-End: Proficiency in React.js, Angular, or Vue.js, with experience in state management (Redux/MobX).

Back-End: Strong experience in Node.js (Express/NestJS), Python (Django/Flask), or Java (Spring Boot).

Mapping: Hands-on experience with Google Maps SDK, Leaflet, or OpenLayers.

Real-time Communication: Expertise in WebSockets or Socket.IO for live data streaming.

Databases: Proficiency in SQL (PostgreSQL/MySQL) and NoSQL (MongoDB/Redis) for caching.

Cloud & DevOps: Familiarity with AWS (EC2, Lambda), Docker, and Kubernetes for scalable deployment.

Qualifications

Education: Bachelor’s or Master’s degree in Computer Science or a related field.

Experience: 3–6+ years of professional full-stack development experience.

Niche Knowledge: Prior experience with telematics, IoT protocols (MQTT, HTTP), or GPS-based applications is highly preferred.

Read more
Performio


Agency job
via maple green services by Elvin Johnson
Remote only
4 - 6 yrs
₹15L - ₹20L / yr
ETL
SQL

The Opportunity:


As a Technical Support Consultant, you will play a significant role in Performio providing world-class support to our customers. With our tried and tested onboarding process, you will soon become familiar with the Performio product and company.

You will draw on previous support experience to monitor for new support requests in Zendesk, provide initial triage with 1st and 2nd level support, ensuring the customer is kept up to date and the request is completed within a timely manner.

You will collaborate with other teams to ensure more complex requests are managed efficiently and will provide feedback to help improve product and solution knowledge as well as processes.

Answers to questions asked by customers that are not in the knowledge base will be reviewed and added to the knowledge base if appropriate. We’re looking for someone who thinks ahead, recognising opportunities to help customers help themselves.

You will help out with configuration changes and testing, furthering your knowledge and experience of Performio. You may also be expected to help out with Managed Service, Implementation and Work Order related tasks from time to time.


About Performio:


Performio is the last ICM software you’ll ever need. It allows you to manage incentive compensation complexity and change over the long run by combining a structured plan builder and flexible data management, with a partner who will make you a customer for life.

Our people are highly motivated and engaged professionals with a clear set of values and behaviors. We prove these values matter to us by living them each day. This makes Performio both a great place to work and a great company to do business with.

But a great team alone is not sufficient to win. We have solved the fundamental issue widespread in our industry: overly rigid applications that cannot adapt to your needs, or overly flexible ones that become impossible to maintain over time. Only Performio allows you to manage incentive compensation complexity and change over the long run by combining a structured plan builder and flexible data management. The component-based plan builder makes it easier to understand, change, and self-manage than traditional formula- or rules-based solutions. Our ability to import data from any source, in any format, and perform in-app data transformations eliminates the pain of external processing and provides end-to-end data visibility. The combination of these two functions allows us to deliver more powerful reporting and insights. And while every vendor says they are a partner, we truly are one. We not only get your implementation right the first time, we enable you and give you the autonomy and control to make changes year after year. And unlike most, we support every part of your unique configuration. Performio is a partner that will make you a customer for life.

We have a global customer base across Australia, Asia, Europe, and the US in 25+ industries that includes many well-known companies like Toll Brothers, Abbott Labs, News Corp, Johnson & Johnson, Nikon, and Uber Freight.


What will you be doing:


● Monitoring and triaging new Support requests submitted by customers using our Zendesk Support Portal
● Providing 1st and 2nd line support for Support requests
● Investigating, reproducing, and resolving customer issues within the required Service Level Agreements
● Maintaining our evolving knowledge base
● Clear and concise documentation of root causes and resolutions
● Assisting with the implementation and testing of Change Requests and implementation projects
● As your knowledge of the product grows, making recommendations for solutions based on clients’ requests
● Assisting in educating our clients’ compensation administrators in applying best practices


What we’re looking for:


● Passion for customer service with a communication style that can be adapted to suit the audience
● A problem solver with a range of troubleshooting methodologies
● Experience in the Sales Compensation industry
● Familiarity with basic database concepts and spreadsheets, and experience working with large datasets (Excel, relational database tables, SQL, ETL, or other tools/languages)
● 4+ years of experience in a similar role (experience with ICM software preferred)
● Experience with implementation & support of ICM solutions like SAP Commissions, Varicent, or Xactly will be a big plus
● Positive attitude - optimistic, cares deeply about company and customers
● High emotional IQ - shows empathy, listens when appropriate, creates healthy conversation dynamics
● Resourceful - has an "I'll figure it out" attitude if something they need doesn't exist

Read more
Global digital transformation solutions provider.


Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
5 - 8 yrs
₹11L - ₹20L / yr
PySpark
Apache Kafka
Data architecture
skill iconAmazon Web Services (AWS)
EMR
+32 more

JOB DETAILS:

* Job Title: Lead II - Software Engineering - AWS, Apache Spark (PySpark/Scala), Apache Kafka

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5-8 years

* Location: Hyderabad

 

Job Summary

We are seeking a skilled Data Engineer to design, build, and optimize scalable data pipelines and cloud-based data platforms. The role involves working with large-scale batch and real-time data processing systems, collaborating with cross-functional teams, and ensuring data reliability, security, and performance across the data lifecycle.


Key Responsibilities

ETL Pipeline Development & Optimization

  • Design, develop, and maintain complex end-to-end ETL pipelines for large-scale data ingestion and processing.
  • Optimize data pipelines for performance, scalability, fault tolerance, and reliability.
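The extract-transform-load shape of such a pipeline can be sketched in a few lines of stdlib Python; the payment feed, field names, and the SQLite "warehouse" are all stand-ins for illustration (the real stack here is Spark on AWS):

```python
import csv
import io
import sqlite3

# Hypothetical raw feed, standing in for files landing in S3.
raw = io.StringIO("id,amount,currency\n1,100,USD\n2,,USD\n3,250,usd\n")

# Extract: parse the raw records.
rows = list(csv.DictReader(raw))

# Transform: drop incomplete records, normalize types and casing.
clean = [
    {"id": int(r["id"]), "amount": float(r["amount"]),
     "currency": r["currency"].upper()}
    for r in rows if r["amount"]
]

# Load into the warehouse (SQLite here, purely for the sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO payments VALUES (:id, :amount, :currency)", clean)
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(loaded)  # (2, 350.0)
```

The engineering in this role is the same three steps hardened for scale: distributed transforms in Spark, idempotent loads, and retries/monitoring around each stage.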

Big Data Processing

  • Develop and optimize batch and real-time data processing solutions using Apache Spark (PySpark/Scala) and Apache Kafka.
  • Ensure fault-tolerant, scalable, and high-performance data processing systems.

Cloud Infrastructure Development

  • Build and manage scalable, cloud-native data infrastructure on AWS.
  • Design resilient and cost-efficient data pipelines adaptable to varying data volume and formats.

Real-Time & Batch Data Integration

  • Enable seamless ingestion and processing of real-time streaming and batch data sources (e.g., AWS MSK).
  • Ensure consistency, data quality, and a unified view across multiple data sources and formats.

Data Analysis & Insights

  • Partner with business teams and data scientists to understand data requirements.
  • Perform in-depth data analysis to identify trends, patterns, and anomalies.
  • Deliver high-quality datasets and present actionable insights to stakeholders.

CI/CD & Automation

  • Implement and maintain CI/CD pipelines using Jenkins or similar tools.
  • Automate testing, deployment, and monitoring to ensure smooth production releases.

Data Security & Compliance

  • Collaborate with security teams to ensure compliance with organizational and regulatory standards (e.g., GDPR, HIPAA).
  • Implement data governance practices ensuring data integrity, security, and traceability.

Troubleshooting & Performance Tuning

  • Identify and resolve performance bottlenecks in data pipelines.
  • Apply best practices for monitoring, tuning, and optimizing data ingestion and storage.

Collaboration & Cross-Functional Work

  • Work closely with engineers, data scientists, product managers, and business stakeholders.
  • Participate in agile ceremonies, sprint planning, and architectural discussions.


Skills & Qualifications

Mandatory (Must-Have) Skills

  1. AWS Expertise
  • Hands-on experience with AWS Big Data services such as EMR, Managed Apache Airflow, Glue, S3, DMS, MSK, and EC2.
  • Strong understanding of cloud-native data architectures.
  2. Big Data Technologies
  • Proficiency in PySpark or Scala Spark and SQL for large-scale data transformation and analysis.
  • Experience with Apache Spark and Apache Kafka in production environments.
  3. Data Frameworks
  • Strong knowledge of Spark DataFrames and Datasets.
  4. ETL Pipeline Development
  • Proven experience in building scalable and reliable ETL pipelines for both batch and real-time data processing.
  5. Database Modeling & Data Warehousing
  • Expertise in designing scalable data models for OLAP and OLTP systems.
  6. Data Analysis & Insights
  • Ability to perform complex data analysis and extract actionable business insights.
  • Strong analytical and problem-solving skills with a data-driven mindset.
  7. CI/CD & Automation
  • Basic to intermediate experience with CI/CD pipelines using Jenkins or similar tools.
  • Familiarity with automated testing and deployment workflows.

 

Good-to-Have (Preferred) Skills

  • Knowledge of Java for data processing applications.
  • Experience with NoSQL databases (e.g., DynamoDB, Cassandra, MongoDB).
  • Familiarity with data governance frameworks and compliance tooling.
  • Experience with monitoring and observability tools such as AWS CloudWatch, Splunk, or Dynatrace.
  • Exposure to cost optimization strategies for large-scale cloud data platforms.

 

Skills: big data, scala spark, apache spark, ETL pipeline development

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Hyderabad

Note: If a candidate is a short joiner, based in Hyderabad, and fits within the approved budget, we will proceed with an offer

F2F Interview: 14th Feb 2026

3 days in office, Hybrid model.

 


Read more
Bengaluru (Bangalore)
4 - 6 yrs
₹8L - ₹14L / yr
skill iconReact.js
skill iconNodeJS (Node.js)
skill iconHTML/CSS
skill iconJavascript
SQL
+2 more


Key Responsibilities :


- Develop backend services using Node.js, including API orchestration and integration with AI/ML services.


- Implement frontend redaction features using Redact.js, integrated into React.js dashboards.


- Collaborate with AI/ML engineers to embed intelligent feedback and behavioral analysis.


- Build secure, multi-tenant systems with role-based access control (RBAC).


- Optimize performance for real-time audio analysis and transcript synchronization.


- Participate in agile grooming sessions and contribute to architectural decisions.


Required Skills :


- Experience with Redact.js or similar annotation/redaction libraries.


- Strong understanding of RESTful APIs, React.js, and Material-UI.


- Familiarity with Azure services, SQL, and authentication protocols (SSO, JWT).


- Experience with secure session management and data protection standards.


Preferred Qualifications :


- Exposure to AI/ML workflows and Python-based services.


- Experience with Livekit or similar real-time communication platforms.


- Familiarity with Power BI and accessibility standards (WCAG).


Soft Skills :


- Problem-solving mindset and adaptability.


- Ability to work independently and meet tight deadlines.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Mumbai, Pune
3 - 6 yrs
Best in industry
skill iconPython
PySpark
pandas
SQL
ADF
+2 more

* Python (3 to 6 years): Strong expertise in data workflows and automation

* Spark (PySpark): Hands-on experience with large-scale data processing

* Pandas: For detailed data analysis and validation

* Delta Lake: Managing structured and semi-structured datasets at scale

* SQL: Querying and performing operations on Delta tables

* Azure Cloud: Compute and storage services

* Orchestrator: Good experience with either ADF or Airflow

Read more
Technology Industry


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹53L - ₹70L / yr
skill iconJava
Microservices
CI/CD
MySQL
Scripting
+5 more

JOB DETAILS:

* Job Title: Engineering Manager

* Industry: Technology

* Salary: Best in Industry

* Experience: 9-12 years

* Location: Bengaluru

* Education: B.Tech in computer science or related field from Tier 1, Tier 2 colleges


Role & Responsibilities

We are seeking a visionary and decisive Engineering Manager to join our dynamic team. In this role, you will lead and inspire a talented team of software engineers, driving innovation and excellence in product development efforts. This is an exciting opportunity to influence and shape the future of our engineering organization.

 

Key Responsibilities-

As an Engineering Manager, you will be responsible for managing the overall software development life cycle of one product. You will work and manage a cross-functional team consisting of Backend Engineers, Frontend Engineers, QA, SDET, Product Managers, Product Designers, Technical Project Managers, Data Scientists, etc.

  • Responsible for mapping business objectives to an optimum engineering structure, including correct estimation of resource allocation.
  • Responsible for key technical and product decisions. Provide direction and mentorship to the team. Set up best practices for engineering.
  • Work closely with the Product Manager and help them in getting relevant inputs from the engineering team.
  • Plan and track the development and release schedules, proactively assess and mitigate risks. Prepare for contingencies and provide visible leadership in crisis.
  • Conduct regular 1:1s for performance feedback and lead their appraisals.
  • Responsible for driving good coding practices in the team like good quality code, documentation, timely bug fixing, etc.
  • Report on the status of development, quality, operations, and system performance to management.
  • Create and maintain an open and transparent environment that values speed and innovation and motivates engineers to build innovative and effective systems rapidly.


Ideal Candidate

  • Strong Engineering Manager / Technical Leadership Profile
  • Must have 9+ years of experience in software engineering with experience building complex, large-scale products
  • Must have 2+ years of experience as an Engineering Manager / Tech Lead with people management responsibilities
  • Strong technical foundation with hands-on experience in Java (or equivalent compiled language), scripting languages, web technologies, and databases (SQL/NoSQL)
  • Proven ability to solve large-scale technical problems and guide teams on architecture, design, quality, and best practices
  • Experience in leading cross-functional teams, planning and tracking delivery, mentoring engineers, conducting performance reviews, and driving engineering excellence
  • Must have strong experience working with Product Managers, UX designers, QA, and other cross-functional partners
  • Excellent communication and interpersonal skills to influence technical direction and stakeholder decisions
  • (Company): Product companies
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in computer science or related field from Tier 1, Tier 2 colleges
Read more
Global Digital Transformation Solutions Provider


Agency job
via Peak Hire Solutions by Dhara Thakkar
Thiruvananthapuram, Trivandrum
5 - 9 yrs
₹13L - ₹25L / yr
skill icon.NET
skill iconJavascript
skill iconAngular (2+)
Windows Azure
SQL Azure
+13 more

Job Details

- Job Title: Specialist I - Software Engineering-.Net Fullstack Lead-TVM

Industry: Global digital transformation solutions provider

Domain - Information technology (IT)

Experience Required: 5-9 years

Employment Type: Full Time

Job Location: Trivandrum, Thiruvananthapuram

CTC Range: Best in Industry

 

Job Description

· Minimum 5+ years of experience as a senior/lead .NET developer, including the full development lifecycle and post-live support.

· Significant experience delivering software using Agile iterative delivery methodologies.

· JIRA knowledge preferred.

· Excellent ability to understand requirement/story scope and visualise technical elements required for application solutions.

· Ability to clearly articulate complex problems and solutions in terms that others can understand.

· Lots of experience working with .Net backend API development.

· Significant experience of pipeline design, build and enhancement to support release cadence targets, including Infrastructure as Code (preferably Terraform).

· Strong understanding of HTML and CSS including cross-browser, compatibility, and performance.

· Excellent knowledge of unit and integration testing techniques.

· Azure knowledge (Web/Container Apps, Azure Functions, SQL Server).

· Kubernetes / Docker knowledge. Knowledge of JavaScript UI frameworks, ideally Vue. Extensive experience with source control (preferably Git).

· Strong understanding of RESTful services (JSON) and API Design.

· Broad knowledge of Cloud infrastructure (PaaS, DBaaS).

· Experience of mentoring and coaching engineers operating within a co-located environment. 

 

Skills: .Net Fullstack, Azure Cloudformation, Javascript, Angular

 

Must-Haves:

.Net (5+ years), Agile methodologies, RESTful API design, Azure (Web/Container Apps, Functions, SQL Server), Git source control

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Trivandrum

F2F Weekend Interview on 14th Feb 2026

Read more
NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
6 - 10 yrs
Best in industry
skill iconJava
skill iconJavascript
skill iconSpring Boot
Microservices
Hibernate (Java)
+6 more

Company Description

NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.


Role Description

This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.


Responsibilities:

● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications 

● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions 

● Code Reviews: Participate in code reviews to maintain high-quality standards 

● Troubleshooting: Debug and resolve application issues in a timely manner 

● Testing: Develop and execute unit and integration tests to ensure software reliability

● Optimize: Identify and address performance bottlenecks to enhance application performance 


Qualifications & Skills:

● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA 

● Familiarity with RESTful APIs and web services 

● Proficiency in working with relational databases like MySQL or PostgreSQL 

● Practical experience with AWS cloud services and building scalable, microservices-based architectures

● Experience with build tools like Maven or Gradle 

● Understanding of version control systems, especially Git 

● Strong understanding of object-oriented programming principles and design patterns 

● Familiarity with automated testing frameworks and methodologies 

● Excellent problem-solving skills and attention to detail 

● Strong communication skills and ability to work effectively in a collaborative team environment 


Why Join Us? 

● Opportunity to work on cutting-edge technology products 

● A collaborative and learning-driven environment 

● Exposure to AI and software engineering innovations 

● Excellent work ethic and culture 


If you're passionate about technology and want to work on impactful projects, we'd love to hear from you

Read more
CAW.Tech

at CAW.Tech

5 recruiters
Ranjana Singh
Posted by Ranjana Singh
Bengaluru (Bangalore)
2 - 3 yrs
Best in industry
Apache Airflow
azkaban
skill iconAmazon Web Services (AWS)
skill iconPython
Pipeline management
+7 more

Responsibilities:

  • Design, develop, and maintain efficient and reliable data pipelines.
  • Identify and implement process improvements, automating manual tasks and optimizing data delivery.
  • Build and maintain the infrastructure for data extraction, transformation, and loading (ETL) from diverse sources using SQL and AWS cloud technologies.
  • Develop data tools and solutions to empower our analytics and data science teams, contributing to product innovation.


Qualifications:

Must Have:

  • 2-3 years of experience in a Data Engineering role.
  • Familiarity with data pipeline and workflow management tools (e.g., Airflow, Luigi, Azkaban).
  • Experience with AWS cloud services.
  • Working knowledge of object-oriented/functional scripting in Python
  • Experience building and optimizing data pipelines and datasets.
  • Strong analytical skills and experience working with structured and unstructured data.
  • Understanding of data transformation, data structures, dimensional modeling, metadata management, schema evolution, and workload management.
  • A passion for building high-quality, scalable data solutions.
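The workflow tools named above (Airflow, Luigi, Azkaban) all reduce to one core idea: run tasks in dependency order over a directed acyclic graph. A minimal sketch of that idea using Python's stdlib `graphlib`, with a hypothetical extract/transform/validate/load pipeline:

```python
from graphlib import TopologicalSorter

# Hypothetical DAG mirroring an Airflow-style pipeline:
# extract feeds both transform and validate; load waits on both.
dag = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields every task after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # e.g. ['extract', 'transform', 'validate', 'load']
```

Real orchestrators layer scheduling, retries, backfills, and per-task logging on top of exactly this ordering; `TopologicalSorter` also exposes an incremental API (`get_ready`/`done`) that lets independent tasks (here, transform and validate) run in parallel.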


Good to have:

  • Experience with stream-processing systems (e.g., Spark Streaming, Flink).
  • Working knowledge of message queuing, stream processing, and scalable data stores.
  • Proficiency in SQL and experience with NoSQL databases like Elasticsearch and Cassandra/MongoDB.
  • Experience with big data tools such as HDFS/S3, Spark/Flink, Hive, HBase, Kafka/Kinesis.

Read more
Remote only
0 - 1 yrs
₹1L - ₹1.8L / yr
skill icon.NET
SQL
SQL server
skill iconjQuery
LINQ
+3 more

Position: .NET Core Intern (.NET Core knowledge is a must)

Education: BTech-Computer Science Only

Joining: Immediate Joiner

Work Mode: Remote

Working Days: Monday to Friday

Shift: Rotational (based on project need):

·      5:00 PM – 2:00 AM IST

·      6:00 PM – 3:00 AM IST

 

Job Summary

ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience in building scalable web applications while maintaining a healthy work-life balance through remote work opportunities.

 

Eligibility & Qualifications

  • Education:
  • B.Tech (Computer Science) / M.Tech (Computer Science)
  • Tier 1 colleges preferred
  • Experience Level: Fresher
  • Communication: Excellent English communication skills (verbal & written)

Skills Required

1. Technical Skills (Must Have)

  • Experience with .NET Core (.NET 6 / 7 / 8)
  • Strong knowledge of C#, including:
  • Object-Oriented Programming (OOP) concepts
  • async/await
  • LINQ
  • ASP.NET Core (Web API / MVC)

2. Database Skills

  • SQL Server (preferred)
      • Writing complex SQL queries, joins, and subqueries
      • Stored Procedures, Functions, and Indexes
      • Database design and performance tuning
  • Entity Framework Core
      • Migrations and transaction handling
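The query skills above (complex queries, joins, subqueries) can be sketched with a self-contained example; SQLite via Python's standard library is used here only so the snippet runs anywhere (the posting targets SQL Server, whose T-SQL differs slightly), and the tables are hypothetical:

```python
import sqlite3

# Illustrative join + subquery; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0);
""")

# Join customers to orders, keeping only customers whose total spend
# exceeds the overall average order amount (correlated via a subquery).
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.amount) > (SELECT AVG(amount) FROM orders)
    ORDER BY total DESC
""").fetchall()
print(rows)
```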

3. Frontend Skills (Required)

  • JavaScript (ES5 / ES6+)
  • jQuery
  • DOM manipulation
  • AJAX calls
  • Event handling
  • HTML5 & CSS3
  • Client-side form validation

4. Security & Performance

  • Data validation and exception handling
  • Caching concepts (In-memory / Redis – good to have)
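The in-memory caching concept above can be sketched with Python's standard library (Redis would cover the distributed case); the function here is a hypothetical stand-in for a slow lookup:

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=128)
def expensive_lookup(key: str) -> str:
    # Stand-in for a slow database or API call; the decorator keeps
    # results in memory so repeated keys skip the expensive work.
    global call_count
    call_count += 1
    return key.upper()

expensive_lookup("invoice-42")
expensive_lookup("invoice-42")  # served from the cache, no second call
print(call_count)
```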

5. Tools & Environment

  • Visual Studio / VS Code
  • Git (GitHub / Azure DevOps)
  • Basic knowledge of server deployment

6. Good to Have (Optional)

  • Azure or AWS deployment experience
  • CI/CD pipelines
  • Docker
  • Experience with data handling

 

Work Environment & Tools

  • Comfortable working in a remote setup
  • Familiarity with collaboration and remote access tools

 

Additional Requirements (Work-from-Home Setup)

This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:

  • System: Laptop or Desktop (Windows-based)
  • Operating System: Windows
  • Screen Size: Minimum 14 inches
  • Screen Resolution: Full HD (1920 × 1080)
  • Processor: Intel i5 or higher
  • RAM: Minimum 8 GB (Mandatory)
  • Software: AnyDesk
  • Internet Speed: 100 Mbps or higher

 

About ARDEM

 

ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.

 

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune, Delhi, Kolkata, Bengaluru (Bangalore), Kochi (Cochin), Hosur, Trivandrum
7 - 9 yrs
₹5.5L - ₹20L / yr
.NET
Amazon Web Services (AWS)
C#
React.js
SQL

Job Description -

Profile: .Net Full Stack Lead

Experience Required: 7–12 Years

Location: Pune, Bangalore, Chennai, Coimbatore, Delhi, Hosur, Hyderabad, Kochi, Kolkata, Trivandrum

Work Mode: Hybrid

Shift: Normal Shift

Key Responsibilities:

  • Design, develop, and deploy scalable microservices using .NET Core and C#
  • Build and maintain serverless applications using AWS services (Lambda, SQS, SNS)
  • Develop RESTful APIs and integrate them with front-end applications
  • Work with both SQL and NoSQL databases to optimize data storage and retrieval
  • Implement Entity Framework for efficient database operations and ORM
  • Lead technical discussions and provide architectural guidance to the team
  • Write clean, maintainable, and testable code following best practices
  • Collaborate with cross-functional teams to deliver high-quality solutions
  • Participate in code reviews and mentor junior developers
  • Troubleshoot and resolve production issues in a timely manner
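As a rough sketch of the Lambda + SQS responsibility above, a handler consuming the documented SQS event shape (the `order_id` field and the sample payload are invented for illustration):

```python
import json

# Minimal AWS Lambda-style handler for an SQS event source.
# SQS delivers a "Records" list whose entries carry the message as a
# JSON string in "body"; the payload fields here are hypothetical.
def handler(event, context):
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        processed.append(body["order_id"])
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}

# Invoke locally with a hand-built event to show the flow.
sample_event = {"Records": [{"body": json.dumps({"order_id": 1})},
                            {"body": json.dumps({"order_id": 2})}]}
result = handler(sample_event, None)
print(result)
```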

Required Skills & Qualifications:

  • 7–12 years of hands-on experience in .NET development
  • Strong proficiency in .NET Framework, .NET Core, and C#
  • Proven expertise with AWS services (Lambda, SQS, SNS)
  • Solid understanding of SQL and NoSQL databases (SQL Server, MongoDB, DynamoDB, etc.)
  • Experience building and deploying Microservices architecture
  • Proficiency in Entity Framework or EF Core
  • Strong knowledge of RESTful API design and development
  • Experience with React or Angular is good to have
  • Understanding of CI/CD pipelines and DevOps practices
  • Strong debugging, performance optimization, and problem-solving skills
  • Experience with design patterns, SOLID principles, and best coding practices
  • Excellent communication and team leadership skills


Read more
Truetech solutions

Truetech solutions

Agency job
via TrueTech Solutions by Meimozhi balu
Bengaluru (Bangalore), Kochi (Cochin)
4 - 15 yrs
₹10L - ₹25L / yr
.NET
ASP.NET
Amazon Web Services (AWS)
Amazon EC2
AWS Lambda
+2 more

• Minimum 4+ years of experience

• Experience in designing, developing, and maintaining backend services using C# 12 and .NET 8 or .NET 9

• Experience in building and operating cloud native and serverless applications on AWS

• Experience in developing and integrating services using AWS Lambda, API Gateway, DynamoDB, EventBridge, CloudWatch, SQS, SNS, Kinesis, Secrets Manager, S3 storage, serverless architectural models, etc.

• Experience in integrating services using the AWS SDK

• Should be cognizant of the OMS paradigms including Inventory Management, Inventory publish, supply feed processing, control mechanisms, ATP publish, Order Orchestration, workflow set up and customizations, integrations with tax, AVS, payment engines, sourcing algorithms and managing reservations with back orders, schedule mechanisms, flash sales management etc.

• Should have a decent end-to-end knowledge of various Commerce subsystems, which include Storefront, Core Commerce back end, Post Purchase processing, OMS, Store / Warehouse Management processes, and Supply Chain and Logistics processes. This is to ascertain the candidate's know-how of the overall Retail landscape of any customer.

• Strong knowledge of querying in Oracle DB and SQL Server

• Able to read, write, and manage PL/SQL procedures in Oracle.

• Strong debugging, performance tuning, and problem-solving skills

• Experience with event-driven and microservices architectures

Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹50L - ₹70L / yr
Java
Microservices
CI/CD
MySQL
MySQL DBA
+9 more

Job Details

- Job Title: Staff Engineer

Industry: Technology

Domain - Information technology (IT)

Experience Required: 9-12 years

Employment Type: Full Time

Job Location: Bengaluru

CTC Range: Best in Industry

 

Role & Responsibilities

As a Staff Engineer at company, you will play a critical role in defining and driving our backend architecture as we scale globally. You’ll own key systems that handle high volumes of data and transactions, ensuring performance, reliability, and maintainability across distributed environments.

 

Key Responsibilities-

  • Own one or more core applications end-to-end, ensuring reliability, performance, and scalability.
  • Lead the design, architecture, and development of complex, distributed systems, frameworks, and libraries aligned with company’s technical strategy.
  • Drive engineering operational excellence by defining robust roadmaps for system reliability, observability, and performance improvements.
  • Analyze and optimize existing systems for latency, throughput, and efficiency, ensuring they perform at scale.
  • Collaborate cross-functionally with Product, Data, and Infrastructure teams to translate business requirements into technical deliverables.
  • Mentor and guide engineers, fostering a culture of technical excellence, ownership, and continuous learning.
  • Establish and uphold coding standards, conduct design and code reviews, and promote best practices across teams.
  • Stay ahead of the curve on emerging technologies, frameworks, and patterns to strengthen company’s technology foundation.
  • Contribute to hiring by identifying and attracting top-tier engineering talent.

 

Ideal Candidate

  • Strong staff engineer profile
  • Must have 9+ years in backend engineering with Java, Spring/Spring Boot, and microservices, building large, scalable systems
  • Must have been SDE-3 / Tech Lead / Lead SE for at least 2.5 years
  • Strong in DSA, system design, design patterns, and problem-solving
  • Proven experience building scalable, reliable, high-performance distributed systems
  • Hands-on with SQL/NoSQL databases, REST/gRPC APIs, concurrency & async processing
  • Experience in AWS/GCP, CI/CD pipelines, and observability/monitoring
  • Excellent ability to explain complex technical concepts to varied stakeholders
  • Product companies (B2B SaaS preferred)
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in Computer Science from Tier 1 / Tier 2 colleges


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Mumbai
2 - 6 yrs
₹2L - ₹8L / yr
Linux/Unix
Linux administration
Apache
Apache Tomcat
JBoss
+6 more

Job Title : System Support Engineer – L1

Experience : 2.5+ Years

Location : Mumbai (Powai)

Shift : Rotational


Role Summary :

Provide first-level technical and functional support for enterprise applications and infrastructure. Handle user issues, troubleshoot systems, and ensure timely resolution while following support processes.


Key Responsibilities :

  • Provide phone/email support and own user issues end-to-end.
  • Log, track, and update tickets in Jira/Freshdesk.
  • Troubleshoot Linux/UNIX systems, web servers, and databases.
  • Escalate unresolved issues and communicate during downtimes.
  • Create knowledge base articles and support documentation.


Mandatory Skills :

Linux/UNIX administration, Apache/Tomcat/JBoss, basic SQL databases (MySQL/SQL Server/Oracle), scripting knowledge, and ticketing tools experience.


Preferred :

  • Banking/Financial Services domain exposure and client-site support experience.
  • Strong communication skills, customer-focused mindset, and willingness to work in rotational shifts are essential.
Read more
Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Kochi (Cochin), Chennai, Thiruvananthapuram
5 - 7 yrs
₹19L - ₹28L / yr
Java
Spring Boot
Microservices
Architecture
Google Cloud Platform (GCP)
+22 more

Job Details

- Job Title: Lead I - Software Engineering - Java, Spring Boot, Microservices

- Industry: Global digital transformation solutions provider

- Domain - Information technology (IT)

- Experience Required: 5-7 years

- Employment Type: Full Time

- Job Location: Trivandrum, Chennai, Kochi, Thiruvananthapuram

- CTC Range: Best in Industry

 

Job Description

Job Title: Senior Java Developer

Experience: 5+ years

Job Summary:

We are looking for a Senior Java Developer with strong experience in Spring Boot and Microservices to work on high-performance applications for a leading financial services client. The ideal candidate will have deep expertise in Java backend development, cloud (preferably GCP), and strong problem-solving abilities.

 

Key Responsibilities:

• Develop and maintain Java-based microservices using Spring Boot

• Collaborate with Product Owners and teams to gather and review requirements

• Participate in design reviews, code reviews, and unit testing

• Ensure application performance, scalability, and security

• Contribute to solution architecture and design documentation

• Support Agile development processes including daily stand-ups and sprint planning

• Mentor junior developers and lead small modules or features

 

Required Skills:

• Java, Spring Boot, Microservices architecture

• GCP (or other cloud platforms like AWS)

• REST/SOAP APIs, Hibernate, SQL, Tomcat

• CI/CD tools: Jenkins, Bitbucket

• Agile methodologies (Scrum/Kanban)

• Unit testing (JUnit), debugging and troubleshooting

• Good communication and team leadership skills

 

Preferred Skills:

• Frontend familiarity (Angular, AJAX)

• Experience with API documentation tools (Swagger)

• Understanding of design patterns and UML

• Exposure to Confluence, Jira

 

Mandatory Skills Required:

Strong proficiency in Java, Spring Boot, Microservices, GCP/AWS.

Experience Required: Minimum 5+ years of relevant experience

Java/J2EE (5+ years), Spring/Spring Boot (5+ years), Microservices (5+ years), AWS/GCP/Azure (mandatory), CI/CD (Jenkins, SonarQube, Git)


 

******

Notice period - 0 to 15 days only (immediate joiners who can join by Feb)

Job stability is mandatory

Location: Trivandrum, Kochi, Chennai

Virtual Interview - 14th Feb 2026

Read more
suntekai
Kushi A
Posted by Kushi A
Remote only
0 - 1 yrs
₹10000 - ₹12000 / mo
Python
PostgreSQL
Data Visualization
Business Intelligence (BI)
SQL
+2 more

Job Description: Data Analyst


About the Role

We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.

Key Responsibilities

  • Data Extraction & Management
      • Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.
      • Ensure accuracy, reliability, and consistency of data across different platforms.
  • Data Analysis & Insights
      • Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.
      • Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.
      • Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.
  • Business Intelligence & Visualization
      • Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).
      • Create visualizations that simplify complex datasets for stakeholders and management.
  • Python (Pandas)
      • Use Python (Pandas, NumPy) for advanced analytics.
  • Collaboration & Stakeholder Management
      • Work closely with product, operations, and leadership teams to provide insights that drive decision-making.
      • Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.

Required Skills

  • SQL/PostgreSQL
      • Complex joins, window functions, CTEs, aggregations, query optimization.
  • Python (Pandas & Analytics)
      • Data wrangling, cleaning, transformations, exploratory data analysis (EDA).
      • Libraries: Pandas, NumPy, Matplotlib, Seaborn
  • Data Visualization & BI Tools
      • Expertise in creating dashboards and reports using Metabase or Looker.
      • Ability to translate raw data into meaningful visual insights.
  • Business Intelligence
      • Strong analytical reasoning to connect data insights with e-commerce KPIs.
      • Experience in funnel analysis, customer journey mapping, and retention analysis.
  • Analytics & E-commerce Knowledge
      • Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.
  • General Skills
      • Strong communication and presentation skills.
      • Ability to work cross-functionally in fast-paced environments.
      • Problem-solving mindset with attention to detail.
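The SQL skills listed above (CTEs, window functions, aggregations) can be sketched in runnable form; SQLite via Python's standard library stands in for PostgreSQL here, and the orders table is invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('a', '2024-01-05', 100.0),
        ('a', '2024-02-10', 150.0),
        ('b', '2024-01-20', 80.0);
""")

# CTE + window function: each customer's orders with a running total,
# the building block behind cohort and LTV-style queries.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer, order_date, amount,
               SUM(amount) OVER (
                   PARTITION BY customer ORDER BY order_date
               ) AS running_total
        FROM orders
    )
    SELECT customer, running_total FROM ranked
    ORDER BY customer, order_date
""").fetchall()
print(rows)
```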



Education: Bachelor’s degree in Data Science, Computer Science, or a related data-processing field




Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
2 - 5 yrs
₹4L - ₹5L / yr
DevOps
Windows Azure
CI/CD
MySQL
Python
+12 more

JOB DETAILS:

* Job Title: DevOps Engineer (Azure)

* Industry: Technology

* Salary: Best in Industry

* Experience: 2-5 years

* Location: Bengaluru, Koramangala

Review Criteria

  • Strong Azure DevOps Engineer Profiles.
  • Must have minimum 2+ years of hands-on experience as an Azure DevOps Engineer with strong exposure to Azure DevOps Services (Repos, Pipelines, Boards, Artifacts).
  • Must have strong experience in designing and maintaining YAML-based CI/CD pipelines, including end-to-end automation of build, test, and deployment workflows.
  • Must have hands-on scripting and automation experience using Bash, Python, and/or PowerShell
  • Must have working knowledge of databases such as Microsoft SQL Server, PostgreSQL, or Oracle Database
  • Must have experience with monitoring, alerting, and incident management using tools like Grafana, Prometheus, Datadog, or CloudWatch, including troubleshooting and root cause analysis

 

Preferred

  • Knowledge of containerisation and orchestration tools such as Docker and Kubernetes.
  • Knowledge of Infrastructure as Code and configuration management tools such as Terraform and Ansible.
  • Preferred (Education) – BE/BTech / ME/MTech in Computer Science or related discipline

 

Role & Responsibilities

  • Build and maintain Azure DevOps YAML-based CI/CD pipelines for build, test, and deployments.
  • Manage Azure DevOps Repos, Pipelines, Boards, and Artifacts.
  • Implement Git branching strategies and automate release workflows.
  • Develop scripts using Bash, Python, or PowerShell for DevOps automation.
  • Monitor systems using Grafana, Prometheus, Datadog, or CloudWatch and handle incidents.
  • Collaborate with dev and QA teams in an Agile/Scrum environment.
  • Maintain documentation, runbooks, and participate in root cause analysis.
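A minimal sketch of an Azure DevOps YAML pipeline of the kind described above; the trigger branch, stage, and script names are illustrative assumptions, not taken from any real project:

```yaml
# azure-pipelines.yml (minimal build/test sketch; names are hypothetical)
trigger:
  branches:
    include: [main]

pool:
  vmImage: 'ubuntu-latest'

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - script: ./build.sh        # hypothetical build script
            displayName: 'Build'
          - script: ./run_tests.sh    # hypothetical test script
            displayName: 'Test'
```

Real pipelines in this role would add deployment stages, environment approvals, and artifact publishing on top of this skeleton.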

 

Ideal Candidate

  • 2–5 years of experience as an Azure DevOps Engineer.
  • Strong hands-on experience with Azure DevOps CI/CD (YAML) and Git.
  • Experience with Microsoft Azure (OCI/AWS exposure is a plus).
  • Working knowledge of SQL Server, PostgreSQL, or Oracle.
  • Good scripting, troubleshooting, and communication skills.
  • Bonus: Docker, Kubernetes, Terraform, Ansible experience.
  • Comfortable with WFO (Koramangala, Bangalore).


Read more
Reliable Group

at Reliable Group

2 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
10yrs+
Up to ₹42L / yr (varies)
.NET
.NET Compact Framework
SQL
Windows Azure
CI/CD
+5 more

Application Architect – .NET

Role Overview

We are looking for a senior, hands-on Application Architect with deep .NET experience who can fix and modernize our current systems and build a strong engineering team over time.

Important – This role is hands-on with an architectural mindset. The person should be comfortable working with legacy systems and able to make and explain tradeoffs.


Key Responsibilities

Application Architecture & Modernization

  • Own application architecture across legacy .NET Framework and modern .NET systems
  • Review the existing application, and drive an incremental modernization approach along with new feature development as per business growth of the company.
  • Own the gradual move away from outdated patterns (Web Forms, tightly coupled MVC, legacy UI constructs)
  • Define clean API contracts between front-end and backend services
  • Identify and resolve performance bottlenecks across code and database layers
  • Improve data access patterns, caching strategies, and system responsiveness
  • Strong proponent of AI and has extensively used AI tools such as GitHub Copilot, Cursor, Windsurf, Codex, etc.


Backend, APIs & Integrations

  • Design scalable backend services and APIs
  • Improve how newer .NET services interact with legacy systems
  • Lead integrations with external systems, including Zoho
  • Prior experience integrating with Zoho (CRM, Finance, or other modules) is a strong value add
  • Experience designing and implementing integrations using EDI standards


Data & Schema Design

  • Review existing database schemas and core data structures
  • Redesign data models to support growth, and reporting/analytics requirements
  • Optimize SQL queries to reduce the load on query execution and the DB engine


Cloud Awareness

  • Design applications with cloud deployment in mind (primarily Azure)
  • Understand how to use Azure services to improve security, scalability, and availability
  • Work with Cloud and DevOps teams to ensure application architecture aligns with cloud best practices
  • Push for CI/CD automation so that the team pushes code regularly and makes progress.


Team Leadership & Best Practices

  • Act as a technical leader and mentor for the engineering team
  • Help hire, onboard, and grow a team under this role over time.
  • Define KPIs and engineering best practices (including focus on documentation)
  • Set coding standards, architectural guidelines, and review practices
  • Improve testability and long-term health of the codebase
  • Raise the overall engineering bar through reviews, coaching, and clear standards
  • Create a culture of ownership and quality


Cross-Platform Thinking

  • Strong communicator who can convert complex tech topics into business-friendly lingo. Understands the business needs and importance of user experience
  • While .NET is the core stack, contribute to architecture decisions across platforms
  • Leverages AI tools to accelerate design, coding, reviews, and troubleshooting while maintaining high quality


Skills and Experience

  • 12+ years of hands-on experience in application development (preferably on .NET stack)
  • Experience leading technical direction while remaining hands-on
  • Deep expertise in .NET Framework (4.x) and modern .NET (.NET Core / .NET 6+)
  • Must have led a project to modernize a legacy system – preferably moving from .NET Framework to .NET Core.
  • Experience with MVC, Web Forms, and legacy UI patterns
  • Solid backend and API design experience
  • Strong understanding of database design and schema evolution
  • Understanding of Analytical systems – OLAP, Data warehousing, data lakes.
  • Strong proponent of AI and has extensively used AI tools such as GitHub Copilot, Cursor, Windsurf, Codex, etc.
  • Integration with Zoho would be a plus.
Read more
Cansvolution
Pooja Rawat
Posted by Pooja Rawat
Indore
2 - 5 yrs
₹5L - ₹12L / yr
.NET
Angular (2+)
React.js
ASP.NET
SQL
+4 more

About Cansvolution

Cansvolution is a growing IT services and product-based company based in Indore, M.P. We work with clients across industries, delivering scalable web and digital solutions. Our team focuses on innovation, practical problem-solving, and building technology that creates real business impact. We offer a collaborative work culture, hands-on learning, and strong growth opportunities for our employees.


Position: .NET Developer

Experience Required: Minimum 2+ Years

Location: Indore (Work From Office)

Joining: Immediate joiners preferred


Key Responsibilities

Design, develop, and maintain web applications using .NET technologies

Work on front-end development using React JS or Angular

Build and consume RESTful APIs

Collaborate with cross-functional teams including designers and backend developers

Debug, troubleshoot, and improve application performance

Participate in code reviews and follow best development practices


Required Skills

Strong experience in ASP.NET / .NET Core

Hands-on expertise in React JS or Angular

Good understanding of HTML, CSS, JavaScript

Experience with SQL databases

Knowledge of API integration

Understanding of software development lifecycle.


Preferred Skills

Experience working in Agile environments

Knowledge of version control tools like Git

Strong analytical and problem-solving abilities

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Pune
3 - 8 yrs
₹12L - ₹25L / yr
Large Language Models (LLM)
Retrieval Augmented Generation (RAG)
Artificial Intelligence (AI)
Machine Learning (ML)
Software Testing (QA)
+9 more

Job Title : QA Lead (AI/ML Products)

Employment Type : Full Time

Experience : 4 to 8 Years

Location : On-site

Mandatory Skills : Strong hands-on experience in testing AI/ML (LLM, RAG) applications with deep expertise in API testing, SQL/NoSQL database validation, and advanced backend functional testing.


Role Overview :

We are looking for an experienced QA Lead who can own end-to-end quality for AI-influenced products and backend-heavy systems. This role requires strong expertise in advanced functional testing, API validation, database verification, and AI model behavior testing in non-deterministic environments.


Key Responsibilities :

  • Define and implement comprehensive test strategies aligned with business and regulatory goals.
  • Validate AI/ML and LLM-driven applications, including RAG pipelines, hallucination checks, prompt injection scenarios, and model response validation.
  • Perform deep API testing using Postman/cURL and validate JSON/XML payloads.
  • Execute complex SQL queries (MySQL/PostgreSQL) and work with MongoDB for backend and data integrity validation.
  • Analyze server logs and transactional flows to debug issues and ensure system reliability.
  • Conduct risk analysis and report key QA metrics such as defect leakage and release readiness.
  • Establish and refine QA processes, templates, standards, and agile testing practices.
  • Identify performance bottlenecks and basic security vulnerabilities (e.g., IDOR, data exposure).
  • Collaborate closely with developers, product managers, and domain experts to translate business requirements into testable scenarios.
  • Own feature quality independently from conception to release.

Required Skills & Experience :

  • 4+ years of hands-on experience in software testing and QA.
  • Strong understanding of testing AI/ML products, LLM validation, and non-deterministic behavior testing.
  • Expertise in API Testing, server log analysis, and backend validation.
  • Proficiency in SQL (MySQL/PostgreSQL) and MongoDB.
  • Deep knowledge of SDLC and Bug Life Cycle.
  • Strong problem-solving ability and structured approach to ambiguous scenarios.
  • Awareness of performance testing and basic security testing practices.
  • Excellent communication skills to articulate defects and QA strategies.
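As a small illustration of the JSON payload validation called out above; the sample payload, field names, and allowed status values are invented, and a real test would exercise a live endpoint via Postman or cURL:

```python
import json

# Simulated API response body (in practice this comes from an HTTP call).
raw_body = '{"order_id": 42, "status": "shipped", "items": [{"sku": "A1", "qty": 2}]}'

payload = json.loads(raw_body)

# Basic structural checks a QA engineer might assert on a JSON payload.
errors = []
if not isinstance(payload.get("order_id"), int):
    errors.append("order_id must be an integer")
if payload.get("status") not in {"pending", "shipped", "delivered"}:
    errors.append("status has an unexpected value")
if not payload.get("items"):
    errors.append("items must be a non-empty list")

print(errors)
```

The same checks generalize to schema-driven validation when payloads grow; the point is verifying structure and value domains, not just HTTP status codes.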

What We’re Looking For :

A proactive QA professional who can go beyond UI testing, understands backend systems deeply, and can confidently test modern AI-driven applications while driving quality standards across the team.

Read more
Auxo AI
Kritika Dhingra
Posted by Kritika Dhingra
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
2 - 8 yrs
₹10L - ₹30L / yr
Amazon Web Services (AWS)
Data Transformation Tool (DBT)
SQL
Python
Spark
+1 more

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.


Location : Bangalore, Hyderabad, Mumbai, and Gurgaon


Responsibilities:

· Designing, building, and operating scalable on-premises or cloud data architecture

· Analyzing business requirements and translating them into technical specifications

· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)

· Design, develop, and maintain scalable data pipelines and ETL processes

· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.

· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness

· Implement data governance and security best practices to ensure compliance and data integrity

· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring

· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.


Requirements


· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines

· Experience working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks

· Comprehensive understanding of the Snowflake and Databricks ecosystem

· Strong programming skills in languages like SQL and Python or PySpark.

· Experience with data modeling, ETL processes, and data warehousing concepts.

· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.


Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 8 yrs
₹26L - ₹35L / yr
Python
Java
SQL
FastAPI
Django
+5 more

Review Criteria

  • Strong Senior Backend Engineer profiles
  • Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
  • Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js).
  • Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
  • Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
  • Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
  • Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
  • (Company) – Must have worked in product companies / startups, preferably Series A to Series D
  • (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred

 

Role & Responsibilities

As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.

This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

 

Key Responsibilities-

  • Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
  • Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
  • Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
  • Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
  • Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
  • Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs

 

The Impact You'll Make-

  • Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
  • Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
  • Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
  • Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
  • Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.

 

 

Global digital transformation solutions provider


Agency job
via Peak Hire Solutions by Dhara Thakkar
Trivandrum, Thiruvananthapuram
5 - 7 yrs
₹18L - ₹26L / yr
Kotlin
Java
Amazon Web Services (AWS)
Spring Boot
Microservices
+24 more

JOB DETAILS:

* Job Title: Lead I - Software Engineering (Kotlin, Java, Spring Boot, AWS)

* Industry: Global digital transformation solutions provider

* Salary: Best in Industry

* Experience: 5 - 7 years

* Location: Trivandrum, Thiruvananthapuram

Role Proficiency:

Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' development activities.

 

Skill Examples:

  1. Explain and communicate the design / development to the customer
  2. Perform and evaluate test results against product specifications
  3. Break down complex problems into logical components
  4. Develop user interfaces and business software components
  5. Use data models
  6. Estimate time and effort required for developing / debugging features / components
  7. Perform and evaluate tests in the customer or target environment
  8. Make quick decisions on technical/project-related challenges
  9. Manage a team, mentor members, and handle people-related issues in the team
  10. Maintain high motivation levels and positive dynamics in the team
  11. Interface with other teams, designers, and other parallel practices
  12. Set goals for self and team; provide feedback to team members
  13. Create and articulate impactful technical presentations
  14. Follow a high level of business etiquette in emails and other business communication
  15. Drive conference calls with customers, addressing customer questions
  16. Proactively ask for and offer help
  17. Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
  18. Build confidence with customers by meeting deliverables on time with quality
  19. Estimate the time, effort, and resources required for developing / debugging features / components
  20. Make appropriate utilization of software and hardware
  21. Strong analytical and problem-solving abilities

 

Knowledge Examples:

  1. Appropriate software programs / modules
  2. Functional and technical designing
  3. Programming languages – proficient in multiple skill clusters
  4. DBMS
  5. Operating systems and software platforms
  6. Software Development Life Cycle
  7. Agile – Scrum or Kanban methods
  8. Integrated development environments (IDE)
  9. Rapid application development (RAD)
  10. Modelling technology and languages
  11. Interface definition languages (IDL)
  12. Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

 

Additional Comments:

We are seeking an experienced Senior Backend Engineer with strong expertise in Kotlin and Java to join our dynamic engineering team.

The ideal candidate will have a deep understanding of backend frameworks, cloud technologies, and scalable microservices architectures, with a passion for clean code, resilience, and system observability.

You will play a critical role in designing, developing, and maintaining core backend services that power our high-availability e-commerce and promotion platforms.

 

Key Responsibilities

Design, develop, and maintain backend services using Kotlin (JVM, Coroutines, Serialization) and Java.

Build robust microservices with Spring Boot and related Spring ecosystem components (Spring Cloud, Spring Security, Spring Kafka, Spring Data).

Implement efficient serialization/deserialization using Jackson and Kotlin Serialization.

Develop, maintain, and execute automated tests using JUnit 5, Mockk, and ArchUnit to ensure code quality.

Work with Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB, and Redis for data storage and caching needs.

Deploy and manage services in AWS environments leveraging DynamoDB, Lambdas, and IAM.

Implement CI/CD pipelines with GitLab CI to automate build, test, and deployment processes.

Containerize applications using Docker and integrate monitoring using Datadog for tracing, metrics, and dashboards.

Define and maintain infrastructure as code using Terraform for services including GitLab, Datadog, Kafka, and Optimizely.

Develop and maintain RESTful APIs with OpenAPI (Swagger) and JSON API standards.
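For context on that responsibility, a minimal OpenAPI 3.0 fragment of the kind such services publish via Swagger tooling; the service name, path, and fields below are invented for illustration only.

```yaml
openapi: 3.0.3
info:
  title: Promotion Service API   # hypothetical service name
  version: 1.0.0
paths:
  /promotions/{id}:
    get:
      summary: Fetch a promotion by id
      parameters:
        - name: id
          in: path
          required: true
          schema: { type: string }
      responses:
        "200":
          description: The requested promotion
          content:
            application/json:
              schema:
                type: object
                properties:
                  id: { type: string }
                  name: { type: string }
```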

Apply resilience patterns using Resilience4j to build fault-tolerant systems.
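As an illustration of the resilience patterns mentioned above, here is a minimal retry-with-exponential-backoff sketch using only the JDK. Resilience4j packages this pattern (alongside circuit breakers, rate limiters, and bulkheads) as composable decorators, so this stands in only for the idea, not the library's API.

```java
import java.util.function.IntFunction;

public class RetrySketch {
    // Invoke `call` up to maxAttempts times, doubling the delay after each failure.
    public static <T> T retry(int maxAttempts, long initialDelayMs, IntFunction<T> call) {
        long delay = initialDelayMs;
        RuntimeException last = null;
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            try {
                return call.apply(attempt);
            } catch (RuntimeException e) {
                last = e;
                try {
                    Thread.sleep(delay); // back off before the next attempt
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    throw new IllegalStateException(ie);
                }
                delay *= 2; // exponential backoff
            }
        }
        if (last == null) throw new IllegalArgumentException("maxAttempts must be > 0");
        throw last;
    }
}
```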

Adhere to architectural and design principles such as Domain-Driven Design (DDD), Object-Oriented Programming (OOP), and Contract Testing (Pact).

Collaborate with cross-functional teams in an Agile Scrum environment to deliver high-quality features.

Utilize feature flagging tools like Optimizely to enable controlled rollouts.

 

Mandatory Skills & Technologies

Languages: Kotlin (JVM, Coroutines, Serialization), Java

Frameworks: Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data)

Serialization: Jackson, Kotlin Serialization

Testing: JUnit 5, Mockk, ArchUnit

Data: Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB (NoSQL), Redis (caching)

Cloud: AWS (DynamoDB, Lambda, IAM)

CI/CD: GitLab CI

Containers: Docker

Monitoring & Observability: Datadog (Tracing, Metrics, Dashboards, Monitors)

Infrastructure as Code: Terraform (GitLab, Datadog, Kafka, Optimizely)

API: OpenAPI (Swagger), REST API, JSON API

Resilience: Resilience4j

Architecture & Practices: Domain-Driven Design (DDD), Object-Oriented Programming (OOP), Contract Testing (Pact), Feature Flags (Optimizely)

Platforms: E-Commerce Platform (CommerceTools), Promotion Engine (Talon.One)

Methodologies: Scrum, Agile

 

Skills: Kotlin, Java, Spring Boot, AWS

 

Must-Haves

Kotlin (JVM, Coroutines, Serialization), Java, Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data), AWS (DynamoDB, Lambda, IAM), Microservices Architecture

 

 

******

Notice period - 0 to 15 days only

Job stability is mandatory

Location: Trivandrum

Virtual Weekend Interview on 7th Feb 2026 - Saturday
