Review Criteria
- Strong Senior Backend Engineer profiles
- Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
- Must have strong backend development experience using one or more frameworks (FastAPI or Django for Python, Spring for Java, Express for Node.js)
- Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
- Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
- Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
- Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
- (Company) – Must have worked in product companies / startups, preferably Series A to Series D
- (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred
Role & Responsibilities
As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.
This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.
Key Responsibilities:
- Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
- Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
- Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
- Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
- Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
- Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs
The Impact You'll Make:
- Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
- Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
- Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
- Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
- Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.
JOB DETAILS:
* Job Title: Lead I - Software Engineering - Kotlin, Java, Spring Boot, AWS
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-7 years
* Location: Trivandrum, Thiruvananthapuram
Role Proficiency:
Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' development activities.
Skill Examples:
- Explain and communicate the design / development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing / debugging features / components
- Perform and evaluate test in the customer or target environment
- Make quick decisions on technical/project related challenges
- Manage a team, mentor members, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team. Provide feedback to team members
- Create and articulate impactful technical presentations
- Follow high level of business etiquette in emails and other business communication
- Drive conference calls with customers addressing customer questions
- Proactively ask for and offer help
- Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting the deliverables on time with quality.
- Make appropriate utilization of software and hardware
- Strong analytical and problem-solving abilities
Knowledge Examples:
- Appropriate software programs / modules
- Functional and technical designing
- Programming languages – proficient in multiple skill clusters
- DBMS
- Operating Systems and software platforms
- Software Development Life Cycle
- Agile – Scrum or Kanban Methods
- Integrated development environment (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved
Additional Comments:
We are seeking an experienced Senior Backend Engineer with strong expertise in Kotlin and Java to join our dynamic engineering team.
The ideal candidate will have a deep understanding of backend frameworks, cloud technologies, and scalable microservices architectures, with a passion for clean code, resilience, and system observability.
You will play a critical role in designing, developing, and maintaining core backend services that power our high-availability e-commerce and promotion platforms.
Key Responsibilities
Design, develop, and maintain backend services using Kotlin (JVM, Coroutines, Serialization) and Java.
Build robust microservices with Spring Boot and related Spring ecosystem components (Spring Cloud, Spring Security, Spring Kafka, Spring Data).
Implement efficient serialization/deserialization using Jackson and Kotlin Serialization.
Develop, maintain, and execute automated tests using JUnit 5, MockK, and ArchUnit to ensure code quality.
Work with Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB, and Redis for data storage and caching needs.
Deploy and manage services in an AWS environment leveraging DynamoDB, Lambdas, and IAM.
Implement CI/CD pipelines with GitLab CI to automate build, test, and deployment processes.
Containerize applications using Docker and integrate monitoring using Datadog for tracing, metrics, and dashboards.
Define and maintain infrastructure as code using Terraform for services including GitLab, Datadog, Kafka, and Optimizely.
Develop and maintain RESTful APIs with OpenAPI (Swagger) and JSON API standards.
Apply resilience patterns using Resilience4j to build fault-tolerant systems.
Adhere to architectural and design principles such as Domain-Driven Design (DDD), Object-Oriented Programming (OOP), and Contract Testing (Pact).
Collaborate with cross-functional teams in an Agile Scrum environment to deliver high-quality features.
Utilize feature flagging tools like Optimizely to enable controlled rollouts.
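The resilience patterns this role calls for (Resilience4j on the JVM) center on retries, backoff, and circuit breaking. A language-neutral sketch of retry with exponential backoff, written in Python for brevity; the decorator, function names, and delays are illustrative and are not Resilience4j's API:

```python
# Minimal retry-with-exponential-backoff decorator, illustrating the kind of
# fault-tolerance pattern Resilience4j provides out of the box for JVM services.
import functools
import time

def retry(max_attempts: int = 3, base_delay: float = 0.01):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise  # exhausted: surface the failure
                    time.sleep(base_delay * 2 ** (attempt - 1))  # backoff
        return wrapper
    return decorator

calls = {"n": 0}

@retry(max_attempts=3)
def flaky():
    # Simulates a downstream dependency that fails twice, then recovers.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(flaky())  # succeeds on the third attempt: "ok"
```

In a real service the retry would be combined with a circuit breaker so that a persistently failing dependency stops being called at all.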
Mandatory Skills & Technologies
Languages: Kotlin (JVM, Coroutines, Serialization), Java
Frameworks: Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data)
Serialization: Jackson, Kotlin Serialization
Testing: JUnit 5, MockK, ArchUnit
Data: Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB (NoSQL), Redis (caching)
Cloud: AWS (DynamoDB, Lambda, IAM)
CI/CD: GitLab CI
Containers: Docker
Monitoring & Observability: Datadog (tracing, metrics, dashboards, monitors)
Infrastructure as Code: Terraform (GitLab, Datadog, Kafka, Optimizely)
API: OpenAPI (Swagger), REST API, JSON API
Resilience: Resilience4j
Architecture & Practices: Domain-Driven Design (DDD), Object-Oriented Programming (OOP), Contract Testing (Pact), Feature Flags (Optimizely)
Platforms: E-Commerce Platform (commercetools), Promotion Engine (Talon.One)
Methodologies: Scrum, Agile
Skills: Kotlin, Java, Spring Boot, AWS
Must-Haves
Kotlin (JVM, Coroutines, Serialization), Java, Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data), AWS (DynamoDB, Lambda, IAM), Microservices Architecture
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Trivandrum
Virtual Weekend Interview on 7th Feb 2026 - Saturday

🚀 Hiring: Associate Tech Architect / Senior Tech Specialist
🌍 Remote | Contract Opportunity
We’re looking for a seasoned tech professional who can lead the design and implementation of cloud-native data and platform solutions. This is a remote, contract-based role for someone with strong ownership and architecture experience.
🔴 Mandatory & Most Important Skill Set
Hands-on expertise in the following technologies is essential:
✅ AWS – Cloud architecture & services
✅ Python – Backend & data engineering
✅ Terraform – Infrastructure as Code
✅ Airflow – Workflow orchestration
✅ SQL – Data processing & querying
✅ DBT – Data transformation & modeling
💼 Key Responsibilities
- Architect and build scalable AWS-based data platforms
- Design and manage ETL/ELT pipelines
- Orchestrate workflows using Airflow
- Implement cloud infrastructure using Terraform
- Lead best practices in data architecture, performance, and scalability
- Collaborate with engineering teams and provide technical leadership
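Airflow, listed among the mandatory skills above, orchestrates workflows as a DAG of tasks executed in dependency order. A toy sketch of that core idea using only the standard library; the task names are invented, and a real pipeline would be declared with Airflow's DAG and operator classes:

```python
# Toy illustration of the DAG scheduling idea behind Airflow: each task
# declares its upstream dependencies, and the scheduler runs tasks in
# topological order. Task names here are made up.
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

Airflow layers scheduling, retries, and backfills on top of exactly this dependency-ordering model.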
🎯 Ideal Profile
✔ Strong experience in cloud and data platform architecture
✔ Ability to take end-to-end technical ownership
✔ Comfortable working in a remote, distributed team environment
📄 Role Type: Contract
🌍 Work Mode: 100% Remote
If you have deep expertise in these core technologies and are ready to take on a high-impact architecture role, we’d love to hear from you.
JOB DETAILS:
* Job Title: Associate III - Azure Data Engineer
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description: Azure Data Engineer (4–6 Years Experience)
Job Type: Full-time
Locations: Kochi, Trivandrum
Must-Have Skills
Azure & Data Engineering
- Azure Data Factory (ADF)
- Azure Databricks (PySpark)
- Azure Synapse Analytics
- Azure Data Lake Storage Gen2
- Azure SQL Database
Programming & Querying
- Python (PySpark)
- SQL / Spark SQL
Data Modelling
- Star & Snowflake schema
- Dimensional modelling
Source Systems
- SQL Server
- Oracle
- SAP
- REST APIs
- Flat files (CSV, JSON, XML)
CI/CD & Version Control
- Git
- Azure DevOps / GitHub Actions
Monitoring & Scheduling
- ADF triggers
- Databricks jobs
- Log Analytics
Security
- Managed Identity
- Azure Key Vault
- Azure RBAC / Access Control
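The star and snowflake schemas listed under data modelling pair a central fact table with dimension tables. A minimal, self-contained illustration using sqlite3 with invented table names; in this role the same shape would live in Synapse or Databricks rather than SQLite:

```python
# Tiny star-schema example: a fact table joined to a dimension table and
# aggregated, the shape of model "dimensional modelling" refers to.
# Table and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                         product_id INTEGER REFERENCES dim_product(product_id),
                         amount REAL);
INSERT INTO dim_product VALUES (1, 'phone'), (2, 'laptop');
INSERT INTO fact_sales VALUES (1, 1, 300.0), (2, 1, 200.0), (3, 2, 900.0);
""")

# Classic star-schema query: join fact to dimension, group by an attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('laptop', 900.0), ('phone', 500.0)]
```

A snowflake schema further normalizes the dimensions (e.g. `dim_product` referencing a separate `dim_category`), trading join count for storage.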
Soft Skills
- Strong analytical & problem-solving skills
- Good communication and collaboration
- Ability to work in Agile/Scrum environments
- Self-driven and proactive
Good-to-Have Skills
- Power BI basics
- Delta Live Tables
- Synapse Pipelines
- Real-time processing (Event Hub / Stream Analytics)
- Infrastructure as Code (Terraform / ARM templates)
- Data governance tools like Azure Purview
- Azure Data Engineer Associate (DP-203) certification
Educational Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Skills: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake Storage
Must-Haves
Azure Data Factory (4-6 years), Azure Databricks/PySpark (4-6 years), Azure Synapse Analytics (4-6 years), SQL/Spark SQL (4-6 years), Git/Azure DevOps (4-6 years)
Skills: Azure, Azure data factory, Python, Pyspark, Sql, Rest Api, Azure Devops
Relevant experience: 4-6 years
Python is mandatory
******
Notice period - 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F Interview 7th Feb
JOB DETAILS:
* Job Title: Associate III - Data Engineering
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description
Job Title:
Data Services Engineer – AWS & Snowflake
Job Summary:
As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.
You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.
Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.
Responsibilities:
• Design and implement scalable and secure data pipelines on AWS and Snowflake (Star/Snowflake schema)
• Optimize query performance using clustering keys, materialized views, and caching
• Develop and maintain Snowflake data warehouses and data marts.
• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks).
• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica)
• Utilize Snowpark and Python/Java for complex transformations
• Implement RBAC, data masking, and row-level security.
• Optimize data storage and retrieval for performance and cost-efficiency.
• Collaborate with stakeholders to gather data requirements and deliver solutions.
• Ensure data quality, governance, and compliance with industry standards.
• Monitor, troubleshoot, and resolve data pipeline and performance issues.
• Document data architecture, processes, and best practices.
• Support data migration and integration from various sources.
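The Snowpipe/Streams/Tasks workflows mentioned above typically end in an incremental MERGE of captured changes into a target table. A sketch of that upsert step, emulated here with sqlite3 and invented table names rather than Snowflake SQL:

```python
# Sketch of the incremental MERGE/upsert a Snowflake Stream + Task pipeline
# typically performs, emulated with sqlite3. Table/column names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'old@example.com')")

# Changed rows captured since the last run (the delta a Stream exposes).
changes = [(1, "new@example.com"), (2, "second@example.com")]

# Upsert: update the row if the key exists, insert it otherwise.
conn.executemany("""
    INSERT INTO customers (id, email) VALUES (?, ?)
    ON CONFLICT(id) DO UPDATE SET email = excluded.email
""", changes)

rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
print(rows)  # [(1, 'new@example.com'), (2, 'second@example.com')]
```

In Snowflake the equivalent is a `MERGE INTO ... USING stream` statement scheduled by a Task, which also advances the stream's offset on commit.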
Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 3 to 4 years of hands-on experience in data engineering or data services.
• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).
• Strong expertise in Snowflake architecture, development, and optimization.
• Proficiency in SQL and Python for data manipulation and scripting.
• Solid understanding of ETL/ELT processes and data modeling.
• Experience with data integration tools and orchestration frameworks.
• Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
• AWS Glue, AWS Lambda, Amazon Redshift
• Snowflake Data Warehouse
• SQL & Python
Skills: Aws Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse
Must-Haves
AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)
Skills: AWS, AWS lambda, Snowflake, Data engineering, Snowpipe, Data integration tools, orchestration framework
Relevant experience: 4-6 years
Python is mandatory
******
Notice period - 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F Interview 7th Feb
JOB DETAILS:
* Job Title: Lead I - Web API, C# .NET, .NET Core, AWS (Mandatory)
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 6-9 years
* Location: Hyderabad
Job Description
Role Overview
We are looking for a highly skilled Senior .NET Developer who has strong experience in building scalable, high‑performance backend services using .NET Core and C#, with hands‑on expertise in AWS cloud services. The ideal candidate should be capable of working in an Agile environment, collaborating with cross‑functional teams, and contributing to both design and development. Experience with React and Datadog monitoring tools will be an added advantage.
Key Responsibilities
- Design, develop, and maintain backend services and APIs using .NET Core and C#.
- Work with AWS services (Lambda, S3, ECS/EKS, API Gateway, RDS, etc.) to build cloud‑native applications.
- Collaborate with architects and senior engineers on solution design and implementation.
- Write clean, scalable, and well‑documented code.
- Use Postman to build and test RESTful APIs.
- Participate in code reviews and provide technical guidance to junior developers.
- Troubleshoot and optimize application performance.
- Work closely with QA, DevOps, and Product teams in an Agile setup.
- (Optional) Contribute to frontend development using React.
- (Optional) Use Datadog for monitoring, logging, and performance metrics.
Required Skills & Experience
- 6+ years of experience in backend development.
- Strong proficiency in C# and .NET Core.
- Experience building RESTful services and microservices.
- Hands‑on experience with AWS cloud platform.
- Solid understanding of API testing using Postman.
- Knowledge of relational databases (SQL Server, PostgreSQL, etc.).
- Strong problem‑solving and debugging skills.
- Experience working in Agile/Scrum teams.
Good to Have
- Experience with React for frontend development.
- Exposure to Datadog for monitoring and logging.
- Knowledge of CI/CD tools (GitHub Actions, Jenkins, AWS CodePipeline, etc.).
- Containerization experience (Docker, Kubernetes).
Soft Skills
- Strong communication and collaboration abilities.
- Ability to work in a fast‑paced environment.
- Ownership mindset with a focus on delivering high‑quality solutions.
Skills
.NET Core, C#, AWS, Postman
Notice period - 0 to 15 days only
Location: Hyderabad
Virtual Interview: 7th Feb 2026
First round will be Virtual
2nd round will be F2F
Position Overview
We are seeking an experienced ERPNext/Frappe Developer to join our dynamic team at Dhwani. The ideal candidate will have strong expertise in developing, customizing, and maintaining ERPNext applications built on the Frappe Framework. This role involves working on complex business solutions, custom module development, and ensuring seamless integration with various business processes.
Key Responsibilities
Development & Customization
- Design, develop, and implement custom applications and modules on the Frappe Framework and ERPNext.
- Customize existing ERPNext modules (Accounting, CRM, HR, Inventory, Manufacturing, etc.) to meet specific business requirements.
- Build custom DocTypes, forms, reports, dashboards, and print formats.
- Develop and maintain REST APIs for system integrations.
- Write clean, efficient, and well-documented code in Python and JavaScript.
Technical Implementation
- Understand client requirements for ERPNext and suggest optimal technical solutions
- Handle all aspects of development including server-side, API, and client-side logic
- Implement business logic using Frappe's document lifecycle hooks and controllers
- Develop custom web portals, web pages, and web forms
- Ensure smooth transitions for customizations during Frappe/ERPNext upgrades
System Management
- Manage ERPNext installations, configurations, and deployments
- Perform system updates, upgrades, and maintenance
- Debug and troubleshoot technical issues, providing timely solutions
- Work with MariaDB/MySQL databases and write complex queries
- Implement and manage version control using Git
Collaboration & Documentation
- Collaborate with business analysts and stakeholders to gather and refine requirements
- Write functional and development specifications
- Participate in code reviews and contribute to development best practices
- Provide technical guidance and support to junior developers
Required Qualifications
Experience
- Minimum 2-4 years of hands-on experience with Frappe Framework and ERPNext development and customizations
- Proven track record of delivering live ERPNext projects that can be showcased
- Experience in customizing ERPNext modules across different business domains
- We are also open to hiring interns (with a PPO opportunity) who demonstrate strong DSA and coding fundamentals, a good understanding of Python programming, knowledge of and exposure to MySQL databases, and strong logical thinking and problem-solving skills, along with an interest in working on the Frappe framework and enthusiasm for building challenging technology solutions for social impact. High-performing interns will receive a Pre-Placement Offer (PPO) based on performance. The internship will last 3 months with a monthly stipend between 15k and 20k based on interview performance.
Technical Skills
Core Technologies:
- Strong proficiency in Python programming
- Solid experience with JavaScript, HTML, CSS
- Working knowledge of Jinja templating.
- Experience with jQuery and Bootstrap framework
Frappe/ERPNext Expertise:
- Deep understanding of Frappe Framework architecture.
- Experience with DocType creation, customization, and management.
- Knowledge of Frappe's ORM, REST API capabilities, and hooks system.
- Understanding of ERPNext modules and business workflows
Database & Infrastructure:
- Proficient in MariaDB/MySQL database management.
- Experience with Linux operating systems.
- Knowledge of Git version control.
- Understanding of web server configurations and deployment.
Professional Skills
- Strong analytical and problem-solving abilities
- Excellent communication and collaboration skills
- Ability to work effectively in team environments
- Self-starter with ability to take ownership of projects
- Attention to detail and commitment to quality code
This is a work-from-office role in Gurgaon, Haryana
Job Description:
Experience Range: 5 to 10 years
Required Skills:
- Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
- Must Have – Experience working with ML libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, NumPy, etc.
- Must Have – Experience working with models such as Random Forest, K-means clustering, BERT, etc.
- Should Have – Exposure to querying warehouses and APIs
- Should Have – Experience with writing moderate to complex SQL queries
- Should Have – Experience analyzing and presenting data with BI tools or Excel
- Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
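Of the models listed in the requirements, K-means is simple enough to sketch from scratch; in practice you would use `sklearn.cluster.KMeans`. A 1-D toy version with made-up data, showing the alternating assignment and update steps:

```python
# Minimal 1-D k-means from scratch, illustrating the clustering model named
# in the requirements. Real work would use sklearn.cluster.KMeans.
def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # Update step: move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.0, 9.5, 8.5]
print(kmeans_1d(data, centers=[0.0, 10.0]))  # [1.0, 9.0]
```

The same two-step loop generalizes to higher dimensions by replacing absolute distance with Euclidean distance and the mean with a per-coordinate mean.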
Roles and Responsibilities:
- Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
- Analyze and present insights about the data and processes to Business Stakeholders
- Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
- Develop and deploy customized models on Production data sets to generate analytical insights and predictions
- Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
- Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
- Share knowledge and best practices with broader teams to make everyone aware and more productive.
Qualifications:
- Minimum bachelor's degree in Engineering, Computer Applications, or AI/Data Science
- Experience in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years
- Prior experience in Python, Numpy, Scikit, Pandas, ETL/SQL, BI tools in previous roles preferred
AccioJob is conducting a Walk-In Hiring Drive with Global IT Consulting for the position of Software Engineer.
To apply, register and select your slot here: https://go.acciojob.com/6ED2rL
Required Skills: DSA, SQL, OOPS
Eligibility:
Degree: BTech./BE
Branch: Computer Science/CSE/Other CS related branch, IT
Graduation Year: 2024, 2025
Work Details:
Work Location: Bangalore (Onsite)
CTC: ₹11.1 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
Coding Assignment, Technical Interview 1, Technical Interview 2, Technical Interview 3
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/eapv4u
Proficiency in Java 8+.
Solid understanding of REST APIs (Spring Boot), microservices, databases (SQL/NoSQL), and caching systems like Redis/Aerospike.
Familiarity with cloud platforms (AWS, GCP, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).
Good understanding of data structures, algorithms, and software design principles.
About the company:
At Estuate, more than 400 uniquely talented people work together to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.
Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customer’s needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.
Our key to success:
We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, and have over 2,000 projects in our portfolio.
Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.
Our culture of oneness
At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.
Requirements:
Technical skills
- 8+ years of experience in a Business, System, or Functional Analyst role;
- Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
- Experience working with RESTful APIs (writing requirements, API usage);
- Experience in Microservices architecture;
- Experience of working with Agile methodologies (Scrum, Kanban);
- Knowledge of SQL;
- Knowledge of UML, BPMN;
- Understanding of key UX/UI practices and processes;
- Understanding of software development lifecycle;
- Understanding of the architecture of web-based applications;
- English Upper-Intermediate or higher.
Soft Skills
- Excellent communication and presentation skills;
- Proactiveness;
- Organized, detail-oriented with ability to keep overall solution in mind;
- Comfort working in a fast-paced environment, running concurrent projects, and managing BA work with multiple stakeholders;
- Good time-management skills, ability to handle multitasking activities.
Good to haves
- Experience in enterprise software development or finance domain;
- Experience in delivery of desktop and web-applications;
- Experience of successful system integration project.
Responsibilities:
- Participation in discovery phases and workshops with Customer, covering key business and product requirements;
- Manage project scope, requirements management and their impact on existing requirements, defining dependencies on other teams;
- Creating business requirements, user stories, mockups, functional specifications and technical requirements (incl. flow diagrams, data mappings, examples);
- Close collaboration with development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with Tech Lead, etc.);
- Regular communication with internal (Product, Account management, Business teams) and external stakeholders (Partners, Customers);
- Preparing UAT scenarios, validation cases;
- User Acceptance Testing;
- Demo for internal stakeholders;
- Creating documentation (user guides, technical guides, presentations).
Project Description:
Wireless Standard POS (Point-Of-Sales) is our retail management solution for the Telecom Market.
It provides thousands of retailers with features and functionalities they need to run their businesses effectively with full visibility and control into every aspect of sales and operations. It is simple to learn, easy to use and as operation grows, more features can be added on.
Our system can optimize and simplify all processes related to retail in this business area.
Few things that our product can do:
- Robust Online Reporting
- Repair Management Software
- 3rd Party Integrations
- Customer Connect Marketing
- Time and Attendance
- Carrier Commission Reconciliation
As a Business Analyst/ System Analyst, you will be the liaison between the lines of business and the Development team, have the opportunity to work on a very complex product with microservice architecture (50+ for now) and communicate with Product, QA, Developers, Architecture and Customer Support teams to help improve product quality.

A real-time Customer Data Platform and cross-channel marketing automation deliver superior experiences that result in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design and develop backend components and sub-systems for large-scale platforms under guidance from senior engineers.
- Contribute to building and evolving the next-generation customer data platform.
- Write clean, efficient, and well-tested code with a focus on scalability and performance.
- Explore and experiment with modern technologies, especially open-source frameworks, and build small prototypes or proofs of concept.
- Use AI-assisted development tools to accelerate coding, testing, debugging, and learning while adhering to engineering best practices.
- Participate in code reviews, design discussions, and continuous improvement of the platform.
Qualifications:
- 0–2 years of experience (or strong academic/project background) in backend development with Java.
- Good fundamentals in algorithms, data structures, and basic performance optimizations.
- Bachelor’s or Master’s degree in Computer Science or IT (B.E / B.Tech / M.Tech / M.S) from premier institutes.
Technical Skill Set:
- Strong aptitude and analytical skills with emphasis on problem solving and clean coding.
- Working knowledge of SQL and NoSQL databases.
- Familiarity with unit testing frameworks and writing testable code is a plus.
- Basic understanding of distributed systems, messaging, or streaming platforms is a bonus.
AI-Assisted Engineering (LLM-Era Skills):
- Familiarity with modern AI coding tools such as Cursor, Claude Code, Codex, Windsurf, Opencode, or similar.
- Ability to use AI tools for code generation, refactoring, test creation, and learning new systems responsibly.
- Willingness to learn how to combine human judgment with AI assistance for high-quality engineering outcomes.
Soft Skills & Nice to Have
- Appreciation for technology and its ability to create real business value, especially in data and marketing platforms.
- Clear written and verbal communication skills.
- Strong ownership mindset and ability to execute in fast-paced environments.
- Prior internship or startup experience is a plus.

A real-time Customer Data Platform and cross-channel marketing automation solution that delivers superior experiences, resulting in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design, build, and own large-scale, distributed backend and platform systems.
- Drive architectural decisions for high-throughput, low-latency services with strong scalability and reliability guarantees.
- Build and evolve core components of a real-time Customer Data Platform, especially around data ingestion, streaming, and processing.
- Evaluate and adopt open-source and emerging platform technologies; build prototypes where needed.
- Own critical subsystems end-to-end, ensuring performance, maintainability, and operational excellence.
- Mentor junior engineers and uphold high standards through code and design reviews.
- Effectively use modern AI-assisted coding tools to accelerate development while maintaining engineering rigor.
Qualifications:
- 4–6 years of strong backend/platform engineering experience with solid fundamentals in algorithms, data structures, and optimizations.
- Proven experience designing and operating production-grade distributed systems.
- B.E / B.Tech / M.Tech / M.S / MCA in Computer Science or equivalent from premier institutes.
Technical Skills:
- Strong system and object-oriented design skills.
- Hands-on experience with SQL and NoSQL databases.
- Strong working knowledge of Kafka and streaming systems.
- Proficiency in Java, concurrency, and unit/integration testing.
- Familiarity with cloud-native environments (Kubernetes, CI/CD, observability).
AI-Assisted Engineering:
- Hands-on experience using modern AI coding platforms such as Opencode, Claude Code, Codex, Cursor, Windsurf, or similar.
- Ability to use AI tools for code generation, refactoring, testing, debugging, and design exploration responsibly.
Soft Skills & Nice to Have:
- Strong ownership mindset and ability to deliver in fast-paced environments.
- Clear written and verbal communication skills.
- Startup experience is a plus.
Key Responsibilities
- Architectural Leadership: Design end-to-end agentic frameworks using UiPath Agent Builder and Studio Web, moving processes away from rigid "if-then" logic to goal-oriented AI agents.
- Agentic UI Integration: Lead the implementation of UiPath Autopilot and Agentic Orchestration to handle dynamic UI changes, unstructured data, and complex human-in-the-loop escalations.
- Advanced AI Implementation: Deploy and fine-tune models within the UiPath AI Trust Layer, ensuring secure and governed use of LLMs (GPT-4, Claude, etc.) for real-time UI reasoning.
- Infrastructure & Governance: Define the "Agentic Operating Model," establishing guardrails for autonomous agent behavior, security protocols, and scalability within UiPath Orchestrator.
- Stakeholder Strategy: Partner with C-suite stakeholders to identify "Agent-First" opportunities that provide 10x ROI over traditional RPA.
- Mentorship: Lead a CoE (Center of Excellence), upskilling senior developers in prompt engineering, context grounding, and semantic automation.
Required Technical Skills
- Core Platform: Expert-level mastery of UiPath Studio, Orchestrator, and Cloud.
- Agentic Specialization: Proven experience with UiPath Agent Builder, Integration Service, and Document Understanding.
- AI/ML Integration: Deep understanding of AI Center, Vector Databases, and RAG (Retrieval-Augmented Generation) to provide agents with business context.
- Programming: Proficiency in .NET (C#) and Python for custom activity development and AI model interfacing.
- UI Automation: Expert knowledge of modern UI descriptors, Computer Vision, and handling "tricky" environments (Citrix, legacy SAP, mainframe).
Qualifications
- Experience: 10+ years in Software Development/Automation, with at least 5 years specifically in UiPath Architecture.
- Education: Bachelor’s or Master’s in Computer Science, AI, or a related field.
- Certifications: UiPath Advanced Developer (ARD) and UiPath Solution Architect certifications are mandatory. Certifications in AI/ML (Azure AI, AWS Machine Learning) are a significant plus.
- Mindset: A "fail-forward" approach to innovation, with the ability to prototype agentic solutions in fast-paced environments.
Microsoft Fabric, Power BI, Data modelling, ETL, Spark SQL
Remote work: 5–7 hours
Rate: ₹450 per hour
What You’ll Do:
We are looking for a Staff Operations Engineer based in Pune, India who can master both DeepIntent's data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an engineer who understands not only DBA functions but also how they impact research objectives, and who can work with researchers and data scientists to achieve impactful results.
This role sits within the Engineering Operations team and requires integration and partnership with the wider Engineering organization. The ideal candidate is a self-starter who is inquisitive, is not afraid to take on and learn from challenges, and will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams.
- Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
- Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
- Build “mastered” versions of the data for Analytics-specific querying use cases.
- Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent.
- Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
- Implement DataOps practices.
- Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
- Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
- Operate between Engineers and Analysts to unify both practices for analytics insight creation.
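The query-efficiency work above often comes down to making sure analytics queries hit an index rather than scanning whole tables. A minimal sketch using Python's built-in sqlite3, where the table and column names are illustrative, not an actual DeepIntent schema:

```python
import sqlite3

# Illustrative only: the events/campaign_id schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, campaign_id INTEGER, impressions INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(i, i % 10, i * 3) for i in range(1000)])

# Without an index, this aggregate scans the whole table.
query = ("SELECT campaign_id, SUM(impressions) FROM events "
         "WHERE campaign_id = 7 GROUP BY campaign_id")

conn.execute("CREATE INDEX idx_events_campaign ON events (campaign_id)")

# EXPLAIN QUERY PLAN shows whether SQLite now searches via the index.
plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan)
print(conn.execute(query).fetchone())
```

The same habit, checking the plan before and after an index or query rewrite, carries over to production engines via their own `EXPLAIN` variants.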
Who You Are:
- 8+ years of experience in tech support, specialised in monitoring and maintaining data pipelines.
- Adept in market research methodologies and using data to deliver representative insights.
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
- Deep SQL experience is a must.
- Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
- English Language Fluency and proven success working with teams in the U.S.
- Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
- Experience working with public clouds like GCP/AWS.
- Good understanding of software engineering, DataOps, data architecture, and Agile/DevOps methodologies.
- Proficient with SQL, Python or a JVM-based language, and Bash.
- Experience with any of Apache open-source projects such as Spark, Druid, Beam, Airflow etc. and big data databases like BigQuery, Clickhouse, etc.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
- Experience in debugging UI and backend issues is an added advantage.
Hiring: Azure Data Engineer
Experience level: 5–12 years
Location: Bangalore
Work arrangement: On-site
Budget range: Flexible
Mandatory skills (self-rating of 7+ is a must):
ADF, Databricks, PySpark, SQL
Good to have:
Delta Live Tables, Python, team handling (Manager level, 7+ years experience), Azure Functions, Unity Catalog, real-time streaming, data pipelines
The Role
We are looking for a Senior/Lead Azure Data Engineer to join our team in Pune. You will be responsible for the end-to-end lifecycle of data solutions, from initial client requirement gathering and solution architecture design to leading the data engineering team through implementation. You will be the technical anchor for the project, ensuring that our data estates are scalable, governed, and high-performing.
Key Responsibilities
- Architecture & Design: Design robust data architectures using Microsoft Fabric and Azure Synapse, focusing on Medallion architecture and metadata-driven frameworks.
- End-to-End Delivery: Translate complex client business requirements into technical roadmaps and lead the team to deliver them on time.
- Data Governance: Implement and manage enterprise-grade governance, data discovery, and lineage using Microsoft Purview.
- Team Leadership: Act as the technical lead for the team, performing code reviews, mentoring junior engineers, and ensuring best practices in PySpark and SQL.
- Client Management: Interface directly with stakeholders to define project scope and provide technical consultancy.
What We’re Looking For
- 6+ Years in Data Engineering with at least 3+ years leading technical teams or designing architectures.
- Expertise in Microsoft Fabric/Synapse: Deep experience with Lakehouses, Warehouses, and Spark-based processing.
- Governance Specialist: Proven experience implementing Microsoft Purview for data cataloging, sensitivity labeling, and lineage.
- Technical Breadth: Strong proficiency in PySpark, SQL, and Data Factory. Familiarity with Infrastructure as Code (Bicep/Terraform) is a major plus.
Why Work with Us?
- Competitive Pay
- Flexible Hours
- Work on Microsoft’s latest (Fabric, Purview, Foundry) as a Designated Solutions Partner.
- High-Stakes Impact: Solve complex, client-facing problems for enterprise leaders
- Structured learning paths to help you master AI automation and Agentic AI.

Global digital transformation solutions provider.
JOB DETAILS:
Job Role: Lead I - Software Engineering - Java, Spring Boot, Microservices
Industry: Global digital transformation solutions provider
Work Mode: 3 days in office, Hybrid model.
Salary: Best in Industry
Experience: 5-7 years
Location: Trivandrum, Kochi, Thiruvananthapuram
Job Description
Job Title: Senior Java Developer
Experience: 5+ years
Job Summary: We are looking for a Senior Java Developer with strong experience in Spring Boot and Microservices to work on high-performance applications for a leading financial services client. The ideal candidate will have deep expertise in Java backend development, cloud (preferably GCP), and strong problem-solving abilities.
Key Responsibilities:
• Develop and maintain Java-based microservices using Spring Boot
• Collaborate with Product Owners and teams to gather and review requirements
• Participate in design reviews, code reviews, and unit testing
• Ensure application performance, scalability, and security
• Contribute to solution architecture and design documentation
• Support Agile development processes including daily stand-ups and sprint planning
• Mentor junior developers and lead small modules or features
Required Skills:
• Java, Spring Boot, Microservices architecture
• GCP (or other cloud platforms like AWS)
• REST/SOAP APIs, Hibernate, SQL, Tomcat
• CI/CD tools: Jenkins, Bitbucket
• Agile methodologies (Scrum/Kanban)
• Unit testing (JUnit), debugging and troubleshooting
• Good communication and team leadership skills
Preferred Skills:
• Frontend familiarity (Angular, AJAX)
• Experience with API documentation tools (Swagger)
• Understanding of design patterns and UML
• Exposure to Confluence, Jira
Must-Haves
Java/J2EE (5+ years), Spring/Spring Boot (5+ years), Microservices (5+ years), AWS/GCP/Azure (mandatory), CI/CD (Jenkins, SonarQube, Git)
Mandatory Skills Required: Strong proficiency in Java, Spring Boot, microservices, and GCP/AWS.
Experience Required: Minimum 5+ years of relevant experience
Java, Spring Boot, Microservices architecture
GCP (or other cloud platforms like AWS)
REST/SOAP APIs, Hibernate, SQL, Tomcat
CI/CD tools: Jenkins, Bitbucket
Agile methodologies (Scrum/Kanban)
Unit testing (JUnit), debugging and troubleshooting
Good communication and team leadership skills
Notice period - 0 to 15 days only (Immediate or candidates who are serving notice period and who can join by Feb)
Job stability is mandatory
Location: Trivandrum, Kochi
Virtual Interview: 31st Jan-Saturday
Nice to Haves
Frontend familiarity (Angular, AJAX)
Experience with API documentation tools (Swagger)
Understanding of design patterns and UML
Exposure to Confluence, Jira
Job Description:
Summary
The Data Engineer will be responsible for designing, developing, and maintaining the data infrastructure. They must have experience with SQL and Python.
Roles & Responsibilities:
● Collaborate with product, business, and engineering stakeholders to understand key metrics, data needs, and reporting pain points.
● Design, build, and maintain clean, scalable, and reliable data models using DBT.
● Write performant SQL and Python code to transform raw data into structured marts and reporting layers.
● Create dashboards using Tableau or similar tools.
● Work closely with data platform engineers, architects, and analysts to ensure data pipelines are resilient, well-governed, and high quality.
● Define and maintain source-of-truth metrics and documentation in the analytics layer.
● Partner with product engineering teams to understand new features and ensure appropriate instrumentation and event collection.
● Drive reporting outcomes by building dashboards or working with BI teams to ensure timely delivery of insights.
● Help scale our analytics engineering practice by contributing to internal tooling, frameworks, and best practices.
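As a small illustration of the raw-to-mart transformation the dbt models above express: in dbt the SELECT would live in a model file (e.g. `models/marts/customer_revenue.sql`) and be materialized by the framework; this hedged sketch uses sqlite3 and invented table names to show the same shape.

```python
import sqlite3

# Raw source table; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", [
    (1, "acme", 120.0, "complete"),
    (2, "acme", 80.0, "complete"),
    (3, "globex", 200.0, "cancelled"),
    (4, "globex", 50.0, "complete"),
])

# Materialize the reporting layer as a table, mirroring dbt's
# `table` materialization of a SELECT over raw data.
conn.execute("""
    CREATE TABLE mart_customer_revenue AS
    SELECT customer,
           COUNT(*)    AS completed_orders,
           SUM(amount) AS revenue
    FROM raw_orders
    WHERE status = 'complete'
    GROUP BY customer
""")
rows = conn.execute(
    "SELECT customer, completed_orders, revenue FROM mart_customer_revenue ORDER BY customer"
).fetchall()
print(rows)
```

Dashboards then query the mart rather than raw tables, which is what keeps the reporting layer clean and source-of-truth metrics consistent.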
Who You Are:
Experience : 3 to 4 years of experience in analytics/data engineering, with strong hands-on expertise in DBT, SQL, Python and dashboarding tools.
● Experience working with modern data stacks (e.g., Snowflake, BigQuery, Redshift, Airflow).
● Strong data modeling skills (dimensional, star/snowflake schema, data vault, etc.).
● Excellent communication and stakeholder management skills.
● Ability to work independently and drive business outcomes through data.
● Exposure to product instrumentation and working with event-driven data is a plus.
● Prior experience in a fast-paced, product-led company is preferred.
Deliver engaging classroom and/or online training sessions on topics including:
Python for Data Science
Data Analytics using Excel and SQL
Statistics and Probability
Machine Learning and Deep Learning
Data Visualization using Power BI / Tableau
Create and update course materials, projects, assignments, and quizzes.
Provide hands-on training and real-world project guidance.
Evaluate student performance, provide constructive feedback, and track progress.
Stay updated with the latest trends, tools, and technologies in Data Science.
Mentor students during capstone projects and industry case studies.
Coordinate with the academic and operations team for batch planning and feedback.
Assist with the development of new courses and curriculum as needed.
We are seeking a Data Engineer with 3–4 years of relevant experience to join our team. The ideal candidate should have strong expertise in Python and SQL and be available to join immediately.
Location: Bangalore
Experience: 3–4 Years
Joining: Immediate Joiner preferred
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and data models
- Extract, transform, and load (ETL) data from multiple sources
- Write efficient and optimized SQL queries for data analysis and reporting
- Develop data processing scripts and automation using Python
- Ensure data quality, integrity, and performance across systems
- Collaborate with cross-functional teams to support business and analytics needs
- Troubleshoot data-related issues and optimize existing processes
Required Skills & Qualifications:
- 3–4 years of hands-on experience as a Data Engineer or similar role
- Strong proficiency in Python and SQL
- Experience working with relational databases and large datasets
- Good understanding of data warehousing and ETL concepts
- Strong analytical and problem-solving skills
- Ability to work independently and in a team-oriented environment
Preferred:
- Experience with cloud platforms or data tools (added advantage)
- Exposure to performance tuning and data optimization
We are Hiring ASP.NET MVC/Core Developers
Click Here to Apply : https://prishusoft.com/jobs/junior-aspnet-mvccore-professional
Experience Level
- 1–2 years of professional experience in web application development using ASP.NET MVC and ASP.NET Core.
Key Responsibilities
- Develop, maintain, and enhance web applications using ASP.NET MVC and ASP.NET Core.
- Write clean, scalable, and maintainable code following best practices.
- Design, develop, and integrate RESTful APIs with ASP.NET Web API.
- Collaborate with front-end developers and UI/UX designers to deliver exceptional user experiences.
- Work with MSSQL databases, including writing complex T-SQL queries, stored procedures, and optimizing performance.
- Participate in code reviews and contribute to technical discussions, architecture decisions, and performance improvements.
Technical Skills & Expertise
- Strong proficiency in ASP.NET MVC with at least 1 year of project experience.
- Good working knowledge of ASP.NET Core for modern application development.
- Solid skills in C#, JavaScript, and HTML.
- Experience with .NET Framework 4.5+.
- Hands-on experience with ASP.NET Web API development and consumption.
- Expertise in MSSQL (T-SQL, indexing, performance tuning).
Soft Skills
- Strong verbal and written communication skills.
- Collaborative team player with a willingness to share knowledge and contribute to team success.
Preferred / Bonus Skills
- Experience with Angular, React, or Vue.js for dynamic front-end development.
- Exposure to unit testing frameworks (e.g., Jasmine, Karma) for front-end applications.
- Understanding of DevOps practices and CI/CD pipelines.
- Familiarity with TypeScript for scalable JavaScript development.
Work Mode: Full-time On-site / Hybrid (Ahmedabad)
About the Role:
We are looking for a highly skilled Data Engineer with a strong foundation in Power BI, SQL, Python, and Big Data ecosystems to help design, build, and optimize end-to-end data solutions. The ideal candidate is passionate about solving complex data problems, transforming raw data into actionable insights, and contributing to data-driven decision-making across the organization.
Key Responsibilities:
Data Modelling & Visualization
- Build scalable and high-quality data models in Power BI using best practices.
- Define relationships, hierarchies, and measures to support effective storytelling.
- Ensure dashboards meet standards in accuracy, visualization principles, and timeliness.
Data Transformation & ETL
- Perform advanced data transformation using Power Query (M Language) beyond UI-based steps.
- Design and optimize ETL pipelines using SQL, Python, and Big Data tools.
- Manage and process large-scale datasets from various sources and formats.
Business Problem Translation
- Collaborate with cross-functional teams to translate complex business problems into scalable, data-centric solutions.
- Decompose business questions into testable hypotheses and identify relevant datasets for validation.
Performance & Troubleshooting
- Continuously optimize performance of dashboards and pipelines for latency, reliability, and scalability.
- Troubleshoot and resolve issues related to data access, quality, security, and latency, adhering to SLAs.
Analytical Storytelling
- Apply analytical thinking to design insightful dashboards—prioritizing clarity and usability over aesthetics.
- Develop data narratives that drive business impact.
Solution Design
- Deliver wireframes, POCs, and final solutions aligned with business requirements and technical feasibility.
Required Skills & Experience:
- Minimum 3+ years of experience as a Data Engineer or in a similar data-focused role.
- Strong expertise in Power BI: data modeling, DAX, Power Query (M Language), and visualization best practices.
- Hands-on with Python and SQL for data analysis, automation, and backend data transformation.
- Deep understanding of data storytelling, visual best practices, and dashboard performance tuning.
- Familiarity with DAX Studio and Tabular Editor.
- Experience in handling high-volume data in production environments.
Preferred (Good to Have):
- Exposure to Big Data technologies such as:
- PySpark
- Hadoop
- Hive / HDFS
- Spark Streaming (optional but preferred)
Why Join Us?
- Work with a team that's passionate about data innovation.
- Exposure to modern data stack and tools.
- Flat structure and collaborative culture.
- Opportunity to influence data strategy and architecture decisions.
Database Programmer (SQL & Python)
Experience: 4 – 5 Years
Location: Remote
Employment Type: Full-Time
About the Opportunity
We are a mission-driven HealthTech organization dedicated to bridging the gap in global healthcare equity. By harnessing the power of AI-driven clinical insights and real-world evidence, we help healthcare providers and pharmaceutical companies deliver precision medicine to underrepresented populations.
We are looking for a skilled Database Programmer with a strong blend of SQL expertise and Python automation skills to help us manage, transform, and unlock the value of complex clinical data. This is a fully remote role where your work will directly contribute to improving patient outcomes and making life-saving treatments more affordable and accessible.
Key Responsibilities
- Data Architecture & Management: Design, develop, and maintain robust relational databases to store large-scale, longitudinal patient records and clinical data.
- Complex Querying: Write and optimize sophisticated SQL queries, stored procedures, and triggers to handle deep clinical datasets, ensuring high performance and data integrity.
- Python Automation: Develop Python scripts and ETL pipelines to automate data ingestion, cleaning, and transformation from diverse sources (EHRs, lab reports, and unstructured clinical notes).
- AI Support: Collaborate with Data Scientists to prepare datasets for AI-based analytics, Knowledge Graphs, and predictive modeling.
- Data Standardization: Map and transform clinical data into standardized models (such as HL7, FHIR, or proprietary formats) to ensure interoperability across healthcare ecosystems.
- Security & Compliance: Implement and maintain rigorous data security protocols, ensuring all database activities comply with global healthcare regulations (e.g., HIPAA, GDPR).
Required Skills & Qualifications
- Education: Bachelor’s degree in Computer Science, Information Technology, Statistics, or a related field.
- SQL Mastery: 4+ years of experience with relational databases (PostgreSQL, MySQL, or MS SQL Server). You should be comfortable with performance tuning and complex data modeling.
- Python Proficiency: Strong programming skills in Python, particularly for data manipulation (Pandas, NumPy) and database interaction (SQLAlchemy, Psycopg2).
- Healthcare Experience: Familiarity with healthcare data standards (HL7, FHIR) or experience working with Electronic Health Records (EHR) is highly preferred.
- ETL Expertise: Proven track record of building and managing end-to-end data pipelines for structured and unstructured data.
- Analytical Mindset: Ability to troubleshoot complex data issues and translate business requirements into efficient technical solutions.
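As a hedged illustration of the data-standardization responsibility above, the sketch below maps a raw EHR-style row into a simplified FHIR-flavored Patient dict. The field names follow FHIR conventions loosely and the input schema is invented; this is not a complete or validated FHIR implementation.

```python
# Map a raw EHR export row (hypothetical column names) into a
# simplified FHIR-style Patient resource for interoperability.
def to_fhir_patient(raw: dict) -> dict:
    return {
        "resourceType": "Patient",
        "identifier": [{"value": raw["mrn"]}],  # medical record number
        "name": [{"family": raw["last_name"], "given": [raw["first_name"]]}],
        # Normalize a local sex code to FHIR's administrative-gender values.
        "gender": {"F": "female", "M": "male"}.get(raw.get("sex"), "unknown"),
        "birthDate": raw["dob"],  # assumed already ISO-8601
    }

raw_row = {"mrn": "A-1001", "first_name": "Asha", "last_name": "Rao",
           "sex": "F", "dob": "1988-04-12"}
patient = to_fhir_patient(raw_row)
print(patient["resourceType"], patient["name"][0]["family"])
```

In practice this mapping layer sits inside the ETL pipeline, so downstream AI and analytics consumers see one standardized shape regardless of source system.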
To process your details, please fill out the Google form.
About Company (GeniWay)
GeniWay Technologies is pioneering India’s first AI-native platform for personalized learning and career guidance, transforming the way students learn, grow, and determine their future path. Addressing challenges in the K-12 system such as one-size-fits-all teaching and limited career awareness, GeniWay leverages cutting-edge AI to create a tailored educational experience for every student. The core technology includes an AI-powered learning engine, a 24x7 multilingual virtual tutor and Clario, a psychometrics-backed career guidance system. Aligned with NEP 2020 policies, GeniWay is on a mission to make high-quality learning accessible to every student in India, regardless of their background or region.
What you’ll do
- Build the career assessment backbone: attempt lifecycle (create/resume/submit), timing metadata, partial attempts, idempotent APIs.
- Implement deterministic scoring pipelines with versioning and audit trails (what changed, when, why).
- Own Postgres data modeling: schemas, constraints, migrations, indexes, query performance.
- Create safe, structured GenAI context payloads (controlled vocabulary, safety flags, eval datasets) to power parent/student narratives.
- Raise reliability: tests for edge cases, monitoring, reprocessing/recalculation jobs, safe logging (no PII leakage).
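The attempt-lifecycle and audit-trail responsibilities above can be sketched as an idempotent submit handler. This is a minimal in-memory illustration with hypothetical names; in production the storage would be Postgres with a unique constraint on the idempotency key.

```python
import uuid
import datetime

attempts = {}    # attempt_id -> attempt record
seen_keys = {}   # idempotency_key -> attempt_id
audit_log = []   # what changed, when, why

def submit_attempt(student_id: str, answers: dict, idempotency_key: str) -> dict:
    # A retried request with the same key returns the original result
    # instead of creating a duplicate submission.
    if idempotency_key in seen_keys:
        return attempts[seen_keys[idempotency_key]]
    attempt_id = str(uuid.uuid4())
    record = {"id": attempt_id, "student_id": student_id,
              "answers": answers, "status": "submitted",
              "scoring_version": "v1"}  # pin scoring version for auditability
    attempts[attempt_id] = record
    seen_keys[idempotency_key] = attempt_id
    audit_log.append({"attempt_id": attempt_id, "event": "submitted",
                      "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                      "why": "student submit"})
    return record

first = submit_attempt("s1", {"q1": "B"}, idempotency_key="req-123")
retry = submit_attempt("s1", {"q1": "B"}, idempotency_key="req-123")
print(first["id"] == retry["id"], len(audit_log))
```

Pinning `scoring_version` on each record is what makes later recalculation jobs auditable: reprocessing can state exactly which version produced which score.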
Must-have skills
- Backend development in Python (FastAPI/Django/Flask) or Node (NestJS) with production API experience.
- Strong SQL + PostgreSQL fundamentals (transactions, indexes, schema design, migrations).
- Testing discipline: unit + integration tests for logic-heavy code; systematic debugging approach.
- Comfort using AI coding copilots to speed up scaffolding/tests/refactors — while validating correctness.
- Ownership mindset: cares about correctness, data integrity, and reliability.
Good to have
- Experience with rule engines, scoring systems, or audit-heavy domains (fintech, healthcare, compliance).
- Event schemas/telemetry pipelines and observability basics.
- Exposure to RAG/embeddings/vector DBs or prompt evaluation harnesses.
Location: Pune (on-site for first 3 months; hybrid/WFH flexibility thereafter)
Employment Type: Full-time
Experience: 2–3 years (correctness-first; strong learning velocity)
Compensation: Competitive (₹8–10 LPA fixed cash) + ESOP (equity ownership, founding-early employee level)
Joining Timeline: 2–3 weeks / Immediate
Why join (founding team)
- You’ll build core IP: scoring integrity and data foundations that everything else depends on.
- Rare skill-building: reliable systems + GenAI-safe context/evals (not just API calls).
- Meaningful ESOP upside at an early stage.
- High trust, high ownership, fast learning.
- High-impact mission: reduce confusion and conflict in student career decisions; help families make better choices, transform student lives by making great learning personal.
Hiring process (fast)
1. 20-min intro call (fit + expectations).
2. 45–60 min SQL & data modeling, API deep dive.
3. Practical exercise (2–3 hours max) implementing a small scoring service with tests.
4. Final conversation + offer.
How to apply
Reply with your resume/LinkedIn profile plus one example of a system/feature where you owned data modeling and backend integration (a short paragraph is fine).
Required Skills and Qualifications:
- 2–3 years of professional experience in Python development.
- Strong understanding of object-oriented programming.
- Experience with frameworks such as Django, Flask, or FastAPI.
- Knowledge of REST APIs, JSON, and web integration.
- Familiarity with SQL and database management systems.
- Experience with Git or other version control tools.
- Good problem-solving and debugging skills.
- Strong communication and teamwork abilities.
Position: Insights Manager
Location: Gurugram (Onsite)
Experience Required: 4+ Years
Working Days: 5 Days (Mon to Fri)
About the Role
We are seeking a hands-on Insights Manager to build the analytical backbone that powers decision-making. This role sits at the centre of the data ecosystem, partnering with Category, Commercial, Marketing, Sourcing, Fulfilment, Product, and Growth teams to translate data into insight, automation, and action.
You will design self-running reporting systems, maintain data quality in collaboration with data engineering, and build analytical models that directly improve pricing, customer experience, and operational efficiency. The role requires strong e-commerce domain understanding and the ability to move from data to decisions with speed and precision.
Key Responsibilities
1. Data Platform & Governance
- Partner with data engineering to ensure clean and reliable data across Shopify, GA4, Ad platforms, CRM, and ERP systems
- Define and maintain KPI frameworks (ATC, CVR, AOV, Repeat Rate, Refunds, LTV, CAC, etc.)
- Oversee pipeline monitoring, QA checks, and metric documentation
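The KPI acronyms above expand to formulas whose exact definitions vary by team; a hedged sketch of common definitions, with invented sample numbers:

```python
# Illustrative KPI formulas; real definitions should come from the
# team's own metric documentation.
def kpis(sessions, add_to_carts, orders, revenue, marketing_spend, new_customers):
    return {
        "ATC rate": add_to_carts / sessions,     # add-to-cart rate
        "CVR": orders / sessions,                # conversion rate
        "AOV": revenue / orders,                 # average order value
        "CAC": marketing_spend / new_customers,  # customer acquisition cost
    }

m = kpis(sessions=10_000, add_to_carts=1_200, orders=300,
         revenue=450_000.0, marketing_spend=90_000.0, new_customers=250)
print(m)
```

Writing the formulas down once, in code or in metric documentation, is what keeps dashboards across teams reporting the same numbers.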
2. Reporting, Dashboards & Automation
- Build automated datamarts and dashboards for business teams
- Integrate APIs and automate data flows across multiple sources
- Create actionable visual stories and executive summaries
- Use AI and automation tools to improve insight delivery speed
3. Decision Models & Applied Analytics
- Build models for pricing, discounting, customer segmentation, inventory planning, delivery SLAs, and recommendations
- Translate analytics outputs into actionable playbooks for internal teams
4. Insights & Actionability
- Diagnose performance shifts and identify root causes
- Deliver weekly and monthly insight-driven recommendations
- Improve decision-making speed and quality across functions
Qualifications & Experience
- 4–7 years of experience in analytics or product insights (e-commerce / D2C / retail)
- Strong SQL and Python skills
- Hands-on experience with GA4, GTM, and dashboarding tools (Looker / Tableau / Power BI)
- Familiarity with CRM platforms like Klaviyo, WebEngage, or MoEngage
- Strong understanding of e-commerce KPIs and customer metrics
- Ability to communicate insights clearly to non-technical stakeholders
What We Offer
- Greenfield opportunity to build the data & insights platform from scratch
- High business impact across multiple functions
- End-to-end exposure from analytics to automation and applied modelling
- Fast-paced, transparent, and collaborative work culture
Company Description
NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.
Role Description
This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.
Responsibilities:
● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications
● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions
● Code Reviews: Participate in code reviews to maintain high-quality standards
● Troubleshooting: Debug and resolve application issues in a timely manner
● Testing: Develop and execute unit and integration tests to ensure software reliability
● Optimize: Identify and address performance bottlenecks to enhance application performance
Qualifications & Skills:
● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA
● Familiarity with RESTful APIs and web services
● Proficiency in working with relational databases like MySQL or PostgreSQL
● Practical experience with AWS cloud services and building scalable, microservices-based architectures
● Experience with build tools like Maven or Gradle
● Understanding of version control systems, especially Git
● Strong understanding of object-oriented programming principles and design patterns
● Familiarity with automated testing frameworks and methodologies
● Excellent problem-solving skills and attention to detail
● Strong communication skills and ability to work effectively in a collaborative team environment
Why Join Us?
● Opportunity to work on cutting-edge technology products
● A collaborative and learning-driven environment
● Exposure to AI and software engineering innovations
● Excellent work ethic and culture
If you're passionate about technology and want to work on impactful projects, we'd love to hear from you.
Job Summary
We are looking for an experienced Python DBA with strong expertise in Python scripting and SQL/NoSQL databases. The candidate will be responsible for database administration, automation, performance optimization, and ensuring availability and reliability of database systems.
Key Responsibilities
- Administer and maintain SQL and NoSQL databases
- Develop Python scripts for database automation and monitoring
- Perform database performance tuning and query optimization
- Manage backups, recovery, replication, and high availability
- Ensure data security, integrity, and compliance
- Troubleshoot and resolve database-related issues
- Collaborate with development and infrastructure teams
- Monitor database health and performance
- Maintain documentation and best practices
Required Skills
- 10+ years of experience in Database Administration
- Strong proficiency in Python
- Experience with SQL databases (PostgreSQL, MySQL, Oracle, SQL Server)
- Experience with NoSQL databases (MongoDB, Cassandra, etc.)
- Strong understanding of indexing, schema design, and performance tuning
- Good analytical and problem-solving skills
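The automation and tuning duties above can be illustrated with a short sketch. This is not tied to any particular employer's stack: it uses Python's built-in sqlite3 module as a stand-in for a production database, and the table, column, and index names are invented for illustration. It uses EXPLAIN QUERY PLAN to detect a full table scan, then confirms that adding an index changes the access path:

```python
import sqlite3

def scan_detected(conn, query):
    """Return True if EXPLAIN QUERY PLAN reports a full table scan for the query."""
    plan = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    # Each plan row's last column is a detail string such as "SCAN orders"
    # or "SEARCH orders USING INDEX idx_orders_customer (customer_id=?)".
    return any("SCAN" in row[3] and "USING INDEX" not in row[3] for row in plan)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = 42"
before = scan_detected(conn, query)   # no index on customer_id yet: full scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = scan_detected(conn, query)    # now served by the index
print(before, after)  # True False
```

In a real DBA automation script the same idea would run against the production engine's plan output (e.g. `EXPLAIN` in PostgreSQL/MySQL) and feed a monitoring or alerting pipeline.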
Key Responsibilities
- Develop and maintain applications using Java 8/11/17, Spring Boot, and REST APIs.
- Design and implement microservices and backend components.
- Work with SQL/NoSQL databases, API integrations, and performance optimization.
- Collaborate with cross-functional teams and participate in code reviews.
- Deploy applications using CI/CD, Docker, Kubernetes, and cloud platforms (AWS/Azure/GCP).
Skills Required
- Strong in Core Java, OOPS, multithreading, collections.
- Hands-on with Spring Boot, Hibernate/JPA, Microservices.
- Experience with REST APIs, Git, and CI/CD pipelines.
- Knowledge of Docker/Kubernetes and cloud basics.
- Good understanding of database queries and performance tuning.
Nice to Have:
- Experience with messaging systems (Kafka/RabbitMQ).
- Basic frontend understanding (React/Angular).
- Must have strong SQL skills (queries, optimization, stored procedures, triggers), with hands-on experience automating tasks through SQL.
- 2+ years of experience working on large datasets (1 crore rows or more).
- Comfortable handling challenges and breaking down complex problems.
- Must have advanced Excel skills.
- Should have 3+ years of relevant experience.
- Should have reporting and dashboard creation experience.
- Should have database development and maintenance experience.
- Must have strong communication skills for client interactions.
- Should have the ability to work independently.
- Willingness to work from client location.
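As a concrete illustration of the trigger skills listed above, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the schema and names are hypothetical, not from any employer's system): an AFTER UPDATE trigger that writes an audit row whenever a balance changes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

-- Fire after every balance update and record the before/after values.
CREATE TRIGGER trg_balance_audit AFTER UPDATE OF balance ON accounts
BEGIN
  INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 150.0 WHERE id = 1")
print(conn.execute("SELECT * FROM audit_log").fetchall())  # [(1, 100.0, 150.0)]
```

The same pattern (OLD/NEW row references inside a trigger body) carries over to MySQL, PostgreSQL, and SQL Server, with each engine's own syntax.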
Forbes Advisor is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.
We do this by combining data-driven content, rigorous product comparisons, and user-first design all built on top of a modern, scalable platform. Our teams operate globally and bring deep expertise across journalism, product, performance marketing, and analytics.
The Role
We are hiring a Senior Data Engineer to help design and scale the infrastructure behind our analytics, performance marketing, and experimentation platforms.
This role is ideal for someone who thrives on solving complex data problems, enjoys owning systems end-to-end, and wants to work closely with stakeholders across product, marketing, and analytics.
You’ll build reliable, scalable pipelines and models that support decision-making and automation at every level of the business.
What you’ll do
● Build, maintain, and optimize data pipelines using Spark, Kafka, Airflow, and Python
● Orchestrate workflows across GCP (GCS, BigQuery, Composer) and AWS-based systems
● Model data using dbt, with an emphasis on quality, reuse, and documentation
● Ingest, clean, and normalize data from third-party sources such as Google Ads, Meta, Taboola, Outbrain, and Google Analytics
● Write high-performance SQL and support analytics and reporting teams in self-serve data access
● Monitor and improve data quality, lineage, and governance across critical workflows
● Collaborate with engineers, analysts, and business partners across the US, UK, and India
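The ingest-and-normalize step above hinges on the fact that third-party ad platforms rarely agree on field names or formats. Here is a tiny illustrative sketch (all field names and data are invented, and real pipelines would do this inside Spark or dbt rather than plain Python) of mapping heterogeneous source columns onto one canonical schema:

```python
import csv
import io
from datetime import datetime

# Hypothetical column aliases: each source exports the same concepts
# under different names.
FIELD_MAP = {
    "spend_usd": "spend", "cost": "spend",
    "day": "date", "report_date": "date",
}

def normalize(rows):
    """Rename source-specific columns, re-emit dates as ISO strings, coerce spend to float."""
    out = []
    for row in rows:
        clean = {FIELD_MAP.get(k, k): v for k, v in row.items()}
        clean["date"] = datetime.strptime(clean["date"], "%Y-%m-%d").date().isoformat()
        clean["spend"] = round(float(clean["spend"]), 2)
        out.append(clean)
    return out

# Two invented exports with mismatched headers:
google = csv.DictReader(io.StringIO("day,spend_usd\n2024-05-01,12.5\n"))
meta = csv.DictReader(io.StringIO("report_date,cost\n2024-05-01,7.25\n"))
combined = normalize(list(google)) + normalize(list(meta))
print(combined)
```

The key design choice is centralizing the alias map so that adding a new source means adding rows to `FIELD_MAP`, not new parsing code.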
What You Bring
● 4+ years of data engineering experience, ideally in a global, distributed team
● Strong Python development skills and experience
● Expert in SQL for data transformation, analysis, and debugging
● Deep knowledge of Airflow and orchestration best practices
● Proficient in dbt (data modeling, testing, release workflows)
● Experience with GCP (BigQuery, GCS, Composer); AWS familiarity is a plus
● Strong grasp of data governance, observability, and privacy standards
● Excellent written and verbal communication skills
Nice to have
● Experience working with digital marketing and performance data, including:
Google Ads, Meta (Facebook), TikTok, Taboola, Outbrain, Google Analytics (GA4)
● Familiarity with BI tools like Tableau or Looker
● Exposure to attribution models, media mix modeling, or A/B testing infrastructure
● Collaboration experience with data scientists or machine learning workflows
Why Join Us
● Monthly long weekends — every third Friday off
● Wellness reimbursement to support your health and balance
● Paid parental leave
● Remote-first with flexibility and trust
● Work with a world-class data and marketing team inside a globally recognized brand
About Kanerika:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC 2, and GDPR compliance.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Role Responsibilities:
The following are high-level responsibilities you will take on, but are not limited to:
- Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
- Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
- Enable business analytics and self-service reporting through Power BI and other visualization tools.
- Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
- Implement and enforce best practices for data governance, data quality, and security.
- Mentor and guide junior data engineers; establish coding and design standards.
- Evaluate emerging technologies and tools to continuously improve the data ecosystem.
Required Qualifications:
- Bachelor’s/Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- 7+ years of experience in data engineering or data platform development
- Strong hands-on experience in SQL, Snowflake, Python, and Airflow
- Solid understanding of data modeling, data governance, security, and CI/CD practices.
Preferred Qualifications:
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need?
- B.Tech in Computer Science or equivalent.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Job Details
- Job Title: Java Full Stack Developer
- Industry: Global digital transformation solutions provider
- Domain: Information technology (IT)
- Experience Required: 5-7 years
- Working Mode: 3 days in office, Hybrid model.
- Job Location: Bangalore
- CTC Range: Best in Industry
Job Description:
SDET (Software Development Engineer in Test)
Job Responsibilities:
• Test Automation:
- Develop, maintain, and execute automated test scripts using test automation frameworks.
- Design and implement testing tools and frameworks to support automated testing.
• Software Development:
- Participate in the design and development of software components to improve testability.
- Write code actively, contribute to the development of tools, and work closely with developers to debug complex issues.
• Quality Assurance:
- Collaborate with the development team to understand software features and technical implementations.
- Develop quality assurance standards and ensure adherence to best testing practices.
• Integration Testing:
- Conduct integration and functional testing to ensure that components work as expected individually and when combined.
• Performance and Scalability Testing:
- Perform performance and scalability testing to identify bottlenecks and optimize application performance.
• Test Planning and Execution:
- Create detailed, comprehensive, and well-structured test plans and test cases.
- Execute manual and/or automated tests and analyze results to ensure product quality.
• Bug Tracking and Resolution:
- Identify, document, and track software defects using bug tracking tools.
- Verify fixes and work closely with developers to resolve issues.
• Continuous Improvement:
- Stay updated on emerging tools and technologies relevant to the SDET role.
- Constantly look for ways to improve testing processes and frameworks.
Skills and Qualifications:
• Strong programming skills, particularly in languages such as COBOL, JCL, Java, C#, Python, or JavaScript.
• Strong experience in Mainframe environments.
• Experience with test automation tools and frameworks such as Selenium, JUnit, TestNG, or Cucumber.
• Excellent problem-solving skills and attention to detail.
• Familiarity with CI/CD tools and practices such as Jenkins, Git, and Docker.
• Good understanding of web technologies and databases is often beneficial.
• Strong communication skills for interfacing with cross-functional teams.
Qualifications
• 5+ years of experience as a software developer, QA Engineer, or SDET.
• 5+ years of hands-on experience with Java or Selenium.
• 5+ years of hands-on experience with Mainframe environments.
• 4+ years designing, implementing, and running test cases.
• 4+ years working with test processes, methodologies, tools, and technology.
• 4+ years performing functional and UI testing and quality reporting.
• 3+ years of technical QA management experience leading onshore and offshore resources.
• Passion for driving best practices in the testing space.
• Thorough understanding of functional, stress, performance, various forms of regression, and mobile testing.
• Knowledge of software engineering practices and agile approaches.
• Experience building or improving test automation frameworks.
• Proficiency in CI/CD integration and pipeline development in Jenkins, Spinnaker, or similar tools.
• Proficiency in UI automation (Serenity/Selenium, Robot, Watir).
• Experience with Gherkin (BDD/TDD).
• Ability to quickly tackle and diagnose issues within the quality assurance environment and communicate that knowledge to a varied audience of technical and non-technical partners.
• Strong desire to establish and improve product quality.
• Willingness to take on challenges while being part of a team.
• Ability to work under tight deadlines and within a team environment.
• Experience in test automation using UFT and Selenium.
• UFT/Selenium experience in building object repositories, standard and custom checkpoints, parameterization, reusable functions, recovery scenarios, descriptive programming, and API testing.
• Knowledge of VBScript, C#, Java, HTML, and SQL.
• Experience using Git or other version control systems.
• Experience developing, supporting, and/or testing web applications.
• Understanding of the need for testing of security requirements.
• Ability to understand API JSON and XML formats, with experience using API testing tools such as Postman, Swagger, or SoapUI.
• Excellent communication, collaboration, reporting, analytical, and problem-solving skills.
• Solid understanding of the release cycle and QA/testing methodologies.
• ISTQB certification is a plus.
Skills: Python, Mainframe, C#
Notice period: 0 to 15 days only
About the Role
We are looking for a motivated Full Stack Developer with 2–5 years of hands-on experience in building scalable web applications. You will work closely with senior engineers and product teams to develop new features, improve system performance, and ensure high-quality code delivery.
Responsibilities
- Develop and maintain full-stack applications.
- Implement clean, maintainable, and efficient code.
- Collaborate with designers, product managers, and backend engineers.
- Participate in code reviews and debugging.
- Work with REST APIs/GraphQL.
- Contribute to CI/CD pipelines.
- Ability to work independently as well as within a collaborative team environment.
Required Technical Skills
- Strong knowledge of JavaScript/TypeScript.
- Experience with React.js, Next.js.
- Backend experience with Node.js, Express, NestJS.
- Understanding of SQL/NoSQL databases.
- Experience with Git, APIs, debugging tools.
- Cloud familiarity (AWS/GCP/Azure).
AI and System Mindset
Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.
Soft Skills
- Strong problem-solving ability.
- Good communication and teamwork.
- Fast learner and adaptable.
Education
Bachelor's degree in Computer Science / Engineering or equivalent.
Job Overview
As a Profile Data Setup Analyst, you will play a key role in configuring, analysing, and managing product data for our customers. You will work closely with internal teams and clients to ensure accurate, optimized, and timely data setup in Windowmaker software. This role is perfect for someone who enjoys problem-solving, working with data, and continuously learning.
Key Responsibilities
• Understand customer product configurations and translate them into structured data using Windowmaker Software.
• Set up and modify profile data including reinforcements, glazing, and accessories, aligned with customer-specific rules and industry practices.
• Analyse data, identify inconsistencies, and ensure high-quality output that supports accurate quoting and manufacturing.
• Collaborate with cross-functional teams (Sales, Software Development, Support) to deliver complete and tested data setups on time.
• Provide training, guidance, and documentation to internal teams and customers as needed.
• Continuously look for process improvements and contribute to knowledge-sharing across the team.
• Support escalated customer cases related to data accuracy or configuration issues.
• Ensure timely delivery of all assigned tasks while maintaining high standards of quality and attention to detail.
Required Qualifications
• 3–5 years of experience in a data-centric role.
• Bachelor’s degree in Engineering (e.g., Computer Science) or a related technical field.
• Experience with product data structures and product lifecycle.
• Strong analytical skills with a keen eye for data accuracy and patterns.
• Ability to break down complex product information into structured data elements.
• Eagerness to learn industry domain knowledge and software capabilities.
• Hands-on experience with Excel, SQL, or other data tools.
• Ability to manage priorities and meet deadlines in a fast-paced environment.
• Excellent written and verbal communication skills.
• A collaborative, growth-oriented mindset.
Nice to Have
• Prior exposure to ERP/CPQ/Manufacturing systems is a plus.
• Knowledge of the window and door (fenestration) industry is an added advantage.
Why Join Us
• Be part of a global product company with a solid industry reputation.
• Work on impactful projects that directly influence customer success.
• Collaborate with a talented, friendly, and supportive team.
• Learn, grow, and make a difference in the digital transformation of the fenestration industry.
About the Role
We're seeking a Python Backend Developer to join our insurtech analytics team. This role focuses on developing backend APIs, automating insurance reporting processes, and supporting data analysis tools. You'll work with insurance data, build REST APIs, and help streamline operational workflows through automation.
Key Responsibilities
- Automate insurance reporting processes including bordereaux, reconciliations, and data extraction from various file formats
- Support and maintain interactive dashboards and reporting tools for business stakeholders
- Develop Python scripts and applications for data processing, validation, and transformation
- Develop and maintain backend APIs using FastAPI or Flask
- Perform data analysis and generate insights from insurance datasets
- Automate recurring analytical and reporting tasks
- Work with SQL databases to query, analyze, and extract data
- Collaborate with business users to understand requirements and deliver solutions
- Document code, processes, and create user guides for dashboards and tools
- Support data quality initiatives and implement validation checks
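Among the responsibilities above, reconciliation is the most mechanical to automate. The sketch below is purely illustrative (the policy IDs and figures are invented, and a real bordereaux job would read its inputs from Excel/CSV files rather than literals); it shows the core diff logic of comparing premium figures between two systems and surfacing breaks, using Decimal to avoid float rounding surprises with money:

```python
from decimal import Decimal

# Hypothetical policy-level premium figures from two systems.
bordereaux = {"POL-001": Decimal("120.50"), "POL-002": Decimal("88.00")}
ledger = {"POL-001": Decimal("120.50"), "POL-002": Decimal("90.00"),
          "POL-003": Decimal("15.75")}

def reconcile(a, b):
    """Return policies whose premiums differ or that appear in only one source.

    Each break maps policy -> (value_in_a, value_in_b); None marks a
    policy missing from that source.
    """
    breaks = {}
    for key in a.keys() | b.keys():  # union of policy IDs from both sides
        if a.get(key) != b.get(key):
            breaks[key] = (a.get(key), b.get(key))
    return breaks

print(reconcile(bordereaux, ledger))
# POL-001 matches; POL-002 differs; POL-003 exists only in the ledger.
```

The same shape generalizes: load both sides into dicts keyed by the reconciliation key (policy, claim, or transaction ID) and the break report falls out of a single pass over the key union.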
Requirements
Essential
- 2+ years of Python development experience
- Strong knowledge of Python libraries: Pandas, NumPy for data manipulation
- Experience building web applications or dashboards with Python frameworks
- Knowledge of FastAPI or Flask for building backend APIs and applications
- Proficiency in SQL and working with relational databases
- Experience with data visualization libraries (Matplotlib, Plotly, Seaborn)
- Ability to work with Excel, CSV, and other data file formats
- Strong problem-solving and analytical thinking skills
- Good communication skills to work with non-technical stakeholders
Desirable
- Experience in insurance or financial services industry
- Familiarity with insurance reporting processes (bordereaux, reconciliations, claims data)
- Experience with Azure cloud services (Azure Functions, Blob Storage, SQL Database)
- Experience with version control systems (Git, GitHub, Azure DevOps)
- Experience with API development and RESTful services
Tech Stack
Python 3.x, FastAPI, Flask, Pandas, NumPy, Plotly, Matplotlib, SQL Server, MS Azure, Git, Azure DevOps, REST APIs, Excel/CSV processing libraries
Senior Full Stack Developer – Analytics Dashboard
Job Summary
We are seeking an experienced Full Stack Developer to design and build a scalable, data-driven analytics dashboard platform. The role involves developing a modern web application that integrates with multiple external data sources, processes large datasets, and presents actionable insights through interactive dashboards.
The ideal candidate should be comfortable working across the full stack and have strong experience in building analytical or reporting systems.
Key Responsibilities
- Design and develop a full-stack web application using modern technologies.
- Build scalable backend APIs to handle data ingestion, processing, and storage.
- Develop interactive dashboards and data visualisations for business reporting.
- Implement secure user authentication and role-based access.
- Integrate with third-party APIs using OAuth and REST protocols.
- Design efficient database schemas for analytical workloads.
- Implement background jobs and scheduled tasks for data syncing.
- Ensure performance, scalability, and reliability of the system.
- Write clean, maintainable, and well-documented code.
- Collaborate with product and design teams to translate requirements into features.
Required Technical Skills
Frontend
- Strong experience with React.js
- Experience with Next.js
- Knowledge of modern UI frameworks (Tailwind, MUI, Ant Design, etc.)
- Experience building dashboards using chart libraries (Recharts, Chart.js, D3, etc.)
Backend
- Strong experience with Node.js (Express or NestJS)
- REST and/or GraphQL API development
- Background job systems (cron, queues, schedulers)
- Experience with OAuth-based integrations
Database
- Strong experience with PostgreSQL
- Data modelling and performance optimisation
- Writing complex analytical SQL queries
DevOps / Infrastructure
- Cloud platforms (AWS)
- Docker and basic containerisation
- CI/CD pipelines
- Git-based workflows
Experience & Qualifications
- 5+ years of professional full stack development experience.
- Proven experience building production-grade web applications.
- Prior experience with analytics, dashboards, or data platforms is highly preferred.
- Strong problem-solving and system design skills.
- Comfortable working in a fast-paced, product-oriented environment.
Nice to Have (Bonus Skills)
- Experience with data pipelines or ETL systems.
- Knowledge of Redis or caching systems.
- Experience with SaaS products or B2B platforms.
- Basic understanding of data science or machine learning concepts.
- Familiarity with time-series data and reporting systems.
- Familiarity with Meta Ads / Google Ads APIs
Soft Skills
- Strong communication skills.
- Ability to work independently and take ownership.
- Attention to detail and focus on code quality.
- Comfortable working with ambiguous requirements.
Ideal Candidate Profile (Summary)
A senior-level full stack engineer who has built complex web applications, understands data-heavy systems, and enjoys creating analytical products with a strong focus on performance, scalability, and user experience.
Employment Type: Full-time, Permanent
Location: Near Bommasandra Metro Station, Bangalore (Work from Office – 5 days/week)
Notice Period: 15 days or less preferred
About the Company:
SimStar Asia Ltd is a joint venture of the SimGems and StarGems Group — a Hong Kong–based multinational organization engaged in the global business of conflict-free, high-value diamonds.
SimStar maintains the highest standards of integrity. Any candidate found engaging in unfair practices at any stage of the interview process will be disqualified and blacklisted.
Experience Required
- 4+ years of relevant professional experience.
Key Responsibilities
- Hands-on backend development using Python (mandatory).
- Write optimized and complex SQL queries; perform query tuning and performance optimization.
- Work extensively with the Odoo framework, including development and deployment.
- Manage deployments using Docker and/or Kubernetes.
- Develop frontend components using OWL.js or any modern JavaScript framework.
- Design scalable systems with a strong foundation in Data Structures, Algorithms, and System Design.
- Handle API integrations and data exchange between systems.
- Participate in technical discussions and architecture decisions.
Interview Expectations
- Candidates must be comfortable writing live code during interviews.
- SQL queries and optimization scenarios will be part of the technical assessment.
Must-Have Skills
- Python backend development
- Advanced SQL
- Odoo Framework & Deployment
- Docker / Kubernetes
- JavaScript frontend (OWL.js preferred)
- System Design fundamentals
- API integration experience
About the role:
We are seeking a highly detail-oriented and experienced Payment Switch Manual Tester to join our Quality Assurance team. The ideal candidate will be responsible for rigorously testing and validating the functionality, reliability, and security of our core payment switch system, ensuring flawless transaction processing and compliance with all industry standards.
Key Responsibilities
- Test Planning & Design: Analyze payment switch requirements, technical specifications, and user stories to create comprehensive test plans, detailed test scenarios, and manual test cases.
- Test Execution: Execute functional, integration, regression, system, and end-to-end testing on the payment switch and related systems
- Transaction Flow Validation: Manually validate various payment transaction lifecycles, including authorization, clearing, settlement, and chargebacks for credit/debit cards, prepaid cards, and other payment methods.
- Defect Management: Identify, document, and track defects and inconsistencies using defect management tools (JIRA), and work closely with development teams to ensure timely resolution.
- Protocol & Scheme Testing: Test and validate messages and protocols (ISO 8583, SWIFT) and ensure compliance with card scheme mandates (Visa, Mastercard, RuPay).
- API Testing: Perform manual testing of APIs (REST/SOAP) related to payment processing, ensuring correct data validation, security, and error handling
- Data Validation: Execute SQL queries for backend database validation to ensure data integrity and consistency across the transaction lifecycle.
- Collaboration: Participate in Agile/Scrum ceremonies, provide testing estimates, and communicate test status and risks to stakeholders.
- Documentation: Prepare and maintain detailed test reports, test summary reports, and other necessary QA documentation.
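The ISO 8583 knowledge called for above often comes down to reading message bitmaps by hand when validating switch traffic. The snippet below is a deliberately simplified sketch (real messages carry an MTI, possible secondary bitmaps, and per-field length rules, and the hex value here is invented for illustration): it decodes which data elements a 64-bit primary bitmap declares present.

```python
def present_fields(bitmap_hex):
    """List the ISO 8583 data elements flagged in a 16-hex-digit primary bitmap.

    Bit 1 is the most significant bit; a set bit at position i means
    data element i is present in the message.
    """
    bits = bin(int(bitmap_hex, 16))[2:].zfill(64)
    return [i + 1 for i, b in enumerate(bits) if b == "1"]

# Illustrative bitmap with fields 2 (PAN), 3 (processing code), and
# 4 (transaction amount) set: binary 0111 followed by 60 zero bits.
print(present_fields("7000000000000000"))  # [2, 3, 4]
```

When a tester pulls a raw switch log, a helper like this makes it quick to check that the fields a scheme mandate requires are actually flagged in the message.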
Required Experience & Skills
- 3+ years of proven experience in manual software testing.
- 3+ years of direct experience testing Payment Switch systems and Card Management Systems.
- Strong understanding of the Payments Domain and the end-to-end transaction lifecycle (Authorization, Clearing, Settlement).
- In-depth knowledge of payment industry standards and protocols, such as ISO 8583.
- Proficiency in designing and executing various types of manual tests (Functional, Regression).
- Unix/Linux – Comfortable with command-line tools, log analysis.
- Experience with testing tools for APIs (Postman, SoapUI) and defect tracking (JIRA).
- Solid skills in writing and executing SQL queries for data validation.
- Excellent analytical, problem-solving, and communication skills (written and verbal).
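The ISO 8583 knowledge called for above centers on the MTI-plus-bitmap message layout: a 4-digit Message Type Indicator followed by a primary bitmap announcing which data elements are present. A minimal, illustrative parser (assuming a simplified ASCII-hex primary bitmap; real messages may use binary bitmaps, secondary bitmaps, and vendor-specific framing):

```python
# Extract the MTI and the set of present data elements from the
# primary bitmap of a simplified ISO 8583 message (ASCII-hex bitmap).

def parse_header(msg: str):
    mti = msg[:4]           # Message Type Indicator, e.g. 0200 = financial request
    bitmap_hex = msg[4:20]  # primary bitmap: 16 hex chars = 64 bits
    bitmap = int(bitmap_hex, 16)
    # Bit 1 (MSB) corresponds to data element 1, bit 64 (LSB) to element 64.
    fields = [i for i in range(1, 65) if bitmap & (1 << (64 - i))]
    return mti, fields

# A 0200 request with fields 2 (PAN), 3 (processing code), 4 (amount) present:
# bits 2, 3, 4 set -> hex bitmap 7000000000000000
mti, fields = parse_header("02007000000000000000")
print(mti, fields)   # 0200 [2, 3, 4]
```

In manual testing, checking the bitmap against the fields a scheme mandate requires is a common way to catch malformed authorization requests before they reach certification.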
Interview Process -
- Screening
- Virtual L1 interview
- Managerial Round
- HR Discussion
About Snabbit: Snabbit is India’s first Quick-Service App, delivering home services in just 15 minutes through a hyperlocal network of trained and verified professionals. Backed by Nexus Venture Partners (investors in Zepto, Unacademy, and Ultrahuman), Snabbit is redefining convenience in home services with quality and speed at its core. Founded by Aayush Agarwal, former Chief of Staff at Zepto, Snabbit is pioneering the Quick-Commerce revolution in services. In a short period, we’ve completed thousands of jobs with unmatched customer satisfaction and are scaling rapidly.
At Snabbit, we don’t just build products—we craft solutions that transform everyday lives. This is a playground for engineers who love solving complex problems, building systems from the ground up, and working in a fast-paced, ownership-driven environment. You’ll work alongside some of the brightest minds, pushing boundaries and creating meaningful impact at scale.
Responsibilities:
- Design, implement, and maintain backend services and APIs
- Develop and architect complex UI features for iOS and Android apps using Flutter
- Write high-quality, efficient, and maintainable code, adhering to industry best practices
- Participate in design discussions to develop scalable solutions and implement them
- Take ownership of feature delivery timelines and coordinate with cross-functional teams
- Troubleshoot and debug issues to ensure smooth system operations
- Design, develop, and own end-to-end features for in-house software and tools
- Optimize application performance and implement best practices for mobile development
- Deploy and maintain services infrastructure on AWS
Requirements:
- Education: Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
- Experience:
  - 3-5 years of hands-on experience as a full-stack developer
  - Expertise in developing backend services and mobile applications
  - Experience in leading small technical projects or features
  - Proven track record of delivering complex mobile applications to production
- Technical Skills:
  - Strong knowledge of data structures, algorithms, and design patterns
  - Proficiency in Python and advanced proficiency in Flutter, with a deep understanding of widget lifecycle and state management
  - Proficiency in RESTful APIs and microservices architecture
  - Knowledge of mobile app deployment processes and app store guidelines
  - Familiarity with version control systems (Git) and agile development methodologies
  - Experience with AWS or other relevant cloud technologies
  - Experience with databases (SQL, NoSQL) and data modeling
- Soft Skills:
  - Strong problem-solving and debugging abilities, with the ability to handle complex technical challenges and drive best practices within the team
  - Leadership qualities with the ability to mentor and guide junior developers
  - Strong stakeholder management and client communication skills
  - A passion for learning and staying updated with technology trends
Experience - 10-20 Yrs
Job Location - CommerZone, Yerwada, Pune
Work Mode - Work from Office
Shifts - General Shift
Work days - 5 days
Qualification - Full-time graduation mandatory
Domain - Payment/Card/Banking/BFSI/ Retail Payments
Job Type - Full Time
Notice period - Immediate or 30 days
Interview Process -
1) Screening
2) Virtual L1 interview
3) Managerial Round Face to Face at Pune Office
4) HR Discussion
Job Description
Job Summary:
The Production/L2 Application Support Manager will be responsible for managing the banking applications that support our payment gateway systems in a production environment. You will oversee the deployment, monitoring, optimization, and maintenance of all application components. You will ensure that our systems run smoothly, meet business and regulatory requirements, and provide high availability for our customers.
Key Responsibilities:
- Manage and optimize the application for the payment gateway systems to ensure high availability, reliability, and scalability.
- Oversee the day-to-day operations of production environments, including managing cloud services (AWS), load balancing, database systems, and monitoring tools.
- Lead a team of application support engineers and administrators, providing technical guidance and support to ensure applications and infrastructure solutions are implemented efficiently and effectively.
- Collaborate with development, security, and product teams to ensure applications support the needs of the business and comply with relevant regulations.
- Monitor application performance and system health using monitoring tools and ensure quick resolution of any performance bottlenecks or system failures.
- Develop and maintain capacity planning, monitoring, and backup strategies to ensure scalability and minimal downtime during peak transaction periods.
- Drive continuous improvement of processes and tools for efficient production/application management.
- Ensure robust security practices are in place across production systems, including compliance with industry standards
- Conduct incident response, root cause analysis, and post-mortem analysis to prevent recurring issues and improve system performance.
- Oversee regular patching, updates, and version control of production systems to minimize vulnerabilities.
- Develop and maintain application support documentation, including architecture diagrams, processes, and disaster recovery plans.
- Manage and execute on-call duties, ensuring timely resolution of application-related issues and ensuring proper support coverage.
Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- 8+ years of experience managing L2 application support in high-availability, mission-critical environments, ideally within a payment gateway or fintech organization.
- Experience working in L2 production support for Java-based applications.
- Experience with database systems (SQL, NoSQL) and database management, including high availability and disaster recovery strategies.
- Excellent communication and leadership skills, with the ability to collaborate effectively across teams and drive initiatives forward.
- Ability to work well under pressure and in high-stakes situations, ensuring uptime and service continuity.
We are looking for a skilled and motivated Integration Engineer to join our dynamic team in the payment domain. This role involves the seamless integration of payment systems, APIs, and third-party services into our platform, ensuring smooth and secure payment processing. The ideal candidate will bring experience with payment technologies, integration methodologies, and a strong grasp of industry standards.
Key Responsibilities:
- System Integration:
- Design, develop, and maintain integrations between various payment processors, gateways, and internal platforms using RESTful APIs, SOAP, and related technologies.
- Payment Gateway Integration:
- Integrate third-party payment solutions such as Visa, MasterCard, PayPal, Stripe, and others into the platform.
- Troubleshooting & Support:
- Identify and resolve integration issues including transactional failures, connectivity issues, and third-party service disruptions.
- Testing & Validation:
- Conduct end-to-end integration testing to ensure payment system functionality across development, staging, and production environments.
Qualifications:
- Education:
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field. Equivalent work experience is also acceptable.
- Experience:
- 3+ years of hands-on experience in integrating payment systems and third-party services.
- Proven experience with payment gateways (e.g., Stripe, Square, PayPal, Adyen) and protocols (e.g., ISO 20022, EMV).
- Familiarity with payment processing systems and industry standards.
Desirable Skills:
- Strong understanding of API security, OAuth, and tokenization practices.
- Experience with PCI-DSS compliance.
- Excellent problem-solving and debugging skills.
- Effective communication and cross-functional collaboration capabilities.
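As a sketch of the API-security practices listed above: many payment gateways sign webhook payloads with an HMAC over a shared secret, and the integrator verifies the signature before trusting the event. The secret and payload here are hypothetical, and the exact scheme (header name, hash, timestamp binding) varies by provider:

```python
import hmac
import hashlib

# Hypothetical shared secret; real gateways issue one per endpoint.
SECRET = b"whsec_demo_secret"

def sign(payload: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature of a webhook body."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, received_sig: str) -> bool:
    # compare_digest performs a constant-time comparison,
    # avoiding timing side channels on signature checks.
    return hmac.compare_digest(sign(payload), received_sig)

body = b'{"event":"payment.captured","amount":1000}'
sig = sign(body)
print(verify(body, sig))                     # True
print(verify(b'{"event":"tampered"}', sig))  # False
```

Tokenization follows the same trust-boundary idea: raw card data is swapped for an opaque token at the edge so downstream services never handle PAN data, which is central to keeping PCI-DSS scope small.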
We are looking for a Python Backend Developer to design, build, and maintain scalable backend services and APIs. The role involves working with modern Python frameworks, databases (SQL and NoSQL), and building well-tested, production-grade systems.
You will collaborate closely with frontend developers, AI/ML engineers, and system architects to deliver reliable and high-performance backend solutions.
Key Responsibilities
- Design, develop, and maintain backend services using Python
- Build and maintain RESTful APIs using FastAPI
- Design efficient data models and queries using MongoDB and SQL databases (PostgreSQL/MySQL)
- Ensure high performance, security, and scalability of backend systems
- Write unit tests, integration tests, and API tests to ensure code reliability
- Debug, troubleshoot, and resolve production issues
- Follow clean code practices, documentation, and version control workflows
- Participate in code reviews and contribute to technical discussions
- Work closely with cross-functional teams to translate requirements into technical solutions
Required Skills & Qualifications
Technical Skills
- Strong proficiency in Python
- Hands-on experience with FastAPI
- Experience with MongoDB (schema design, indexing, aggregation)
- Solid understanding of SQL databases and relational data modelling
- Experience writing and maintaining automated tests
- Unit testing (e.g., pytest)
- API testing
- Understanding of REST API design principles
- Familiarity with Git and collaborative development workflows
Good to Have
- Experience with async programming in Python (async/await)
- Knowledge of ORMs/ODMs (SQLAlchemy, Tortoise, Motor, etc.)
- Basic understanding of authentication & authorisation (JWT, OAuth)
- Exposure to Docker / containerised environments
- Experience working in Agile/Scrum teams
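The async/await item above is easy to illustrate: with `asyncio.gather`, independent I/O-bound calls run concurrently, so total latency tracks the slowest call rather than the sum. The `fetch` coroutine below is a stand-in for awaiting a real driver such as Motor or an async HTTP client:

```python
import asyncio
import time

# Simulate three independent I/O-bound calls (e.g. a DB query plus two API calls).
async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)   # stands in for real awaited I/O
    return f"{name}:done"

async def main() -> list[str]:
    start = time.perf_counter()
    # gather schedules the coroutines concurrently:
    # total time ~= max(delays), not their sum.
    results = await asyncio.gather(
        fetch("users", 0.2), fetch("orders", 0.2), fetch("prices", 0.2)
    )
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5  # well under the 0.6s a sequential version would take
    return results

results = asyncio.run(main())
print(results)   # ['users:done', 'orders:done', 'prices:done']
```

This is the main reason FastAPI endpoints that await their DB and HTTP calls scale well under concurrent load.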
What We Value
- Strong problem-solving and debugging skills
- Attention to detail and commitment to quality
- Ability to write testable, maintainable, and well-documented code
- Ownership mindset and willingness to learn
- Teamwork
What We Offer
- Opportunity to work on real-world, production systems
- Technically challenging problems and ownership of components
- Collaborative engineering culture
Review Criteria
- Strong Data Scientist / Machine Learning / AI Engineer profile
- 2+ years of hands-on experience as a Data Scientist or Machine Learning Engineer building ML models
- Strong expertise in Python with the ability to implement classical ML algorithms including linear regression, logistic regression, decision trees, gradient boosting, etc.
- Hands-on experience in a minimum of 2+ use cases among recommendation systems, image data, fraud/risk detection, price modelling, and propensity models
- Strong exposure to NLP, including text generation or text classification, embeddings, similarity models, user profiling, and feature extraction from unstructured text
- Experience productionizing ML models through APIs/CI/CD/Docker and working on AWS or GCP environments
- Preferred (Company) – Must be from product companies
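As an illustration of the classical-ML expectation above, logistic regression can be trained from scratch with plain gradient descent. This is a teaching sketch on a toy dataset; production work would use scikit-learn or an equivalent library:

```python
import math

def train(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression by batch gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        grad_w = [0.0] * len(w)
        grad_b = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            err = p - yi                     # gradient of log loss w.r.t. z
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy separable data: label 1 roughly when x0 + x1 > 1.
X = [[0.1, 0.2], [0.3, 0.1], [0.9, 0.8], [0.7, 0.9], [0.2, 0.3], [0.8, 0.6]]
y = [0, 0, 1, 1, 0, 1]
w, b = train(X, y)
preds = [predict(w, b, xi) for xi in X]
print(preds)   # matches y on this separable toy set
```

The same gradient-descent skeleton underlies the other classical models listed (linear regression swaps the sigmoid for identity and log loss for squared error).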
Job Specific Criteria
- CV Attachment is mandatory
- What's your current company?
- Which use cases do you have hands-on experience with?
- Are you ok for Mumbai location (if candidate is from outside Mumbai)?
- Reason for change (if candidate has been in current company for less than 1 year)?
- Reason for hike (if greater than 25%)?
Role & Responsibilities
- Partner with Product to spot high-leverage ML opportunities tied to business metrics.
- Wrangle large structured and unstructured datasets; build reliable features and data contracts.
- Build and ship models to:
- Enhance customer experiences and personalization
- Boost revenue via pricing/discount optimization
- Power user-to-user discovery and ranking (matchmaking at scale)
- Detect and block fraud/risk in real time
- Score conversion/churn/acceptance propensity for targeted actions
- Collaborate with Engineering to productionize via APIs/CI/CD/Docker on AWS.
- Design and run A/B tests with guardrails.
- Build monitoring for model/data drift and business KPIs
Ideal Candidate
- 2–5 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
- Proven, hands-on success in at least two (preferably 3–4) of the following:
- Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
- Fraud/risk detection (severe class imbalance, PR-AUC)
- Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
- Propensity models (payment/churn)
- Programming: strong Python and SQL; solid git, Docker, CI/CD.
- Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
- ML breadth: recommender systems, NLP or user profiling, anomaly detection.
- Communication: clear storytelling with data; can align stakeholders and drive decisions.
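The PR-AUC mentioned under fraud/risk detection is commonly approximated by average precision: rank predictions by score and integrate precision over each recall step. A pure-Python sketch on a toy imbalanced dataset (a library such as scikit-learn would be used in practice):

```python
def average_precision(y_true, scores):
    """Average precision: step-wise integral of the precision-recall curve."""
    # Rank examples by model score, highest first.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(y_true)
    tp, fp, ap, prev_recall = 0, 0, 0.0, 0.0
    for i in order:
        if y_true[i] == 1:
            tp += 1
        else:
            fp += 1
        recall = tp / total_pos
        precision = tp / (tp + fp)
        ap += (recall - prev_recall) * precision   # area of this recall step
        prev_recall = recall
    return ap

# Toy imbalanced data: 2 frauds among 8 transactions.
y = [1, 0, 0, 1, 0, 0, 0, 0]
s = [0.9, 0.8, 0.3, 0.7, 0.2, 0.1, 0.05, 0.4]
print(round(average_precision(y, s), 3))   # 0.833
```

Under severe class imbalance this is more informative than ROC-AUC, because precision is computed only over flagged transactions and is not inflated by the huge pool of easy negatives.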
Review Criteria:
- Strong Software Engineer fullstack profile using NodeJS / Python and React
- 6+ YOE in Software Development using Python OR NodeJS (For backend) & React (For frontend)
- Must have strong experience working with TypeScript
- Must have experience with message-based systems like Kafka, RabbitMQ, Redis
- Databases - PostgreSQL & NoSQL databases like MongoDB
- Product Companies Only
- Tier 1 Engineering Institutes preferred (IIT, NIT, BITS, IIIT, DTU or equivalent)
Preferred:
- Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
- Experience in mentoring, coaching the team.
Role & Responsibilities:
We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.
The Ideal Candidate Will Be Able To-
- Take ownership of delivering performant, scalable and high-quality cloud-based software, both frontend and backend side.
- Mentor team members to develop in line with product requirements.
- Collaborate with Senior Architect for design and technology choices for product development roadmap.
- Do code reviews.
Ideal Candidate:
- Thorough knowledge of developing cloud-based software including backend APIs and react based frontend.
- Thorough knowledge of scalable design patterns, message-based systems such as Kafka, RabbitMQ, and Redis, and data technologies including MongoDB, ORMs, and SQL.
- Experience with AWS services such as S3, IAM, Lambda etc.
- Expert level coding skills in Python FastAPI/Django, NodeJs, TypeScript, ReactJs.
- Eye for user responsive designs on the frontend.
We are seeking an experienced and highly skilled Java (Fullstack) Engineer to join our team.
The ideal candidate will have a strong background in both back-end (Java, Spring Boot, Spring Framework) and front-end (JavaScript, React or Angular) development, with the ability to build scalable, high-performance applications.
Responsibilities
- Develop, test, and deploy scalable and robust back-end services using Java and Spring Boot
- Build responsive and user-friendly front-end applications using modern JavaScript frameworks such as React or Angular
- Collaborate with architects and team members to design scalable, maintainable, and efficient systems
- Contribute to architectural decisions for microservices, APIs, and cloud solutions
- Implement and maintain RESTful APIs for seamless integration
- Write clean, efficient, and reusable code adhering to best practices
- Conduct code reviews, performance optimizations & debugging
- Work with cross functional teams, including UX/UI designers, product managers & QA team.
- Mentor junior developers & provide technical guidance.
Skills & Requirements
- Minimum 3 Years of experience in backend/ fullstack development
- Back-end - Core Java/Java 8, Spring Boot, Spring Framework, microservices, REST APIs, Kafka
- Front-end - JavaScript, HTML, CSS, TypeScript, Angular
- Database - MySQL
Preferred
- Experience with batch writing, application performance tuning, caching, and web security
- Experience working in fintech, payments, or high-scale production environments