50+ SQL Jobs in India
Responsibilities:
- Design, develop, and maintain efficient and reliable data pipelines.
- Identify and implement process improvements, automating manual tasks and optimizing data delivery.
- Build and maintain the infrastructure for data extraction, transformation, and loading (ETL) from diverse sources using SQL and AWS cloud technologies.
- Develop data tools and solutions to empower our analytics and data science teams, contributing to product innovation.
Qualifications:
Must Have:
- 2-3 years of experience in a Data Engineering role.
- Familiarity with data pipeline and workflow management tools (e.g., Airflow, Luigi, Azkaban).
- Experience with AWS cloud services.
- Working knowledge of object-oriented/functional scripting in Python.
- Experience building and optimizing data pipelines and datasets.
- Strong analytical skills and experience working with structured and unstructured data.
- Understanding of data transformation, data structures, dimensional modeling, metadata management, schema evolution, and workload management.
- A passion for building high-quality, scalable data solutions.
Good to have:
- Experience with stream-processing systems (e.g., Spark Streaming, Flink).
- Working knowledge of message queuing, stream processing, and scalable data stores.
- Proficiency in SQL and experience with NoSQL databases like Elasticsearch and Cassandra/MongoDB.
- Experience with big data tools such as HDFS/S3, Spark/Flink, Hive, HBase, Kafka/Kinesis.
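For context on the workflow tooling named above (Airflow et al.), here is a minimal, hedged sketch of a daily ETL DAG using Airflow 2.x's TaskFlow API; the DAG name, tasks, and data are placeholders, not anything specified by this posting.

```python
# Minimal daily ETL DAG sketch (Airflow 2.4+ TaskFlow API).
# All names and data here are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_etl():
    @task
    def extract() -> list:
        # Stand-in for pulling rows from a source system (API, S3, database).
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": -5.0}]

    @task
    def transform(rows: list) -> list:
        # Drop invalid rows before loading.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list) -> None:
        print(f"loading {len(rows)} rows into the warehouse")

    load(transform(extract()))


daily_sales_etl()
```

A real pipeline would swap the stub tasks for hooks/operators against the actual sources and warehouse.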
Position: .NET Core Intern (.NET Core knowledge is a must)
Education: BTech-Computer Science Only
Joining: Immediate Joiner
Work Mode: Remote
Working Days: Monday to Friday
Shift: Rotational (based on project need):
· 5:00 PM – 2:00 AM IST
· 6:00 PM – 3:00 AM IST
Job Summary
ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience in building scalable web applications while maintaining a healthy work-life balance through remote work opportunities.
Eligibility & Qualifications
- Education:
- B.Tech (Computer Science) / M.Tech (Computer Science)
- Tier 1 colleges preferred
- Experience Level: Fresher
- Communication: Excellent English communication skills (verbal & written)
Skills Required
1. Technical Skills (Must Have)
- Experience with .NET Core (.NET 6 / 7 / 8)
- Strong knowledge of C#, including:
- Object-Oriented Programming (OOP) concepts
- async/await
- LINQ
- ASP.NET Core (Web API / MVC)
2. Database Skills
- SQL Server (preferred)
- Writing complex SQL queries, joins, and subqueries
- Stored Procedures, Functions, and Indexes
- Database design and performance tuning
- Entity Framework Core
- Migrations and transaction handling
3. Frontend Skills (Required)
- JavaScript (ES5 / ES6+)
- jQuery
- DOM manipulation
- AJAX calls
- Event handling
- HTML5 & CSS3
- Client-side form validation
4. Security & Performance
- Data validation and exception handling
- Caching concepts (In-memory / Redis – good to have)
5. Tools & Environment
- Visual Studio / VS Code
- Git (GitHub / Azure DevOps)
- Basic knowledge of server deployment
6. Good to Have (Optional)
- Azure or AWS deployment experience
- CI/CD pipelines
- Docker
- Experience with data handling
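To illustrate the level of SQL implied by the database-skills list above (joins, subqueries, aggregation), here is a self-contained sketch; the schema is invented, and Python's sqlite3 is used only as a convenient harness, since the SQL itself is what matters.

```python
# Invented two-table schema; the query shows a join, grouping, and a subquery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 500.0), (2, 1, 250.0), (3, 2, 90.0);
""")

# Customers whose total spend exceeds the average order amount.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id
    HAVING total > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)  # [('Asha', 750.0)]
```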
Work Environment & Tools
- Comfortable working in a remote setup
- Familiarity with collaboration and remote access tools
Additional Requirements (Work-from-Home Setup)
This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:
- System: Laptop or Desktop (Windows-based)
- Operating System: Windows
- Screen Size: Minimum 14 inches
- Screen Resolution: Full HD (1920 × 1080)
- Processor: Intel i5 or higher
- RAM: Minimum 8 GB (Mandatory)
- Software: AnyDesk
- Internet Speed: 100 Mbps or higher
About ARDEM
ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.
Job Description -
Profile: .Net Full Stack Lead
Experience Required: 7–12 Years
Location: Pune, Bangalore, Chennai, Coimbatore, Delhi, Hosur, Hyderabad, Kochi, Kolkata, Trivandrum
Work Mode: Hybrid
Shift: Normal Shift
Key Responsibilities:
- Design, develop, and deploy scalable microservices using .NET Core and C#
- Build and maintain serverless applications using AWS services (Lambda, SQS, SNS)
- Develop RESTful APIs and integrate them with front-end applications
- Work with both SQL and NoSQL databases to optimize data storage and retrieval
- Implement Entity Framework for efficient database operations and ORM
- Lead technical discussions and provide architectural guidance to the team
- Write clean, maintainable, and testable code following best practices
- Collaborate with cross-functional teams to deliver high-quality solutions
- Participate in code reviews and mentor junior developers
- Troubleshoot and resolve production issues in a timely manner
Required Skills & Qualifications:
- 7–12 years of hands-on experience in .NET development
- Strong proficiency in .NET Framework, .NET Core, and C#
- Proven expertise with AWS services (Lambda, SQS, SNS)
- Solid understanding of SQL and NoSQL databases (SQL Server, MongoDB, DynamoDB, etc.)
- Experience building and deploying Microservices architecture
- Proficiency in Entity Framework or EF Core
- Strong knowledge of RESTful API design and development
- Experience with React or Angular is a plus
- Understanding of CI/CD pipelines and DevOps practices
- Strong debugging, performance optimization, and problem-solving skills
- Experience with design patterns, SOLID principles, and best coding practices
- Excellent communication and team leadership skills
• Minimum 4+ years of experience
• Experience in designing, developing, and maintaining backend services using C# 12 and .NET 8 or .NET 9
• Experience in building and operating cloud-native and serverless applications on AWS
• Experience in developing and integrating services using AWS Lambda, API Gateway, DynamoDB, EventBridge, CloudWatch, SQS, SNS, Kinesis, Secrets Manager, S3 storage, serverless architectural models, etc.
• Experience in integrating services using the AWS SDK
• Should be cognizant of OMS paradigms including Inventory Management, inventory publish, supply feed processing, control mechanisms, ATP publish, Order Orchestration, workflow setup and customizations, integrations with tax, AVS, and payment engines, sourcing algorithms, and managing reservations with back orders, schedule mechanisms, flash sales management, etc.
• Should have decent end-to-end knowledge of various Commerce subsystems, including Storefront, Core Commerce back end, Post-Purchase processing, OMS, Store/Warehouse Management processes, and Supply Chain and Logistics processes. This is to ascertain the candidate's know-how of the overall Retail landscape of any customer.
• Strong knowledge of querying in Oracle DB and SQL Server
• Able to read, write, and manage PL/SQL procedures in Oracle
• Strong debugging, performance tuning, and problem-solving skills
• Experience with event-driven and microservices architectures
Job Details
- Job Title: Staff Engineer
- Industry: Technology
- Domain - Information technology (IT)
- Experience Required: 9-12 years
- Employment Type: Full Time
- Job Location: Bengaluru
- CTC Range: Best in Industry
Role & Responsibilities
As a Staff Engineer at company, you will play a critical role in defining and driving our backend architecture as we scale globally. You’ll own key systems that handle high volumes of data and transactions, ensuring performance, reliability, and maintainability across distributed environments.
Key Responsibilities-
- Own one or more core applications end-to-end, ensuring reliability, performance, and scalability.
- Lead the design, architecture, and development of complex, distributed systems, frameworks, and libraries aligned with company’s technical strategy.
- Drive engineering operational excellence by defining robust roadmaps for system reliability, observability, and performance improvements.
- Analyze and optimize existing systems for latency, throughput, and efficiency, ensuring they perform at scale.
- Collaborate cross-functionally with Product, Data, and Infrastructure teams to translate business requirements into technical deliverables.
- Mentor and guide engineers, fostering a culture of technical excellence, ownership, and continuous learning.
- Establish and uphold coding standards, conduct design and code reviews, and promote best practices across teams.
- Stay ahead of the curve on emerging technologies, frameworks, and patterns to strengthen company’s technology foundation.
- Contribute to hiring by identifying and attracting top-tier engineering talent.
Ideal Candidate
- Strong staff engineer profile
- Must have 9+ years in backend engineering with Java, Spring/Spring Boot, and microservices building large and scalable systems
- Must have been SDE-3 / Tech Lead / Lead SE for at least 2.5 years
- Strong in DSA, system design, design patterns, and problem-solving
- Proven experience building scalable, reliable, high-performance distributed systems
- Hands-on with SQL/NoSQL databases, REST/gRPC APIs, concurrency & async processing
- Experience in AWS/GCP, CI/CD pipelines, and observability/monitoring
- Excellent ability to explain complex technical concepts to varied stakeholders
- Product companies (B2B SaaS preferred)
- Must have stayed for at least 2 years with each of the previous companies
- (Education): B.Tech in computer science from Tier 1, Tier 2 colleges
Job Title : System Support Engineer – L1
Experience : 2.5+ Years
Location : Mumbai (Powai)
Shift : Rotational
Role Summary :
Provide first-level technical and functional support for enterprise applications and infrastructure. Handle user issues, troubleshoot systems, and ensure timely resolution while following support processes.
Key Responsibilities :
- Provide phone/email support and own user issues end-to-end.
- Log, track, and update tickets in Jira/Freshdesk.
- Troubleshoot Linux/UNIX systems, web servers, and databases.
- Escalate unresolved issues and communicate during downtimes.
- Create knowledge base articles and support documentation.
Mandatory Skills :
Linux/UNIX administration, Apache/Tomcat/JBoss, basic SQL databases (MySQL/SQL Server/Oracle), scripting knowledge, and ticketing tools experience.
Preferred :
- Banking/Financial Services domain exposure and client-site support experience.
- Strong communication skills, customer-focused mindset, and willingness to work in rotational shifts are essential.
Job Details
- Job Title: Lead I - Software Engineering - Java, Spring Boot, Microservices
- Industry: Global digital transformation solutions provider
- Domain - Information technology (IT)
- Experience Required: 5-7 years
- Employment Type: Full Time
- Job Location: Trivandrum, Chennai, Kochi, Thiruvananthapuram
- CTC Range: Best in Industry
Job Description
Job Title: Senior Java Developer
Experience: 5+ years
Job Summary:
We are looking for a Senior Java Developer with strong experience in Spring Boot and Microservices to work on high-performance applications for a leading financial services client. The ideal candidate will have deep expertise in Java backend development, cloud (preferably GCP), and strong problem-solving abilities.
Key Responsibilities:
• Develop and maintain Java-based microservices using Spring Boot
• Collaborate with Product Owners and teams to gather and review requirements
• Participate in design reviews, code reviews, and unit testing
• Ensure application performance, scalability, and security
• Contribute to solution architecture and design documentation
• Support Agile development processes including daily stand-ups and sprint planning
• Mentor junior developers and lead small modules or features
Required Skills:
• Java, Spring Boot, Microservices architecture
• GCP (or other cloud platforms like AWS)
• REST/SOAP APIs, Hibernate, SQL, Tomcat
• CI/CD tools: Jenkins, Bitbucket
• Agile methodologies (Scrum/Kanban)
• Unit testing (JUnit), debugging and troubleshooting
• Good communication and team leadership skills
Preferred Skills:
• Frontend familiarity (Angular, AJAX)
• Experience with API documentation tools (Swagger)
• Understanding of design patterns and UML
• Exposure to Confluence, Jira
Mandatory Skills Required:
Strong proficiency in Java, Spring Boot, Microservices, and GCP/AWS.
Experience Required: Minimum 5+ years of relevant experience
Java/J2EE (5+ years), Spring/Spring Boot (5+ years), Microservices (5+ years), AWS/GCP/Azure (mandatory), CI/CD (Jenkins, SonarQube, Git)
Java, Spring Boot, Microservices architecture
GCP (or other cloud platforms like AWS)
REST/SOAP APIs, Hibernate, SQL, Tomcat
CI/CD tools: Jenkins, Bitbucket
Agile methodologies (Scrum/Kanban)
Unit testing (JUnit), debugging and troubleshooting
Good communication and team leadership skills
******
Notice period - 0 to 15 days only (immediate joiners who can join by Feb)
Job stability is mandatory
Location: Trivandrum, Kochi, Chennai
Virtual Interview - 14th Feb 2026
We are looking for a Staff Engineer - PHP to join one of our engineering teams at our office in Hyderabad.
What would you do?
- Design, build, and maintain backend systems and APIs from requirements to production.
- Own feature development, bug fixes, and performance optimizations.
- Ensure code quality, security, testing, and production readiness.
- Collaborate with frontend, product, and QA teams for smooth delivery.
- Diagnose and resolve production issues and drive long-term fixes.
- Contribute to technical discussions and continuously improve engineering practices.
Who Should Apply?
- 4–6 years of hands-on experience in backend development using PHP.
- Strong proficiency with Laravel or similar PHP frameworks, following OOP, MVC, and design patterns.
- Solid experience in RESTful API development and third-party integrations.
- Strong understanding of SQL databases (MySQL/PostgreSQL); NoSQL exposure is a plus.
- Comfortable with Git-based workflows and collaborative development.
- Working knowledge of HTML, CSS, and JavaScript fundamentals.
- Experience with performance optimization, security best practices, and debugging.
- Nice to have: exposure to Docker, CI/CD pipelines, cloud platforms, and automated testing.
Job Description: Data Analyst
About the Role
We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.
Key Responsibilities
- Data Extraction & Management
- Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.
- Ensure accuracy, reliability, and consistency of data across different platforms.
- Data Analysis & Insights
- Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.
- Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.
- Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.
- Business Intelligence & Visualization
- Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).
- Create visualizations that simplify complex datasets for stakeholders and management.
- Python (Pandas)
- Use Python (Pandas, NumPy) for advanced analytics.
- Collaboration & Stakeholder Management
- Work closely with product, operations, and leadership teams to provide insights that drive decision-making.
- Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.
Required Skills
- SQL/PostgreSQL
- Complex joins, window functions, CTEs, aggregations, query optimization.
- Python (Pandas & Analytics)
- Data wrangling, cleaning, transformations, exploratory data analysis (EDA).
- Libraries: Pandas, NumPy, Matplotlib, Seaborn
- Data Visualization & BI Tools
- Expertise in creating dashboards and reports using Metabase or Looker.
- Ability to translate raw data into meaningful visual insights.
- Business Intelligence
- Strong analytical reasoning to connect data insights with e-commerce KPIs.
- Experience in funnel analysis, customer journey mapping, and retention analysis.
- Analytics & E-commerce Knowledge
- Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.
- General Skills
- Strong communication and presentation skills.
- Ability to work cross-functionally in fast-paced environments.
- Problem-solving mindset with attention to detail.
Education: Bachelor’s degree in Data Science, Computer Science, or a related data-processing field.
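As a rough illustration of the cohort/retention work named in this posting, the following pandas sketch builds a simple retention matrix; the DataFrame and its columns are invented for the example.

```python
# Toy orders data; cohort = month of each customer's first order.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2024-01-25", "2024-02-02"]),
    "amount": [120.0, 80.0, 50.0, 75.0, 200.0],
})

orders["cohort"] = (orders.groupby("customer_id")["order_date"]
                    .transform("min").dt.to_period("M"))
orders["order_month"] = orders["order_date"].dt.to_period("M")

# Retention matrix: distinct active customers per cohort per calendar month.
retention = (orders.groupby(["cohort", "order_month"])["customer_id"]
             .nunique().unstack(fill_value=0))
print(retention)
```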
JOB DETAILS:
* Job Title: DevOps Engineer (Azure)
* Industry: Technology
* Salary: Best in Industry
* Experience: 2-5 years
* Location: Bengaluru, Koramangala
Review Criteria
- Strong Azure DevOps Engineer Profiles.
- Must have minimum 2+ years of hands-on experience as an Azure DevOps Engineer with strong exposure to Azure DevOps Services (Repos, Pipelines, Boards, Artifacts).
- Must have strong experience in designing and maintaining YAML-based CI/CD pipelines, including end-to-end automation of build, test, and deployment workflows.
- Must have hands-on scripting and automation experience using Bash, Python, and/or PowerShell
- Must have working knowledge of databases such as Microsoft SQL Server, PostgreSQL, or Oracle Database
- Must have experience with monitoring, alerting, and incident management using tools like Grafana, Prometheus, Datadog, or CloudWatch, including troubleshooting and root cause analysis
Preferred
- Knowledge of containerisation and orchestration tools such as Docker and Kubernetes.
- Knowledge of Infrastructure as Code and configuration management tools such as Terraform and Ansible.
- Preferred (Education) – BE/BTech / ME/MTech in Computer Science or related discipline
Role & Responsibilities
- Build and maintain Azure DevOps YAML-based CI/CD pipelines for build, test, and deployments.
- Manage Azure DevOps Repos, Pipelines, Boards, and Artifacts.
- Implement Git branching strategies and automate release workflows.
- Develop scripts using Bash, Python, or PowerShell for DevOps automation.
- Monitor systems using Grafana, Prometheus, Datadog, or CloudWatch and handle incidents.
- Collaborate with dev and QA teams in an Agile/Scrum environment.
- Maintain documentation, runbooks, and participate in root cause analysis.
Ideal Candidate
- 2–5 years of experience as an Azure DevOps Engineer.
- Strong hands-on experience with Azure DevOps CI/CD (YAML) and Git.
- Experience with Microsoft Azure (OCI/AWS exposure is a plus).
- Working knowledge of SQL Server, PostgreSQL, or Oracle.
- Good scripting, troubleshooting, and communication skills.
- Bonus: Docker, Kubernetes, Terraform, Ansible experience.
- Comfortable with WFO (Koramangala, Bangalore).
Application Architect – .NET
Role Overview
We are looking for a senior, hands-on Application Architect with deep .NET experience who can fix and modernize our current systems and build a strong engineering team over time.
Important – This role is hands-on with an architectural mindset; the person should be comfortable working with legacy systems and able to make and explain tradeoffs.
Key Responsibilities
Application Architecture & Modernization
- Own application architecture across legacy .NET Framework and modern .NET systems
- Review the existing application and drive an incremental modernization approach alongside new feature development as the company's business grows.
- Own the gradual move away from outdated patterns (Web Forms, tightly coupled MVC, legacy UI constructs)
- Define clean API contracts between front-end and backend services
- Identify and resolve performance bottlenecks across code and database layers
- Improve data access patterns, caching strategies, and system responsiveness
- Strong proponent of AI who has extensively used AI tools such as GitHub Copilot, Cursor, Windsurf, Codex, etc.
Backend, APIs & Integrations
- Design scalable backend services and APIs
- Improve how newer .NET services interact with legacy systems
- Lead integrations with external systems, including Zoho
- Prior experience integrating with Zoho (CRM, Finance, or other modules) is a strong value add
- Experience designing and implementing integrations using EDI standards
Data & Schema Design
- Review existing database schemas and core data structures
- Redesign data models to support growth and reporting/analytics requirements
- Optimize SQL queries to reduce load on the DB engine
Cloud Awareness
- Design applications with cloud deployment in mind (primarily Azure)
- Understand how to use Azure services to improve security, scalability, and availability
- Work with Cloud and DevOps teams to ensure application architecture aligns with cloud best practices
- Push for CI/CD automation so the team ships code regularly and makes steady progress.
Team Leadership & Best Practices
- Act as a technical leader and mentor for the engineering team
- Help hire, onboard, and grow a team under this role over time.
- Define KPIs and engineering best practices (including focus on documentation)
- Set coding standards, architectural guidelines, and review practices
- Improve testability and long-term health of the codebase
- Raise the overall engineering bar through reviews, coaching, and clear standards
- Create a culture of ownership and quality
Cross-Platform Thinking
- Strong communicator who can convert complex tech topics into business-friendly lingo. Understands the business needs and importance of user experience
- While .NET is the core stack, contribute to architecture decisions across platforms
- Leverages AI tools to accelerate design, coding, reviews, and troubleshooting while maintaining high quality
Skills and Experience
- 12+ years of hands-on experience in application development (preferably on .NET stack)
- Experience leading technical direction while remaining hands-on
- Deep expertise in .NET Framework (4.x) and modern .NET (.NET Core / .NET 6+)
- Must have led a project to modernize a legacy system, preferably moving from .NET Framework to .NET Core.
- Experience with MVC, Web Forms, and legacy UI patterns
- Solid backend and API design experience
- Strong understanding of database design and schema evolution
- Understanding of Analytical systems – OLAP, Data warehousing, data lakes.
- Strong proponent of AI who has extensively used AI tools such as GitHub Copilot, Cursor, Windsurf, Codex, etc.
- Integration with Zoho would be a plus.
About Cansvolution
Cansvolution is a growing IT services and product-based company based in Indore, M.P. We work with clients across industries, delivering scalable web and digital solutions. Our team focuses on innovation, practical problem-solving, and building technology that creates real business impact. We offer a collaborative work culture, hands-on learning, and strong growth opportunities for our employees.
Position: .NET Developer
Experience Required: Minimum 2+ Years
Location: Indore (Work From Office)
Joining: Immediate joiners preferred
Key Responsibilities
Design, develop, and maintain web applications using .NET technologies
Work on front-end development using React JS or Angular
Build and consume RESTful APIs
Collaborate with cross-functional teams including designers and backend developers
Debug, troubleshoot, and improve application performance
Participate in code reviews and follow best development practices
Required Skills
Strong experience in ASP.NET / .NET Core
Hands-on expertise in React JS or Angular
Good understanding of HTML, CSS, JavaScript
Experience with SQL databases
Knowledge of API integration
Understanding of software development lifecycle.
Preferred Skills
Experience working in Agile environments
Knowledge of version control tools like Git
Strong analytical and problem-solving abilities
Job Title : QA Lead (AI/ML Products)
Employment Type : Full Time
Experience : 4 to 8 Years
Location : On-site
Mandatory Skills : Strong hands-on experience in testing AI/ML (LLM, RAG) applications with deep expertise in API testing, SQL/NoSQL database validation, and advanced backend functional testing.
Role Overview :
We are looking for an experienced QA Lead who can own end-to-end quality for AI-influenced products and backend-heavy systems. This role requires strong expertise in advanced functional testing, API validation, database verification, and AI model behavior testing in non-deterministic environments.
Key Responsibilities :
- Define and implement comprehensive test strategies aligned with business and regulatory goals.
- Validate AI/ML and LLM-driven applications, including RAG pipelines, hallucination checks, prompt injection scenarios, and model response validation.
- Perform deep API testing using Postman/cURL and validate JSON/XML payloads.
- Execute complex SQL queries (MySQL/PostgreSQL) and work with MongoDB for backend and data integrity validation.
- Analyze server logs and transactional flows to debug issues and ensure system reliability.
- Conduct risk analysis and report key QA metrics such as defect leakage and release readiness.
- Establish and refine QA processes, templates, standards, and agile testing practices.
- Identify performance bottlenecks and basic security vulnerabilities (e.g., IDOR, data exposure).
- Collaborate closely with developers, product managers, and domain experts to translate business requirements into testable scenarios.
- Own feature quality independently from conception to release.
Required Skills & Experience :
- 4+ years of hands-on experience in software testing and QA.
- Strong understanding of testing AI/ML products, LLM validation, and non-deterministic behavior testing.
- Expertise in API Testing, server log analysis, and backend validation.
- Proficiency in SQL (MySQL/PostgreSQL) and MongoDB.
- Deep knowledge of SDLC and Bug Life Cycle.
- Strong problem-solving ability and structured approach to ambiguous scenarios.
- Awareness of performance testing and basic security testing practices.
- Excellent communication skills to articulate defects and QA strategies.
What We’re Looking For :
A proactive QA professional who can go beyond UI testing, understands backend systems deeply, and can confidently test modern AI-driven applications while driving quality standards across the team.
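The posting names Postman/cURL for API testing; as a language-neutral sketch of the same backend checks, here is a hedged example using Python's requests library against a hypothetical endpoint and payload.

```python
# Hypothetical endpoint; assertions mirror typical backend functional checks:
# status code, content type, and payload shape/values.
import requests

resp = requests.get("https://api.example.com/v1/orders/42", timeout=10)

assert resp.status_code == 200
assert resp.headers["Content-Type"].startswith("application/json")

body = resp.json()
assert body["order_id"] == 42
assert isinstance(body["items"], list)
print("basic API contract checks passed")
```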
AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.
Location : Bangalore, Hyderabad, Mumbai, and Gurgaon
Responsibilities:
· Designing, building, and operating scalable on-premises or cloud data architecture
· Analyzing business requirements and translating them into technical specifications
· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)
· Design, develop, and maintain scalable data pipelines and ETL processes
· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness
· Implement data governance and security best practices to ensure compliance and data integrity
· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring
· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
Requirements
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines
· Experience working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks
· Comprehensive understanding of the Snowflake and Databricks ecosystem
· Strong programming skills in languages like SQL and Python or PySpark.
· Experience with data modeling, ETL processes, and data warehousing concepts.
· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.
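For a flavor of the pipeline work this role describes, here is a minimal PySpark aggregation sketch; the bucket paths, columns, and filter are assumptions for illustration only.

```python
# Read raw orders, keep completed ones, aggregate per day, write curated output.
# Paths and column names are invented for the sketch.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")
daily = (orders
         .filter(F.col("status") == "COMPLETE")
         .groupBy(F.to_date("created_at").alias("order_date"))
         .agg(F.count("*").alias("orders"),
              F.sum("amount").alias("revenue")))
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")
```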
Review Criteria
- Strong Senior Backend Engineer profiles
- Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
- Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js)).
- Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
- Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
- Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
- Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
- (Company) – Must have worked in product companies / startups, preferably Series A to Series D
- (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred
Role & Responsibilities
As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.
This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.
Key Responsibilities-
- Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
- Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
- Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
- Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
- Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
- Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs
The Impact You'll Make-
- Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
- Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
- Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
- Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
- Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.
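Since FastAPI is among the frameworks the criteria name, here is a hedged sketch of a usage-metering endpoint of the kind a billing platform might expose; the routes, model, and in-memory store are illustrative assumptions, not the company's actual API.

```python
# Illustrative usage-metering API; an in-memory list stands in for the
# real metering pipeline and datastore.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()


class UsageEvent(BaseModel):
    customer_id: str
    metric: str
    quantity: float


EVENTS: list = []


@app.post("/usage", status_code=201)
def record_usage(event: UsageEvent) -> dict:
    if event.quantity < 0:
        raise HTTPException(status_code=422, detail="quantity must be non-negative")
    EVENTS.append(event)
    return {"accepted": True}


@app.get("/usage/{customer_id}/total")
def usage_total(customer_id: str, metric: str) -> dict:
    # Aggregate recorded usage for one customer and metric.
    total = sum(e.quantity for e in EVENTS
                if e.customer_id == customer_id and e.metric == metric)
    return {"customer_id": customer_id, "metric": metric, "total": total}
```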
JOB DETAILS:
* Job Title: Lead I - Software Engineering - Kotlin, Java, Spring Boot, AWS
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 5-7 years
* Location: Trivandrum, Thiruvananthapuram
Role Proficiency:
Act creatively to develop applications, selecting appropriate technical options and optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.
Skill Examples:
- Explain and communicate the design / development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing / debugging features / components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project related challenges
- Manage a team, mentor members, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team.
- Interface with other teams’ designers and other parallel practices
- Set goals for self and team. Provide feedback to team members
- Create and articulate impactful technical presentations
- Follow high level of business etiquette in emails and other business communication
- Drive conference calls with customers addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, and facilitate planning while handling multiple tasks.
- Build confidence with customers by meeting the deliverables on time with quality.
- Estimate time, effort, and resources required for developing / debugging features / components
- Make appropriate utilization of software / hardware.
- Strong analytical and problem-solving abilities
Knowledge Examples:
- Appropriate software programs / modules
- Functional and technical designing
- Programming languages – proficient in multiple skill clusters
- DBMS
- Operating Systems and software platforms
- Software Development Life Cycle
- Agile – Scrum or Kanban Methods
- Integrated development environment (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of customer domain and deep understanding of sub domain where problem is solved
Additional Comments:
We are seeking an experienced Senior Backend Engineer with strong expertise in Kotlin and Java to join our dynamic engineering team.
The ideal candidate will have a deep understanding of backend frameworks, cloud technologies, and scalable microservices architectures, with a passion for clean code, resilience, and system observability.
You will play a critical role in designing, developing, and maintaining core backend services that power our high-availability e-commerce and promotion platforms.
Key Responsibilities
Design, develop, and maintain backend services using Kotlin (JVM, Coroutines, Serialization) and Java.
Build robust microservices with Spring Boot and related Spring ecosystem components (Spring Cloud, Spring Security, Spring Kafka, Spring Data).
Implement efficient serialization/deserialization using Jackson and Kotlin Serialization.
Develop, maintain, and execute automated tests using JUnit 5, Mockk, and ArchUnit to ensure code quality.
Work with Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB, and Redis for data storage and caching needs.
Deploy and manage services in an AWS environment leveraging DynamoDB, Lambdas, and IAM.
Implement CI/CD pipelines with GitLab CI to automate build, test, and deployment processes.
Containerize applications using Docker and integrate monitoring using Datadog for tracing, metrics, and dashboards.
Define and maintain infrastructure as code using Terraform for services including GitLab, Datadog, Kafka, and Optimizely.
Develop and maintain RESTful APIs with OpenAPI (Swagger) and JSON API standards.
Apply resilience patterns using Resilience4j to build fault-tolerant systems.
Adhere to architectural and design principles such as Domain-Driven Design (DDD), Object-Oriented Programming (OOP), and Contract Testing (Pact).
Collaborate with cross-functional teams in an Agile Scrum environment to deliver high-quality features.
Utilize feature flagging tools like Optimizely to enable controlled rollouts.
Mandatory Skills & Technologies
Languages: Kotlin (JVM, Coroutines, Serialization), Java
Frameworks: Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data)
Serialization: Jackson, Kotlin Serialization
Testing: JUnit 5, Mockk, ArchUnit
Data: Kafka Streams (Avro), Oracle SQL (JDBC, JPA), DynamoDB (NoSQL), Redis (caching)
Cloud: AWS (DynamoDB, Lambda, IAM)
CI/CD: GitLab CI
Containers: Docker
Monitoring & Observability: Datadog (Tracing, Metrics, Dashboards, Monitors)
Infrastructure as Code: Terraform (GitLab, Datadog, Kafka, Optimizely)
API: OpenAPI (Swagger), REST API, JSON API
Resilience: Resilience4j
Architecture & Practices: Domain-Driven Design (DDD), Object-Oriented Programming (OOP), Contract Testing (Pact), Feature Flags (Optimizely)
Platforms: E-Commerce Platform (CommerceTools), Promotion Engine (Talon.One)
Methodologies: Scrum, Agile
Skills: Kotlin, Java, Spring Boot, AWS
Must-Haves
Kotlin (JVM, Coroutines, Serialization), Java, Spring Boot (Spring Cloud, Spring Security, Spring Kafka, Spring Data), AWS (DynamoDB, Lambda, IAM), Microservices Architecture
******
Notice period - 0 to 15 days only
Job stability is mandatory
Location: Trivandrum
Virtual Weekend Interview on 7th Feb 2026 - Saturday

🚀 Hiring: Associate Tech Architect / Senior Tech Specialist
🌍 Remote | Contract Opportunity
We’re looking for a seasoned tech professional who can lead the design and implementation of cloud-native data and platform solutions. This is a remote, contract-based role for someone with strong ownership and architecture experience.
🔴 Mandatory & Most Important Skill Set
Hands-on expertise in the following technologies is essential:
✅ AWS – Cloud architecture & services
✅ Python – Backend & data engineering
✅ Terraform – Infrastructure as Code
✅ Airflow – Workflow orchestration
✅ SQL – Data processing & querying
✅ DBT – Data transformation & modeling
💼 Key Responsibilities
- Architect and build scalable AWS-based data platforms
- Design and manage ETL/ELT pipelines
- Orchestrate workflows using Airflow
- Implement cloud infrastructure using Terraform
- Lead best practices in data architecture, performance, and scalability
- Collaborate with engineering teams and provide technical leadership
🎯 Ideal Profile
✔ Strong experience in cloud and data platform architecture
✔ Ability to take end-to-end technical ownership
✔ Comfortable working in a remote, distributed team environment
📄 Role Type: Contract
🌍 Work Mode: 100% Remote
If you have deep expertise in these core technologies and are ready to take on a high-impact architecture role, we’d love to hear from you.
JOB DETAILS:
* Job Title: Associate III - Azure Data Engineer
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description: Azure Data Engineer (4–6 Years Experience)
Job Type: Full-time
Locations: Kochi, Trivandrum
Must-Have Skills
Azure & Data Engineering
- Azure Data Factory (ADF)
- Azure Databricks (PySpark)
- Azure Synapse Analytics
- Azure Data Lake Storage Gen2
- Azure SQL Database
Programming & Querying
- Python (PySpark)
- SQL / Spark SQL
Data Modelling
- Star & Snowflake schema
- Dimensional modelling
Source Systems
- SQL Server
- Oracle
- SAP
- REST APIs
- Flat files (CSV, JSON, XML)
CI/CD & Version Control
- Git
- Azure DevOps / GitHub Actions
Monitoring & Scheduling
- ADF triggers
- Databricks jobs
- Log Analytics
Security
- Managed Identity
- Azure Key Vault
- Azure RBAC / Access Control
Soft Skills
- Strong analytical & problem-solving skills
- Good communication and collaboration
- Ability to work in Agile/Scrum environments
- Self-driven and proactive
Good-to-Have Skills
- Power BI basics
- Delta Live Tables
- Synapse Pipelines
- Real-time processing (Event Hub / Stream Analytics)
- Infrastructure as Code (Terraform / ARM templates)
- Data governance tools like Azure Purview
- Azure Data Engineer Associate (DP-203) certification
Educational Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
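To ground the star-schema/dimensional-modelling items listed above, here is a small PySpark/Spark SQL sketch in the Databricks style; the lake paths and table columns are invented.

```python
# Register a hypothetical fact and dimension as views, then join them in SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

spark.read.parquet("abfss://lake@account.dfs.core.windows.net/fact_sales") \
     .createOrReplaceTempView("fact_sales")
spark.read.parquet("abfss://lake@account.dfs.core.windows.net/dim_product") \
     .createOrReplaceTempView("dim_product")

revenue_by_category = spark.sql("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category
    ORDER BY revenue DESC
""")
revenue_by_category.show()
```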
Skills: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake Storage
Must-Haves
Azure Data Factory (4-6 years), Azure Databricks/PySpark (4-6 years), Azure Synapse Analytics (4-6 years), SQL/Spark SQL (4-6 years), Git/Azure DevOps (4-6 years)
Skills: Azure, Azure data factory, Python, Pyspark, Sql, Rest Api, Azure Devops
Relevant experience: 4-6 years
Python is mandatory
******
Notice period - 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F Interview 7th Feb
JOB DETAILS:
* Job Title: Associate III - Data Engineering
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description
Job Title:
Data Services Engineer – AWS & Snowflake
Job Summary:
As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.
You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.
Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.
Responsibilities:
• Design and implement scalable and secure data pipelines on AWS and Snowflake (Star/Snowflake schema)
• Optimize query performance using clustering keys, materialized views, and caching
• Develop and maintain Snowflake data warehouses and data marts.
• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks).
• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica)
• Utilize Snowpark and Python/Java for complex transformations
• Implement RBAC, data masking, and row-level security.
• Optimize data storage and retrieval for performance and cost-efficiency.
• Collaborate with stakeholders to gather data requirements and deliver solutions.
• Ensure data quality, governance, and compliance with industry standards.
• Monitor, troubleshoot, and resolve data pipeline and performance issues.
• Document data architecture, processes, and best practices.
• Support data migration and integration from various sources.
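As one concrete illustration of the Snowpark transformations mentioned above, this hedged Python sketch filters and aggregates a table; the connection parameters and table names are placeholders.

```python
# Connect, read a raw table, aggregate, and persist a result table.
# Every identifier below is a placeholder.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "ANALYTICS_WH", "database": "SALES", "schema": "PUBLIC",
}).create()

orders = session.table("RAW_ORDERS")
daily = (orders.filter(col("STATUS") == "COMPLETE")
               .group_by(col("ORDER_DATE"))
               .agg(sum_(col("AMOUNT")).alias("REVENUE")))
daily.write.save_as_table("DAILY_REVENUE", mode="overwrite")
```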
Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 3 to 4 years of hands-on experience in data engineering or data services.
• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).
• Strong expertise in Snowflake architecture, development, and optimization.
• Proficiency in SQL and Python for data manipulation and scripting.
• Solid understanding of ETL/ELT processes and data modeling.
• Experience with data integration tools and orchestration frameworks.
• Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
• AWS Glue, AWS Lambda, Amazon Redshift
• Snowflake Data Warehouse
• SQL & Python
Skills: Aws Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse
Must-Haves
AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)
Skills: AWS, AWS lambda, Snowflake, Data engineering, Snowpipe, Data integration tools, orchestration framework
Relevant experience: 4-6 years
Python is mandatory
******
Notice period - 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F Interview 7th Feb
JOB DETAILS:
* Job Title: Lead I - Web API, C# .NET, .NET Core, AWS (Mandatory)
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 6-9 years
* Location: Hyderabad
Job Description
Role Overview
We are looking for a highly skilled Senior .NET Developer who has strong experience in building scalable, high‑performance backend services using .NET Core and C#, with hands‑on expertise in AWS cloud services. The ideal candidate should be capable of working in an Agile environment, collaborating with cross‑functional teams, and contributing to both design and development. Experience with React and Datadog monitoring tools will be an added advantage.
Key Responsibilities
- Design, develop, and maintain backend services and APIs using .NET Core and C#.
- Work with AWS services (Lambda, S3, ECS/EKS, API Gateway, RDS, etc.) to build cloud‑native applications.
- Collaborate with architects and senior engineers on solution design and implementation.
- Write clean, scalable, and well‑documented code.
- Use Postman to build and test RESTful APIs.
- Participate in code reviews and provide technical guidance to junior developers.
- Troubleshoot and optimize application performance.
- Work closely with QA, DevOps, and Product teams in an Agile setup.
- (Optional) Contribute to frontend development using React.
- (Optional) Use Datadog for monitoring, logging, and performance metrics.
Required Skills & Experience
- 6+ years of experience in backend development.
- Strong proficiency in C# and .NET Core.
- Experience building RESTful services and microservices.
- Hands‑on experience with AWS cloud platform.
- Solid understanding of API testing using Postman.
- Knowledge of relational databases (SQL Server, PostgreSQL, etc.).
- Strong problem‑solving and debugging skills.
- Experience working in Agile/Scrum teams.
Good to Have
- Experience with React for frontend development.
- Exposure to Datadog for monitoring and logging.
- Knowledge of CI/CD tools (GitHub Actions, Jenkins, AWS CodePipeline, etc.).
- Containerization experience (Docker, Kubernetes).
Soft Skills
- Strong communication and collaboration abilities.
- Ability to work in a fast‑paced environment.
- Ownership mindset with a focus on delivering high‑quality solutions.
Skills
.NET Core, C#, AWS, Postman
Notice period - 0 to 15 days only
Location: Hyderabad
Virtual Interview: 7th Feb 2026
First round will be Virtual
2nd round will be F2F
Position Overview
We are seeking an experienced ERPNext/Frappe Developer to join our dynamic team at Dhwani. The ideal candidate will have strong expertise in developing, customizing, and maintaining ERPNext applications built on the Frappe Framework. This role involves working on complex business solutions, custom module development, and ensuring seamless integration with various business processes.
Key Responsibilities
Development & Customization
- Design, develop, and implement custom applications and modules on the Frappe Framework and ERPNext.
- Customize existing ERPNext modules (Accounting, CRM, HR, Inventory, Manufacturing, etc.) to meet specific business requirements.
- Build custom DocTypes, forms, reports, dashboards, and print formats.
- Develop and maintain REST APIs for system integrations.
- Write clean, efficient, and well-documented code in Python and JavaScript.
Technical Implementation
- Understand client requirements for ERPNext and suggest optimal technical solutions
- Handle all aspects of development including server-side, API, and client-side logic
- Implement business logic using Frappe's document lifecycle hooks and controllers
- Develop custom web portals, web pages, and web forms
- Ensure smooth transitions for customizations during Frappe/ERPNext upgrades
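As a small illustration of the document lifecycle hooks mentioned above, here is a hypothetical Frappe controller sketch; the DocType, fields, and actions are invented for the example.

```python
# Hypothetical controller for an invented "Rental Invoice" DocType.
import frappe
from frappe.model.document import Document


class RentalInvoice(Document):
    def validate(self):
        # Runs before save via Frappe's document lifecycle.
        if self.amount <= 0:
            frappe.throw("Amount must be positive")

    def on_submit(self):
        # Runs after the document is submitted; e.g., notify the owner.
        frappe.sendmail(
            recipients=[self.owner],
            subject=f"Invoice {self.name} submitted",
            message="Your invoice has been submitted.",
        )
```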
System Management
- Manage ERPNext installations, configurations, and deployments
- Perform system updates, upgrades, and maintenance
- Debug and troubleshoot technical issues, providing timely solutions
- Work with MariaDB/MySQL databases and write complex queries
- Implement and manage version control using Git
Collaboration & Documentation
- Collaborate with business analysts and stakeholders to gather and refine requirements
- Write functional and development specifications
- Participate in code reviews and contribute to development best practices
- Provide technical guidance and support to junior developers
Required Qualifications
Experience
- Minimum 2-4 years of hands-on experience with Frappe Framework and ERPNext development and customizations
- Proven track record of delivering live ERPNext projects that can be showcased
- Experience in customizing ERPNext modules across different business domains
- We are also open to hiring interns (with a PPO opportunity) who demonstrate strong DSA and coding fundamentals, a good understanding of Python programming, knowledge of and exposure to MySQL databases, and strong logical thinking and problem-solving skills, along with an interest in working on the Frappe framework and enthusiasm for building challenging technology solutions for social impact. High-performing interns will receive a Pre-Placement Offer (PPO) based on performance. The internship runs for 3 months with a monthly stipend between 15k and 20k, based on interview performance.
Technical Skills
Core Technologies:
- Strong proficiency in Python programming
- Solid experience with JavaScript, HTML, CSS
- Working knowledge of Jinja templating.
- Experience with jQuery and Bootstrap framework
Frappe/ERPNext Expertise:
- Deep understanding of Frappe Framework architecture.
- Experience with DocType creation, customization, and management.
- Knowledge of Frappe's ORM, REST API capabilities, and hooks system.
- Understanding of ERPNext modules and business workflows
Database & Infrastructure:
- Proficient in MariaDB/MySQL database management.
- Experience with Linux operating systems.
- Knowledge of Git version control.
- Understanding of web server configurations and deployment.
Professional Skills
- Strong analytical and problem-solving abilities
- Excellent communication and collaboration skills
- Ability to work effectively in team environments
- Self-starter with ability to take ownership of projects
- Attention to detail and commitment to quality code
This is a work-from-office role in Gurgaon, Haryana
Job Description:
Exp Range: 5 to 10 years
Required Skills:
- Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
- Must Have – Experience working with ML libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, NumPy, etc.
- Must Have – Experience working with models such as Random Forest, K-means clustering, BERT, etc.
- Should Have – Exposure to querying warehouses and APIs
- Should Have – Experience with writing moderate to complex SQL queries
- Should Have – Experience analyzing and presenting data with BI tools or Excel
- Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
Roles and Responsibilities:
- Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
- Analyze and present insights about the data and processes to Business Stakeholders
- Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
- Develop and deploy customized models on Production data sets to generate analytical insights and predictions
- Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
- Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
- Share knowledge and best practices with broader teams to make everyone aware and more productive.
Qualifications:
- Minimum bachelor's degree in Engineering, Computer Applications, or AI/Data Science
- Experience working in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years.
- Prior experience in Python, NumPy, Scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred
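To make the model expectations concrete, here is a minimal scikit-learn sketch covering two of the named families (Random Forest and K-means) on synthetic data; it illustrates the APIs only and makes no claim about the project's actual models.

```python
# Synthetic data; fit a classifier, score a holdout, then cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("holdout accuracy:", clf.score(X_test, y_test))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
```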
AccioJob is conducting a Walk-In Hiring Drive with Global IT Consulting for the position of Software Engineer.
To apply, register and select your slot here: https://go.acciojob.com/6ED2rL
Required Skills: DSA, SQL, OOPS
Eligibility:
Degree: BTech./BE
Branch: Computer Science/CSE/Other CS related branch, IT
Graduation Year: 2024, 2025
Work Details:
Work Location: Bangalore (Onsite)
CTC: ₹11.1 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
Coding Assignment, Technical Interview 1, Technical Interview 2, Technical Interview 3
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/eapv4u
Proficiency in Java 8+.
Solid understanding of REST APIs (Spring Boot), microservices, databases (SQL/NoSQL), and caching systems like Redis/Aerospike.
Familiarity with cloud platforms (AWS, GCP, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).
Good understanding of data structures, algorithms, and software design principles.
About the company:
At Estuate, more than 400 uniquely talented people work together to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.
Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customer’s needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.
Our key to success:
We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, and have over 2,000 projects in our portfolio.
Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.
Our culture of oneness
At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.
Requirements:
Technical skills
- 8+ years of experience in a Business, System, or Functional Analyst role;
- Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
- Experience working with RESTful APIs (writing requirements, API usage);
- Experience in Microservices architecture;
- Experience working with Agile methodologies (Scrum, Kanban);
- Knowledge of SQL;
- Knowledge of UML, BPMN;
- Understanding of key UX/UI practices and processes;
- Understanding of the software development lifecycle;
- Understanding of the architecture of web-based applications;
- English Upper-Intermediate or higher.
Soft Skills
- Excellent communication and presentation skills;
- Proactiveness;
- Organized and detail-oriented, with the ability to keep the overall solution in mind;
- Comfortable working in a fast-paced environment, running concurrent projects, and managing BA work with multiple stakeholders;
- Good time-management skills and the ability to multitask.
Good to haves
- Experience in enterprise software development or the finance domain;
- Experience delivering desktop and web applications;
- Experience with successful system integration projects.
Responsibilities:
- Participating in discovery phases and workshops with the customer, covering key business and product requirements;
- Managing project scope and requirements, assessing the impact of new requirements on existing ones, and defining dependencies on other teams;
- Creating business requirements, user stories, mockups, functional specifications, and technical requirements (incl. flow diagrams, data mappings, examples);
- Collaborating closely with the development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with the Tech Lead, etc.);
- Communicating regularly with internal (Product, Account Management, Business) and external (Partners, Customers) stakeholders;
- Preparing UAT scenarios and validation cases;
- Conducting User Acceptance Testing;
- Running demos for internal stakeholders;
- Creating documentation (user guides, technical guides, presentations).
Project Description:
Wireless Standard POS (Point-of-Sale) is our retail management solution for the telecom market.
It provides thousands of retailers with the features and functionality they need to run their businesses effectively, with full visibility and control over every aspect of sales and operations. It is simple to learn, easy to use, and as an operation grows, more features can be added.
Our system optimizes and simplifies all retail-related processes in this business area.
Few things that our product can do:
- Robust Online Reporting
- Repair Management Software
- 3rd Party Integrations
- Customer Connect Marketing
- Time and Attendance
- Carrier Commission Reconciliation
As a Business Analyst/System Analyst, you will be the liaison between the lines of business and the development team. You will have the opportunity to work on a highly complex product with a microservices architecture (50+ services at present) and communicate with the Product, QA, Development, Architecture, and Customer Support teams to help improve product quality.

A real-time Customer Data Platform and cross-channel marketing automation solution that delivers superior experiences, resulting in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design and develop backend components and sub-systems for large-scale platforms under guidance from senior engineers.
- Contribute to building and evolving the next-generation customer data platform.
- Write clean, efficient, and well-tested code with a focus on scalability and performance.
- Explore and experiment with modern technologies, especially open-source frameworks, and build small prototypes or proofs of concept.
- Use AI-assisted development tools to accelerate coding, testing, debugging, and learning while adhering to engineering best practices.
- Participate in code reviews, design discussions, and continuous improvement of the platform.
Qualifications:
- 0–2 years of experience (or strong academic/project background) in backend development with Java.
- Good fundamentals in algorithms, data structures, and basic performance optimizations.
- Bachelor’s or Master’s degree in Computer Science or IT (B.E / B.Tech / M.Tech / M.S) from premier institutes.
Technical Skill Set:
- Strong aptitude and analytical skills with emphasis on problem solving and clean coding.
- Working knowledge of SQL and NoSQL databases.
- Familiarity with unit testing frameworks and writing testable code is a plus.
- Basic understanding of distributed systems, messaging, or streaming platforms is a bonus.
AI-Assisted Engineering (LLM-Era Skills):
- Familiarity with modern AI coding tools such as Cursor, Claude Code, Codex, Windsurf, Opencode, or similar.
- Ability to use AI tools for code generation, refactoring, test creation, and learning new systems responsibly.
- Willingness to learn how to combine human judgment with AI assistance for high-quality engineering outcomes.
Soft Skills & Nice to Have
- Appreciation for technology and its ability to create real business value, especially in data and marketing platforms.
- Clear written and verbal communication skills.
- Strong ownership mindset and ability to execute in fast-paced environments.
- Prior internship or startup experience is a plus.

A real-time Customer Data Platform and cross-channel marketing automation solution that delivers superior experiences, resulting in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design, build, and own large-scale, distributed backend and platform systems.
- Drive architectural decisions for high-throughput, low-latency services with strong scalability and reliability guarantees.
- Build and evolve core components of a real-time Customer Data Platform, especially around data ingestion, streaming, and processing.
- Evaluate and adopt open-source and emerging platform technologies; build prototypes where needed.
- Own critical subsystems end-to-end, ensuring performance, maintainability, and operational excellence.
- Mentor junior engineers and uphold high standards through code and design reviews.
- Effectively use modern AI-assisted coding tools to accelerate development while maintaining engineering rigor.
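For a flavor of the streaming-ingestion work the responsibilities above describe, here is a minimal, hedged sketch of a Kafka consumer using the confluent-kafka Python client. The broker address, topic, and consumer group are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of a streaming-ingestion consumer; the "events" topic,
# broker address, and group id are hypothetical.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "cdp-ingestion",            # hypothetical consumer group
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,            # commit only after processing
})
consumer.subscribe(["events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # ... transform/route the event into the platform here ...
        consumer.commit(message=msg)  # at-least-once: commit after success
finally:
    consumer.close()
```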
Qualifications:
- 4–6 years of strong backend/platform engineering experience with solid fundamentals in algorithms, data structures, and optimizations.
- Proven experience designing and operating production-grade distributed systems.
- B.E / B.Tech / M.Tech / M.S / MCA in Computer Science or equivalent from premier institutes.
Technical Skills:
- Strong system and object-oriented design skills.
- Hands-on experience with SQL and NoSQL databases.
- Strong working knowledge of Kafka and streaming systems.
- Proficiency in Java, concurrency, and unit/integration testing.
- Familiarity with cloud-native environments (Kubernetes, CI/CD, observability).
AI-Assisted Engineering:
- Hands-on experience using modern AI coding platforms such as Opencode, Claude Code, Codex, Cursor, Windsurf, or similar.
- Ability to use AI tools for code generation, refactoring, testing, debugging, and design exploration responsibly.
Soft Skills & Nice to Have:
- Strong ownership mindset and ability to deliver in fast-paced environments.
- Clear written and verbal communication skills.
- Startup experience is a plus.
Key Responsibilities
- Architectural Leadership: Design end-to-end agentic frameworks using UiPath Agent Builder and Studio Web, moving processes away from rigid "if-then" logic to goal-oriented AI agents.
- Agentic UI Integration: Lead the implementation of UiPath Autopilot and Agentic Orchestration to handle dynamic UI changes, unstructured data, and complex human-in-the-loop escalations.
- Advanced AI Implementation: Deploy and fine-tune models within the UiPath AI Trust Layer, ensuring secure and governed use of LLMs (GPT-4, Claude, etc.) for real-time UI reasoning.
- Infrastructure & Governance: Define the "Agentic Operating Model," establishing guardrails for autonomous agent behavior, security protocols, and scalability within UiPath Orchestrator.
- Stakeholder Strategy: Partner with C-suite stakeholders to identify "Agent-First" opportunities that provide 10x ROI over traditional RPA.
- Mentorship: Lead a CoE (Center of Excellence), upskilling senior developers in prompt engineering, context grounding, and semantic automation.
Required Technical Skills
- Core Platform: Expert-level mastery of UiPath Studio, Orchestrator, and Cloud.
- Agentic Specialization: Proven experience with UiPath Agent Builder, Integration Service, and Document Understanding.
- AI/ML Integration: Deep understanding of AI Center, Vector Databases, and RAG (Retrieval-Augmented Generation) to provide agents with business context.
- Programming: Proficiency in .NET (C#) and Python for custom activity development and AI model interfacing.
- UI Automation: Expert knowledge of modern UI descriptors, Computer Vision, and handling "tricky" environments (Citrix, legacy SAP, mainframe).
Qualifications
- Experience: 10+ years in Software Development/Automation, with at least 5 years specifically in UiPath Architecture.
- Education: Bachelor’s or Master’s in Computer Science, AI, or a related field.
- Certifications: UiPath Advanced Developer (ARD) and UiPath Solution Architect certifications are mandatory. Certifications in AI/ML (Azure AI, AWS Machine Learning) are a significant plus.
- Mindset: A "fail-forward" approach to innovation, with the ability to prototype agentic solutions in fast-paced environments.
We’re hiring a remote, contract-based Backend & Infrastructure Engineer who can build and run production systems end-to-end.
You will build and scale high-throughput backend services in Golang and Python, operate ClickHouse-powered analytics at scale, manage Linux servers for maximum uptime, scalability, and reliability, and drive cost efficiency as a core engineering discipline across the entire stack.
What You Will Do:
Backend Development (Golang & Python)
- Design and maintain high-throughput RESTful/gRPC APIs — primarily Golang, Python for tooling and supporting services
- Architect for horizontal scalability, fault tolerance, and low-latency at scale
- Implement caching (Redis/Memcached), rate limiting, efficient serialization, and CI/CD pipelines
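As an illustration of the caching and rate-limiting work this section mentions, here is a minimal sketch using redis-py. The posting's primary language is Golang; Python is shown since the role also uses it for tooling, and the key names, TTLs, and limits are assumptions.

```python
# Read-through caching and fixed-window rate limiting with redis-py;
# key names, TTLs, and the stub fetch are illustrative assumptions.
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def get_user(user_id: int) -> dict:
    """Read-through cache: serve from Redis, fall back to the database."""
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    user = {"id": user_id, "name": "stub"}  # stand-in for a real DB lookup
    r.setex(key, 300, json.dumps(user))     # cache for 5 minutes
    return user

def allow_request(client_id: str, limit: int = 100, window_s: int = 60) -> bool:
    """Fixed-window rate limiter: at most `limit` requests per window."""
    key = f"rate:{client_id}"
    count = r.incr(key)
    if count == 1:
        r.expire(key, window_s)  # start the window on the first request
    return count <= limit
```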
Scalable Architecture & System Design
- Design and evolve distributed, resilient backend architecture that scales without proportional cost increase
- Make deliberate trade-offs (CAP, cost vs. performance) and design multi-region HA with automated failover
ClickHouse & Analytical Data Infrastructure
- Deploy, tune, and operate ClickHouse clusters for real-time analytics and high-cardinality OLAP workloads
- Design optimal table engines, partition strategies, materialized views, and query patterns
- Manage cluster scaling, replication, schema migrations, and upstream/downstream integrations
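To make the table-engine, partitioning, and materialized-view choices above concrete, here is a hedged sketch using the clickhouse-driver Python client; the host, table, and column names are hypothetical.

```python
# Sketch of a partitioned MergeTree table plus a pre-aggregating
# materialized view; all names are hypothetical.
from clickhouse_driver import Client

client = Client(host="localhost")  # assumed ClickHouse host

client.execute("""
    CREATE TABLE IF NOT EXISTS events (
        ts      DateTime,
        user_id UInt64,
        action  LowCardinality(String)
    )
    ENGINE = MergeTree
    PARTITION BY toYYYYMM(ts)      -- one partition per month
    ORDER BY (user_id, ts)         -- sort key drives the primary index
""")

# Materialized view that pre-aggregates daily counts at insert time, so
# high-cardinality OLAP dashboards read a small summing table instead.
client.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS daily_actions
    ENGINE = SummingMergeTree
    ORDER BY (day, action)
    AS SELECT toDate(ts) AS day, action, count() AS cnt
    FROM events
    GROUP BY day, action
""")
```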
Cost Efficiency & Cost Optimization
- Own cost optimization end-to-end: right-sizing, reserved/spot capacity, storage tiering, query optimization, compression, batching
- Build cost dashboards, budgets, and alerts; drive a culture of cost-aware engineering
Linux Server Management & Infrastructure
- Administer and harden Linux servers (Ubuntu, Debian, CentOS/RHEL) — patching, security, SSH, firewalls
- Manage VPS/bare-metal provisioning, capacity planning, and containerized workloads (Docker, Kubernetes/Nomad)
- Implement Infrastructure-as-Code (Terraform/Pulumi); optionally manage AWS/GCP as needed
Data, Storage & Scheduling
- Optimize SQL schemas and queries (PostgreSQL, MySQL); manage data archival, cold storage, and lifecycle policies
- Build and maintain cron jobs, scheduled tasks, and batch processing systems
Uptime, Reliability & Observability
- Own system uptime: zero-downtime deployments, health checks, self-healing infra, SLOs/SLIs
- Build observability stacks (Prometheus, Grafana, Datadog, OpenTelemetry); structured logging, distributed tracing, alerting
- Drive incident response, root cause analysis, and post-mortems
Required Qualifications:
Must-Have (Critical)
- Deep proficiency in Golang (primary) and Python
- Proven ability to design and build scalable, distributed architectures
- Production experience deploying and operating ClickHouse at scale
- Track record of driving measurable cost efficiency and cost optimization
- 5+ years in backend engineering and infrastructure roles
Also Required
- Strong Linux server administration (Ubuntu, Debian, CentOS/RHEL) — comfortable living in the terminal
- Proven uptime and reliability track record across production infrastructure
- Strong SQL (PostgreSQL, MySQL); experience with high-throughput APIs (10K+ RPS)
- VPS/bare-metal provisioning, Docker, Kubernetes/Nomad, IaC (Terraform/Pulumi)
- Observability tooling (Prometheus, Grafana, Datadog, OpenTelemetry)
- Cron jobs, batch processing, data archival, cold storage management
- Networking fundamentals (DNS, TCP/IP, load balancing, TLS)
Nice to Have
- AWS, GCP, or other major cloud provider experience
- Message queues / event streaming (Kafka, RabbitMQ, SQS/SNS)
- Data pipelines (Airflow, dbt); FinOps practices
- Open-source contributions; compliance background (SOC 2, HIPAA, GDPR)
What We Offer
- Remote, contractual role
- Flexible time zones (overlap for standups + incident coverage)
- Competitive contract compensation + equity
- Long-term engagement opportunity based on performance
Microsoft Fabric, Power BI, Data modelling, ETL, Spark SQL
Remote work: 5–7 hours
Hourly rate: ₹450
What You’ll Do:
We are looking for a Staff Operations Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.
This role will be in the Engineering Operations team and will require integration and partnership with the Engineering organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and will constantly seek to improve the facets of the business they manage. They will also need to demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams.
- Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
- Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
- Build “mastered” versions of the data for Analytics-specific querying use cases.
- Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent.
- Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
- Implement DataOps practices.
- Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
- Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
- Operate between Engineers and Analysts to unify both practices for analytics insight creation.
Who You Are:
- 8+ years of experience in tech support, specializing in monitoring and maintaining data pipelines.
- Adept in market research methodologies and using data to deliver representative insights.
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
- Deep SQL experience is a must.
- Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
- English Language Fluency and proven success working with teams in the U.S.
- Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
- Experience working with public clouds like GCP/AWS.
- Good understanding of software engineering, DataOps, data architecture, and Agile and DevOps methodologies.
- Proficient with SQL, Python or a JVM-based language, and Bash.
- Experience with Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
- Experience in debugging UI and backend issues is an added advantage.
Hiring : Azure Data Engineer
Experience level: 5–12 years
Location : Bangalore
Work arrangement : On-site
Budget Range: Flexible
Mandatory Skills (self-rating of 7+ required):
ADF, Databricks, PySpark, SQL
Good to have:
Delta Live Tables, Python, team handling/management experience (7+ years), Azure Functions, Unity Catalog, real-time streaming, data pipelines
The Role
We are looking for a Senior/Lead Azure Data Engineer to join our team in Pune. You will be responsible for the end-to-end lifecycle of data solutions, from initial client requirement gathering and solution architecture design to leading the data engineering team through implementation. You will be the technical anchor for the project, ensuring that our data estates are scalable, governed, and high-performing.
Key Responsibilities
- Architecture & Design: Design robust data architectures using Microsoft Fabric and Azure Synapse, focusing on Medallion architecture and metadata-driven frameworks.
- End-to-End Delivery: Translate complex client business requirements into technical roadmaps and lead the team to deliver them on time.
- Data Governance: Implement and manage enterprise-grade governance, data discovery, and lineage using Microsoft Purview.
- Team Leadership: Act as the technical lead for the team, performing code reviews, mentoring junior engineers, and ensuring best practices in PySpark and SQL.
- Client Management: Interface directly with stakeholders to define project scope and provide technical consultancy.
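As a hedged illustration of the Medallion-architecture work described above, here is a minimal bronze-to-silver hop in PySpark. The paths and columns are hypothetical, and Delta format availability is assumed (Databricks and Fabric runtimes ship it by default).

```python
# Minimal sketch of one bronze -> silver hop in a Medallion layout;
# paths, columns, and Delta availability are workspace assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

bronze = spark.read.format("delta").load("/lake/bronze/orders")  # raw landing data

silver = (
    bronze
    .dropDuplicates(["order_id"])                     # de-dupe on the business key
    .filter(F.col("order_ts").isNotNull())            # drop unusable rows
    .withColumn("order_date", F.to_date("order_ts"))  # conformed column
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/lake/silver/orders"))
```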
What We’re Looking For
- 6+ Years in Data Engineering with at least 3+ years leading technical teams or designing architectures.
- Expertise in Microsoft Fabric/Synapse: Deep experience with Lakehouses, Warehouses, and Spark-based processing.
- Governance Specialist: Proven experience implementing Microsoft Purview for data cataloging, sensitivity labeling, and lineage.
- Technical Breadth: Strong proficiency in PySpark, SQL, and Data Factory. Familiarity with Infrastructure as Code (Bicep/Terraform) is a major plus.
Why Work with Us?
- Competitive Pay
- Flexible Hours
- Work on Microsoft’s latest (Fabric, Purview, Foundry) as a Designated Solutions Partner.
- High-Stakes Impact: Solve complex, client-facing problems for enterprise leaders
- Structured learning paths to help you master AI automation and Agentic AI.

Global digital transformation solutions provider.
JOB DETAILS:
Job Role: Lead I - Software Engineering - Java, Spring Boot, Microservices
Industry: Global digital transformation solutions provider
Work Mode: 3 days in office, Hybrid model.
Salary: Best in Industry
Experience: 5-7 years
Location: Trivandrum, Kochi, Thiruvananthapuram
Job Description
Job Title: Senior Java Developer
Experience: 5+ years
Job Summary: We are looking for a Senior Java Developer with strong experience in Spring Boot and Microservices to work on high-performance applications for a leading financial services client. The ideal candidate will have deep expertise in Java backend development, cloud (preferably GCP), and strong problem-solving abilities.
Key Responsibilities:
• Develop and maintain Java-based microservices using Spring Boot
• Collaborate with Product Owners and teams to gather and review requirements
• Participate in design reviews, code reviews, and unit testing
• Ensure application performance, scalability, and security
• Contribute to solution architecture and design documentation
• Support Agile development processes including daily stand-ups and sprint planning
• Mentor junior developers and lead small modules or features
Required Skills:
• Java, Spring Boot, Microservices architecture
• GCP (or other cloud platforms like AWS)
• REST/SOAP APIs, Hibernate, SQL, Tomcat
• CI/CD tools: Jenkins, Bitbucket
• Agile methodologies (Scrum/Kanban)
• Unit testing (JUnit), debugging and troubleshooting
• Good communication and team leadership skills
Preferred Skills:
• Frontend familiarity (Angular, AJAX)
• Experience with API documentation tools (Swagger)
• Understanding of design patterns and UML
• Exposure to Confluence, Jira
Must-Haves
Java/J2EE (5+ years), Spring/Spring Boot (5+ years), Microservices (5+ years), AWS/GCP/Azure (mandatory), CI/CD (Jenkins, SonarQube, Git)
Mandatory Skills Required: Strong proficiency in Java, Spring Boot, Microservices, GCP/AWS.
Experience Required: Minimum 5+ years of relevant experience
Notice period: 0 to 15 days only (immediate joiners or candidates serving notice who can join by February)
Job stability is mandatory
Location: Trivandrum, Kochi
Virtual Interview: Saturday, 31st January
Job Description:
Summary
The Data Engineer will be responsible for designing, developing, and maintaining the data infrastructure. They must have experience with SQL and Python.
Roles & Responsibilities:
● Collaborate with product, business, and engineering stakeholders to understand key metrics, data needs, and reporting pain points.
● Design, build, and maintain clean, scalable, and reliable data models using DBT.
● Write performant SQL and Python code to transform raw data into structured marts and reporting layers.
● Create dashboards using Tableau or similar tools.
● Work closely with data platform engineers, architects, and analysts to ensure data pipelines are resilient, well-governed, and high quality.
● Define and maintain source-of-truth metrics and documentation in the analytics layer.
● Partner with product engineering teams to understand new features and ensure appropriate instrumentation and event collection.
● Drive reporting outcomes by building dashboards or working with BI teams to ensure timely delivery of insights.
● Help scale our analytics engineering practice by contributing to internal tooling, frameworks, and best practices.
Who You Are:
Experience: 3–4 years in analytics/data engineering, with strong hands-on expertise in DBT, SQL, Python, and dashboarding tools.
● Experience working with modern data stacks (e.g., Snowflake, BigQuery, Redshift, Airflow).
● Strong data modeling skills (dimensional, star/snowflake schema, data vault, etc.).
● Excellent communication and stakeholder management skills.
● Ability to work independently and drive business outcomes through data.
● Exposure to product instrumentation and working with event-driven data is a plus.
● Prior experience in a fast-paced, product-led company is preferred.
Deliver engaging classroom and/or online training sessions on topics including:
Python for Data Science
Data Analytics using Excel and SQL
Statistics and Probability
Machine Learning and Deep Learning
Data Visualization using Power BI / Tableau
Create and update course materials, projects, assignments, and quizzes.
Provide hands-on training and real-world project guidance.
Evaluate student performance, provide constructive feedback, and track progress.
Stay updated with the latest trends, tools, and technologies in Data Science.
Mentor students during capstone projects and industry case studies.
Coordinate with the academic and operations team for batch planning and feedback.
Assist with the development of new courses and curriculum as needed.
We are seeking a Data Engineer with 3–4 years of relevant experience to join our team. The ideal candidate should have strong expertise in Python and SQL and be available to join immediately.
Location: Bangalore
Experience: 3–4 Years
Joining: Immediate Joiner preferred
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and data models
- Extract, transform, and load (ETL) data from multiple sources
- Write efficient and optimized SQL queries for data analysis and reporting
- Develop data processing scripts and automation using Python
- Ensure data quality, integrity, and performance across systems
- Collaborate with cross-functional teams to support business and analytics needs
- Troubleshoot data-related issues and optimize existing processes
Required Skills & Qualifications:
- 3–4 years of hands-on experience as a Data Engineer or similar role
- Strong proficiency in Python and SQL
- Experience working with relational databases and large datasets
- Good understanding of data warehousing and ETL concepts
- Strong analytical and problem-solving skills
- Ability to work independently and in a team-oriented environment
Preferred:
- Experience with cloud platforms or data tools (added advantage)
- Exposure to performance tuning and data optimization
We are Hiring ASP.NET MVC/Core Developers
Click Here to Apply : https://prishusoft.com/jobs/junior-aspnet-mvccore-professional
Experience Level
- 1–2 years of professional experience in web application development using ASP.NET MVC and ASP.NET Core.
Key Responsibilities
- Develop, maintain, and enhance web applications using ASP.NET MVC and ASP.NET Core.
- Write clean, scalable, and maintainable code following best practices.
- Design, develop, and integrate RESTful APIs with ASP.NET Web API.
- Collaborate with front-end developers and UI/UX designers to deliver exceptional user experiences.
- Work with MSSQL databases, including writing complex T-SQL queries, stored procedures, and optimizing performance.
- Participate in code reviews and contribute to technical discussions, architecture decisions, and performance improvements.
Technical Skills & Expertise
- Strong proficiency in ASP.NET MVC with at least 1 year of project experience.
- Good working knowledge of ASP.NET Core for modern application development.
- Solid skills in C#, JavaScript, and HTML.
- Experience with .NET Framework 4.5+.
- Hands-on experience with ASP.NET Web API development and consumption.
- Expertise in MSSQL (T-SQL, indexing, performance tuning).
Soft Skills
- Strong verbal and written communication skills.
- Collaborative team player with a willingness to share knowledge and contribute to team success.
Preferred / Bonus Skills
- Experience with Angular, React, or Vue.js for dynamic front-end development.
- Exposure to unit testing frameworks (e.g., Jasmine, Karma) for front-end applications.
- Understanding of DevOps practices and CI/CD pipelines.
- Familiarity with TypeScript for scalable JavaScript development.
Work Mode: Full-time On-site / Hybrid (Ahmedabad)
About the Role:
We are looking for a highly skilled Data Engineer with a strong foundation in Power BI, SQL, Python, and Big Data ecosystems to help design, build, and optimize end-to-end data solutions. The ideal candidate is passionate about solving complex data problems, transforming raw data into actionable insights, and contributing to data-driven decision-making across the organization.
Key Responsibilities:
Data Modelling & Visualization
- Build scalable and high-quality data models in Power BI using best practices.
- Define relationships, hierarchies, and measures to support effective storytelling.
- Ensure dashboards meet standards for accuracy, visualization principles, and timeliness.
Data Transformation & ETL
- Perform advanced data transformation using Power Query (M Language) beyond UI-based steps.
- Design and optimize ETL pipelines using SQL, Python, and Big Data tools.
- Manage and process large-scale datasets from various sources and formats.
Business Problem Translation
- Collaborate with cross-functional teams to translate complex business problems into scalable, data-centric solutions.
- Decompose business questions into testable hypotheses and identify relevant datasets for validation.
Performance & Troubleshooting
- Continuously optimize performance of dashboards and pipelines for latency, reliability, and scalability.
- Troubleshoot and resolve issues related to data access, quality, security, and latency, adhering to SLAs.
Analytical Storytelling
- Apply analytical thinking to design insightful dashboards—prioritizing clarity and usability over aesthetics.
- Develop data narratives that drive business impact.
Solution Design
- Deliver wireframes, POCs, and final solutions aligned with business requirements and technical feasibility.
Required Skills & Experience:
- Minimum 3+ years of experience as a Data Engineer or in a similar data-focused role.
- Strong expertise in Power BI: data modeling, DAX, Power Query (M Language), and visualization best practices.
- Hands-on with Python and SQL for data analysis, automation, and backend data transformation.
- Deep understanding of data storytelling, visual best practices, and dashboard performance tuning.
- Familiarity with DAX Studio and Tabular Editor.
- Experience in handling high-volume data in production environments.
Preferred (Good to Have):
- Exposure to Big Data technologies such as:
- PySpark
- Hadoop
- Hive / HDFS
- Spark Streaming (optional but preferred)
Why Join Us?
- Work with a team that's passionate about data innovation.
- Exposure to modern data stack and tools.
- Flat structure and collaborative culture.
- Opportunity to influence data strategy and architecture decisions.
Database Programmer (SQL & Python)
Experience: 4 – 5 Years
Location: Remote
Employment Type: Full-Time
About the Opportunity
We are a mission-driven HealthTech organization dedicated to bridging the gap in global healthcare equity. By harnessing the power of AI-driven clinical insights and real-world evidence, we help healthcare providers and pharmaceutical companies deliver precision medicine to underrepresented populations.
We are looking for a skilled Database Programmer with a strong blend of SQL expertise and Python automation skills to help us manage, transform, and unlock the value of complex clinical data. This is a fully remote role where your work will directly contribute to improving patient outcomes and making life-saving treatments more affordable and accessible.
Key Responsibilities
- Data Architecture & Management: Design, develop, and maintain robust relational databases to store large-scale, longitudinal patient records and clinical data.
- Complex Querying: Write and optimize sophisticated SQL queries, stored procedures, and triggers to handle deep clinical datasets, ensuring high performance and data integrity.
- Python Automation: Develop Python scripts and ETL pipelines to automate data ingestion, cleaning, and transformation from diverse sources (EHRs, lab reports, and unstructured clinical notes).
- AI Support: Collaborate with Data Scientists to prepare datasets for AI-based analytics, Knowledge Graphs, and predictive modeling.
- Data Standardization: Map and transform clinical data into standardized models (such as HL7, FHIR, or proprietary formats) to ensure interoperability across healthcare ecosystems.
- Security & Compliance: Implement and maintain rigorous data security protocols, ensuring all database activities comply with global healthcare regulations (e.g., HIPAA, GDPR).
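As a hedged sketch of the ingest-clean-load step the Python Automation responsibility describes, here is a minimal pipeline using pandas and SQLAlchemy. The DSN, file, table, and column names are hypothetical, and a real clinical pipeline would add validation, auditing, and PHI safeguards.

```python
# Minimal ingest -> clean -> load sketch; all names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://etl@localhost/clinical")  # assumed DSN

# Extract: a lab-report export with inconsistent headers and bad rows.
df = pd.read_csv("lab_results.csv")

# Transform: normalize column names, coerce types, drop unusable rows.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df["collected_at"] = pd.to_datetime(df["collected_at"], errors="coerce")
df = df.dropna(subset=["patient_id", "collected_at"])

# Load: append into a staging table for downstream standardization
# (e.g., mapping to HL7/FHIR models).
df.to_sql("stg_lab_results", engine, if_exists="append", index=False)
```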
Required Skills & Qualifications
- Education: Bachelor’s degree in Computer Science, Information Technology, Statistics, or a related field.
- SQL Mastery: 4+ years of experience with relational databases (PostgreSQL, MySQL, or MS SQL Server). You should be comfortable with performance tuning and complex data modeling.
- Python Proficiency: Strong programming skills in Python, particularly for data manipulation (Pandas, NumPy) and database interaction (SQLAlchemy, Psycopg2).
- Healthcare Experience: Familiarity with healthcare data standards (HL7, FHIR) or experience working with Electronic Health Records (EHR) is highly preferred.
- ETL Expertise: Proven track record of building and managing end-to-end data pipelines for structured and unstructured data.
- Analytical Mindset: Ability to troubleshoot complex data issues and translate business requirements into efficient technical solutions.
To process your details, please fill out the Google form.
About Company (GeniWay)
GeniWay Technologies is pioneering India’s first AI-native platform for personalized learning and career guidance, transforming the way students learn, grow, and determine their future path. Addressing challenges in the K-12 system such as one-size-fits-all teaching and limited career awareness, GeniWay leverages cutting-edge AI to create a tailored educational experience for every student. The core technology includes an AI-powered learning engine, a 24x7 multilingual virtual tutor and Clario, a psychometrics-backed career guidance system. Aligned with NEP 2020 policies, GeniWay is on a mission to make high-quality learning accessible to every student in India, regardless of their background or region.
What you’ll do
- Build the career assessment backbone: attempt lifecycle (create/resume/submit), timing metadata, partial attempts, idempotent APIs.
- Implement deterministic scoring pipelines with versioning and audit trails (what changed, when, why).
- Own Postgres data modeling: schemas, constraints, migrations, indexes, query performance.
- Create safe, structured GenAI context payloads (controlled vocabulary, safety flags, eval datasets) to power parent/student narratives.
- Raise reliability: tests for edge cases, monitoring, reprocessing/recalculation jobs, safe logging (no PII leakage).
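To illustrate the idempotent-API point above, here is a minimal, hedged sketch of a "create attempt" endpoint in FastAPI. The in-memory dict stands in for a Postgres table with a unique constraint on the idempotency key, and all route and field names are hypothetical.

```python
# Idempotent "create attempt" endpoint sketch; retried requests with the
# same Idempotency-Key return the original attempt instead of a duplicate.
import uuid

from fastapi import FastAPI, Header
from pydantic import BaseModel

app = FastAPI()
_attempts_by_key: dict[str, dict] = {}  # stand-in for a UNIQUE-keyed table

class AttemptIn(BaseModel):
    student_id: str
    assessment_id: str

@app.post("/attempts")
def create_attempt(body: AttemptIn, idempotency_key: str = Header(...)):
    # A flaky network can retry safely: the same key maps to the same attempt.
    if idempotency_key in _attempts_by_key:
        return _attempts_by_key[idempotency_key]
    attempt = {
        "attempt_id": str(uuid.uuid4()),
        "student_id": body.student_id,
        "assessment_id": body.assessment_id,
        "status": "in_progress",
    }
    _attempts_by_key[idempotency_key] = attempt
    return attempt
```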
Must-have skills
- Backend development in Python (FastAPI/Django/Flask) or Node (NestJS) with production API experience.
- Strong SQL + PostgreSQL fundamentals (transactions, indexes, schema design, migrations).
- Testing discipline: unit + integration tests for logic-heavy code; systematic debugging approach.
- Comfort using AI coding copilots to speed up scaffolding/tests/refactors — while validating correctness.
- Ownership mindset: cares about correctness, data integrity, and reliability.
Good to have
- Experience with rule engines, scoring systems, or audit-heavy domains (fintech, healthcare, compliance).
- Event schemas/telemetry pipelines and observability basics.
- Exposure to RAG/embeddings/vector DBs or prompt evaluation harnesses.
Location: Pune (on-site for first 3 months; hybrid/WFH flexibility thereafter)
Employment Type: Full-time
Experience: 2–3 years (correctness-first; strong learning velocity)
Compensation: Competitive (₹8–10 LPA fixed cash) + ESOP (equity ownership, founding-early employee level)
Joining Timeline: 2–3 weeks / Immediate
Why join (founding team)
- You’ll build core IP: scoring integrity and data foundations that everything else depends on.
- Rare skill-building: reliable systems + GenAI-safe context/evals (not just API calls).
- Meaningful ESOP upside at an early stage.
- High trust, high ownership, fast learning.
- High-impact mission: reduce confusion and conflict in student career decisions; help families make better choices, transform student lives by making great learning personal.
Hiring process (fast)
1. 20-min intro call (fit + expectations).
2. 45–60 min SQL & data modeling, API deep dive.
3. Practical exercise (2–3 hours max) implementing a small scoring service with tests.
4. Final conversation + offer.
How to apply
Reply with your resume/LinkedIn profile plus one example of a system/feature where you owned data modeling and backend integration (a short paragraph is fine).
Required Skills and Qualifications:
- 2–3 years of professional experience in Python development.
- Strong understanding of object-oriented programming.
- Experience with frameworks such as Django, Flask, or FastAPI.
- Knowledge of REST APIs, JSON, and web integration.
- Familiarity with SQL and database management systems.
- Experience with Git or other version control tools.
- Good problem-solving and debugging skills.
- Strong communication and teamwork abilities.
Position: Insights Manager
Location: Gurugram (Onsite)
Experience Required: 4+ Years
Working Days: 5 Days (Mon to Fri)
About the Role
We are seeking a hands-on Insights Manager to build the analytical backbone that powers decision-making. This role sits at the centre of the data ecosystem, partnering with Category, Commercial, Marketing, Sourcing, Fulfilment, Product, and Growth teams to translate data into insight, automation, and action.
You will design self-running reporting systems, maintain data quality in collaboration with data engineering, and build analytical models that directly improve pricing, customer experience, and operational efficiency. The role requires strong e-commerce domain understanding and the ability to move from data to decisions with speed and precision.
Key Responsibilities
1. Data Platform & Governance
- Partner with data engineering to ensure clean and reliable data across Shopify, GA4, Ad platforms, CRM, and ERP systems
- Define and maintain KPI frameworks (ATC, CVR, AOV, Repeat Rate, Refunds, LTV, CAC, etc.)
- Oversee pipeline monitoring, QA checks, and metric documentation
2. Reporting, Dashboards & Automation
- Build automated datamarts and dashboards for business teams
- Integrate APIs and automate data flows across multiple sources
- Create actionable visual stories and executive summaries
- Use AI and automation tools to improve insight delivery speed
3. Decision Models & Applied Analytics
- Build models for pricing, discounting, customer segmentation, inventory planning, delivery SLAs, and recommendations
- Translate analytics outputs into actionable playbooks for internal teams
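For one of the decision models named above (customer segmentation), here is a hedged RFM sketch in pandas; the orders frame and its columns are invented for illustration.

```python
# Toy RFM (recency / frequency / monetary) segmentation in pandas.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-03-01", "2024-02-10",
         "2024-01-20", "2024-02-25", "2024-03-10"]),
    "amount": [500, 700, 1200, 300, 450, 600],
})

now = orders["order_date"].max()
rfm = orders.groupby("customer_id").agg(
    recency=("order_date", lambda d: (now - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)
# Score each dimension into terciles and combine into a simple segment score.
rfm["r"] = pd.qcut(rfm["recency"], 3, labels=[3, 2, 1]).astype(int)  # recent = high
rfm["f"] = pd.qcut(rfm["frequency"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
rfm["m"] = pd.qcut(rfm["monetary"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
rfm["segment"] = rfm[["r", "f", "m"]].sum(axis=1)
print(rfm.sort_values("segment", ascending=False))
```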
4. Insights & Actionability
- Diagnose performance shifts and identify root causes
- Deliver weekly and monthly insight-driven recommendations
- Improve decision-making speed and quality across functions
Qualifications & Experience
- 4–7 years of experience in analytics or product insights (e-commerce / D2C / retail)
- Strong SQL and Python skills
- Hands-on experience with GA4, GTM, and dashboarding tools (Looker / Tableau / Power BI)
- Familiarity with CRM platforms like Klaviyo, WebEngage, or MoEngage
- Strong understanding of e-commerce KPIs and customer metrics
- Ability to communicate insights clearly to non-technical stakeholders
What We Offer
- Greenfield opportunity to build the data & insights platform from scratch
- High business impact across multiple functions
- End-to-end exposure from analytics to automation and applied modelling
- Fast-paced, transparent, and collaborative work culture
Company Description
NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.
Role Description
This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.
Responsibilities:
● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications
● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions
● Code Reviews: Participate in code reviews to maintain high-quality standards
● Troubleshooting: Debug and resolve application issues in a timely manner
● Testing: Develop and execute unit and integration tests to ensure software reliability
● Optimize: Identify and address performance bottlenecks to enhance application performance
Qualifications & Skills:
● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA
● Familiarity with RESTful APIs and web services
● Proficiency in working with relational databases like MySQL or PostgreSQL
● Practical experience with AWS cloud services and building scalable, microservices-based architectures
● Experience with build tools like Maven or Gradle
● Understanding of version control systems, especially Git
● Strong understanding of object-oriented programming principles and design patterns
● Familiarity with automated testing frameworks and methodologies
● Excellent problem-solving skills and attention to detail
● Strong communication skills and ability to work effectively in a collaborative team environment
Why Join Us?
● Opportunity to work on cutting-edge technology products
● A collaborative and learning-driven environment
● Exposure to AI and software engineering innovations
● Excellent work ethic and culture
If you're passionate about technology and want to work on impactful projects, we'd love to hear from you
Job Summary
We are looking for an experienced Python DBA with strong expertise in Python scripting and SQL/NoSQL databases. The candidate will be responsible for database administration, automation, performance optimization, and ensuring availability and reliability of database systems.
Key Responsibilities
- Administer and maintain SQL and NoSQL databases
- Develop Python scripts for database automation and monitoring
- Perform database performance tuning and query optimization
- Manage backups, recovery, replication, and high availability
- Ensure data security, integrity, and compliance
- Troubleshoot and resolve database-related issues
- Collaborate with development and infrastructure teams
- Monitor database health and performance
- Maintain documentation and best practices
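As a sketch of the kind of automation and monitoring script the responsibilities above describe, here is a minimal PostgreSQL health check using psycopg2 that flags long-running queries via pg_stat_activity. The DSN and the 5-minute threshold are assumptions.

```python
# Flag long-running PostgreSQL queries; DSN and threshold are assumed.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba host=localhost")
try:
    with conn.cursor() as cur:
        cur.execute("""
            SELECT pid, state, now() - query_start AS runtime, query
            FROM pg_stat_activity
            WHERE state <> 'idle'
              AND now() - query_start > interval '5 minutes'
            ORDER BY runtime DESC
        """)
        for pid, state, runtime, query in cur.fetchall():
            # A real setup would page an on-call channel instead of printing.
            print(f"pid={pid} state={state} runtime={runtime} query={query[:80]}")
finally:
    conn.close()
```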
Required Skills
- 10+ years of experience in Database Administration
- Strong proficiency in Python
- Experience with SQL databases (PostgreSQL, MySQL, Oracle, SQL Server)
- Experience with NoSQL databases (MongoDB, Cassandra, etc.)
- Strong understanding of indexing, schema design, and performance tuning
- Good analytical and problem-solving skills
Key Responsibilities
- Develop and maintain applications using Java 8/11/17, Spring Boot, and REST APIs.
- Design and implement microservices and backend components.
- Work with SQL/NoSQL databases, API integrations, and performance optimization.
- Collaborate with cross-functional teams and participate in code reviews.
- Deploy applications using CI/CD, Docker, Kubernetes, and cloud platforms (AWS/Azure/GCP).
Skills Required
- Strong in Core Java, OOPS, multithreading, collections.
- Hands-on with Spring Boot, Hibernate/JPA, Microservices.
- Experience with REST APIs, Git, and CI/CD pipelines.
- Knowledge of Docker/Kubernetes and cloud basics.
- Good understanding of database queries and performance tuning.
Nice to Have:
- Experience with messaging systems (Kafka/RabbitMQ).
- Basic frontend understanding (React/Angular).
Must have strong SQL skills (queries, optimization, stored procedures, triggers) and hands-on experience automating work through SQL.
Looking for candidates with 2+ years of experience who have worked on large datasets (1 crore records or more).
Should be comfortable handling the challenges that come with datasets of that scale.
Must have Advanced Excel skills
Should have 3+ years of relevant experience
Should have Reporting + dashboard creation experience
Should have Database development & maintenance experience
Must have Strong communication for client interactions
Should have Ability to work independently
Willingness to work from client locations
Forbes Advisor is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.
We do this by combining data-driven content, rigorous product comparisons, and user-first design, all built on top of a modern, scalable platform. Our teams operate globally and bring deep expertise across journalism, product, performance marketing, and analytics.
The Role
We are hiring a Senior Data Engineer to help design and scale the infrastructure behind our analytics, performance marketing, and experimentation platforms.
This role is ideal for someone who thrives on solving complex data problems, enjoys owning systems end-to-end, and wants to work closely with stakeholders across product, marketing, and analytics.
You’ll build reliable, scalable pipelines and models that support decision-making and automation at every level of the business.
What you’ll do
● Build, maintain, and optimize data pipelines using Spark, Kafka, Airflow, and Python
● Orchestrate workflows across GCP (GCS, BigQuery, Composer) and AWS-based systems
● Model data using dbt, with an emphasis on quality, reuse, and documentation
● Ingest, clean, and normalize data from third-party sources such as Google Ads, Meta, Taboola, Outbrain, and Google Analytics
● Write high-performance SQL and support analytics and reporting teams in self-serve data access
● Monitor and improve data quality, lineage, and governance across critical workflows
● Collaborate with engineers, analysts, and business partners across the US, UK, and India
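Since the role leans heavily on Airflow orchestration, here is a minimal, hedged DAG sketch (Airflow 2.4+ style); the DAG id, schedule, and task bodies are placeholders, not details of the actual pipelines.

```python
# Minimal Airflow DAG sketch: extract ad spend, then load to BigQuery.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_ads_spend():
    ...  # e.g., pull yesterday's spend from an ads API into GCS

def load_to_bigquery():
    ...  # e.g., load the normalized file into a BigQuery staging table

with DAG(
    dag_id="ads_spend_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_ads_spend)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)
    extract >> load  # load runs only after extract succeeds
```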
What You Bring
● 4+ years of data engineering experience, ideally in a global, distributed team
● Strong Python development skills and experience
● Expert in SQL for data transformation, analysis, and debugging
● Deep knowledge of Airflow and orchestration best practices
● Proficient in DBT (data modeling, testing, release workflows)
● Experience with GCP (BigQuery, GCS, Composer); AWS familiarity is a plus
● Strong grasp of data governance, observability, and privacy standards
● Excellent written and verbal communication skills
Nice to have
● Experience working with digital marketing and performance data, including:
Google Ads, Meta (Facebook), TikTok, Taboola, Outbrain, Google Analytics (GA4)
● Familiarity with BI tools like Tableau or Looker
● Exposure to attribution models, media mix modeling, or A/B testing infrastructure
● Collaboration experience with data scientists or machine learning workflows
Why Join Us
● Monthly long weekends — every third Friday off
● Wellness reimbursement to support your health and balance
● Paid parental leave
● Remote-first with flexibility and trust
● Work with a world-class data and marketing team inside a globally recognized brand
About Kanerika:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Role Responsibilities:
The following are high-level responsibilities you will take on, though the role is not limited to them:
- Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
- Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
- Enable business analytics and self-service reporting through Power BI and other visualization tools.
- Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
- Implement and enforce best practices for data governance, data quality, and security.
- Mentor and guide junior data engineers; establish coding and design standards.
- Evaluate emerging technologies and tools to continuously improve the data ecosystem.
Required Qualifications:
- Bachelor's degree in computer science, Information Technology, Engineering, or a related field.
- Bachelor’s/ Master’s degree in Computer Science, Information Technology, Engineering, or related field.
- 4-11 years of experience in data engineering or data platform development
- Strong hands-on experience in SQL, Snowflake, Python, and Airflow
- Solid understanding of data modeling, data governance, security, and CI/CD practices.
Preferred Qualifications:
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need?
· B. Tech computer science or equivalent.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.