
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

TalentXO
Remote only
6 - 10 yrs
₹30L - ₹40L / yr
Agentic AI
Data Product Designer
AI/ML
UX
Figma
+4 more

Role & Responsibilities

Own the user experience for Dentsu's AI-powered agentic tools and client-facing data products. This is a senior design role responsible for making complex multi-agent systems, Genie spaces, and automated workflows feel simple and intuitive for media teams and clients who are not technical. You will work at the intersection of AI capability and human usability, designing the interfaces that turn powerful backend intelligence into tools people actually want to use.

Key Responsibilities:

  • Lead end-to-end design for agentic AI products: from discovery and user research through wireframes, prototypes, and production-ready specs
  • Design intuitive interfaces for multi-agent systems that serve media planners, analysts, and clients with varying levels of technical sophistication
  • Create UX flows for Genie spaces, conversational data exploration, and automated reporting dashboards that surface insights without requiring SQL or code
  • Develop and maintain a design system for the Decisioning practice's AI product suite, ensuring visual and interaction consistency across all tools
  • Conduct user research with internal media teams and client stakeholders to identify pain points, map workflows, and validate design decisions
  • Design transparency and trust patterns for AI-driven experiences: how users understand what the system did, why, and how to correct it
  • Prototype and test interaction models for agent-to-human handoff, error recovery, and multi-step automated workflows
  • Collaborate closely with AI engineers and data scientists to ensure designs are technically feasible and ship at high fidelity
  • Design onboarding flows and training materials that accelerate adoption of new AI tools across agencies
  • Create client-facing presentation materials, demos, and visual assets that communicate tool capabilities and business value

Ideal Candidate

  • Strong Agentic AI & Data Product Designer Profile
  • Mandatory (Experience 1): Must have 6+ years of total experience in design, with 5+ years in Product Design for data-heavy or complex digital products — enterprise dashboards, analytics tools, workflow platforms, or similar complex environments — with shipped work at scale.
  • Mandatory (Experience 2): Must have 6+ months of experience designing for AI/ML-powered products, such as Gen AI features, agentic AI features, or AI automation tools
  • Mandatory (Skill 1): Must have demonstrated expertise in complex workflow design, data visualization, and enterprise UX at scale — designing interfaces that surface insights and enable non-technical users to navigate powerful backend systems
  • Mandatory (Skill 2): Must have strong understanding of design systems and component-based design methodology, with experience building, contributing to, or maintaining systems that ensure visual and interaction consistency across a product suite
  • Mandatory (Skill 3): Must have the ability to design transparency and trust patterns for AI-driven experiences — including how users understand what the system did, why, and how to correct it; plus interaction models for agent-to-human handoff, error recovery, and multi-step automated workflows
  • Mandatory (Tools): Must have deep proficiency in Figma, including component libraries, auto-layout, and interactive prototyping
  • Mandatory (Stakeholder Mgmt & Communication): Must have excellent communication skills for presenting design rationale to engineering, product, and business stakeholders
  • Mandatory (Portfolio): Must have a strong portfolio demonstrating complex workflow design, data visualization work, and ideally AI/agentic or conversational interface projects.
  • Preferred (AI Interaction Design): Experience specifically designing chatbot, copilot, or agent-based interaction patterns
  • Preferred (Industry): Experience in media, advertising, or marketing technology industries


Quantiphi

at Quantiphi

3 candid answers
1 video
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 12 yrs
Best in industry
Python
SQL
ETL
Google Cloud Platform (GCP)
Windows Azure
+1 more

We are seeking a skilled Data Engineer to join the AI Platform Capabilities team supporting the UDP Uplift program.

In this role, you will design, build, and test standardized data and AI platform capabilities across a multi-cloud environment (Azure & GCP).

You will collaborate closely with AI use case teams to develop:

  • Scalable data pipelines
  • Reusable data products
  • Foundational data infrastructure

Your work will support advanced AI solutions such as:

  • GenAI
  • RAG (Retrieval-Augmented Generation)
  • Document Intelligence

Key Responsibilities

  • Design and develop scalable ETL/ELT pipelines for AI workloads
  • Build and optimize data pipelines for structured & unstructured data
  • Enable context processing & vector store integrations
  • Support streaming data workflows and batch processing
  • Ensure adherence to enterprise data models, governance, and security standards
  • Collaborate with DataOps, MLOps, Security, and business teams (LBUs)
  • Contribute to data lifecycle management for AI platforms
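The context-processing and vector-store bullet above ultimately rests on similarity search over embeddings. A minimal pure-Python sketch of that idea (the toy 3-dimensional "embeddings" and document ids are invented; a real RAG pipeline would query a managed vector store):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query, docs, k=2):
    """Return the k document ids most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:k]]

# Toy 3-dimensional "embeddings" standing in for real model output
docs = [
    {"id": "a", "vec": [1.0, 0.0, 0.0]},
    {"id": "b", "vec": [0.9, 0.1, 0.0]},
    {"id": "c", "vec": [0.0, 1.0, 0.0]},
]
```

A production vector store does the same ranking with approximate-nearest-neighbour indexes so it scales past brute force.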

Required Skills

  • 5–7 years of hands-on experience in Data Engineering
  • Strong expertise in Python and advanced SQL
  • Experience with GCP and/or Azure cloud-native data services
  • Hands-on experience with PySpark / Spark SQL
  • Experience building data pipelines for ML/AI workloads
  • Understanding of CI/CD, Git, and Agile methodologies
  • Knowledge of data quality, governance, and security practices
  • Strong collaboration and stakeholder management skills

Nice-to-Have Skills

  • Experience with Vector Databases / Vector Stores (for RAG pipelines)
  • Familiarity with MLOps / GenAIOps concepts (feature stores, model registries, prompt management)
  • Exposure to Knowledge Graphs / Context Stores / Document Intelligence workflows
  • Experience with DBT (Data Build Tool)
  • Knowledge of Infrastructure-as-Code (Terraform)
  • Experience in multi-cloud deployments (Azure + GCP)
  • Familiarity with event-driven systems (Kafka, Pub/Sub) & API integrations

Ideal Candidate Profile

  • Strong data engineering foundation with AI/ML exposure
  • Experience working in multi-cloud environments
  • Ability to build production-grade, scalable data systems
  • Comfortable working in cross-functional, fast-paced environments
Bengaluru (Bangalore)
2 - 5 yrs
₹20.4L - ₹24L / yr
Python
API
SQL
Systems design
Software deployment

Location: Bangalore

Experience: 2–5 years

Type: Full-time | On-site

Open Roles: 2

Start: Immediate

Why this role exists

Most systems work at a low scale.

Very few survive real production load, complex workflows, and enterprise edge cases.

We are building a platform that must:

  • Scale from 500K → 20M+ interactions/month
  • Handle complex insurance workflows reliably
  • Become easier to deploy as it grows, not harder

This role exists to build the backend foundation that makes this possible.

What you’ll do

You will not just write services.

You will design and own core platform systems.

1. Scale the platform without breaking architecture

  • Scale from 50K → 2M+ interactions/month
  • Ensure:
      • High availability
      • Low latency
      • Fault tolerance
  • Avoid large rewrites — build systems that evolve cleanly

2. Build the workflow automation (WA) engine

  • Design a flexible system with:
      • States
      • Stages
      • Cohorts
      • Dynamic workflows
  • Ensure workflows:
      • Handle edge cases reliably
      • Can be configured easily
  • Move from hardcoded flows → a configurable execution engine
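The move from hardcoded flows to a configurable execution engine, with states and reliable edge-case handling, can be sketched as a table-driven state machine. The states and events below are illustrative placeholders, not the actual product's:

```python
# Minimal sketch of a configurable workflow engine: states and allowed
# transitions live in data, not code, so new flows need no rewrite.
TRANSITIONS = {
    ("new", "verify"): "verified",
    ("verified", "approve"): "approved",
    ("verified", "reject"): "rejected",
}

class Workflow:
    def __init__(self, state="new"):
        self.state = state

    def apply(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:  # edge case: illegal transition
            raise ValueError(f"cannot '{event}' from '{self.state}'")
        self.state = TRANSITIONS[key]
        return self.state
```

Because the transition table is plain data, a deployment team could load it from configuration instead of shipping code.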

3. Build the insurance-specific data layer

  • Design data models for:
      • Policy states
      • Claim workflows
      • Consent tracking
  • Ensure the system works across:
      • Multiple insurers
      • Multiple use cases
  • Build a platform-first data layer, not use-case-specific hacks
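A platform-first data layer for policy states and consent tracking might start from small, insurer-agnostic models. A hedged sketch (all class and field names are invented for illustration):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative platform-first models: insurer-agnostic fields only,
# so the same schema serves multiple insurers and use cases.
@dataclass
class Consent:
    channel: str                      # e.g. "whatsapp", "email"
    granted: bool
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Policy:
    policy_id: str
    insurer: str
    state: str = "issued"             # issued -> active -> lapsed/claimed
    consents: list = field(default_factory=list)

    def record_consent(self, channel, granted):
        self.consents.append(Consent(channel, granted))
```

Keeping insurer-specific quirks out of the core types is what makes the layer reusable across accounts.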

4. Make deployment and setup simple

  • Ensure workflows and data models are:
      • Easy to configure
      • Easy to launch
  • Reduce friction for:
      • Product teams
      • Deployment teams

5. Create a compounding data advantage

  • Build a data layer that:
      • Improves with every deployment
      • Captures structured signals
  • Ensure data becomes a long-term edge, not just storage

6. Own production reliability

  • Participate in the on-call rotation across 3 engineers
  • Ensure:
      • Incidents are handled quickly
      • Root causes are fixed permanently
  • Build systems where reliability is shared, not individual

What success looks like

  • Platform scales to 2M+ interactions/month smoothly
  • Workflow engine supports complex, dynamic use cases
  • Data layer enables fast deployment across accounts
  • Edge cases are handled without constant firefighting
  • System becomes easier to use as it grows
  • Production issues are rare and predictable

Who you are

  • You have 2-5 years of backend engineering experience
  • You have built:
      • Scalable systems
      • Distributed services
  • You think in:
      • Systems
      • Data models
      • Trade-offs
  • You are comfortable owning:
      • Architecture
      • Production systems

What will make you stand out

  • Experience building:
      • Workflow engines
      • State machines
      • Data-heavy platforms
  • Strong understanding of:
      • System design
      • Distributed systems
      • Failure handling
  • Experience working in high-scale production environments

Why join

  • You will build the core backend of an AI platform
  • Your work directly impacts:
      • Scale
      • Reliability
      • Product capability
  • You will design systems that move from use-case-specific → platform-level infrastructure

What this role is not

  • Not just API development
  • Not limited to feature-level work
  • Not disconnected from production realities

What this role is

  • A system architect
  • A builder of scalable platforms
  • A driver of long-term technical advantage

One question to self-evaluate

Can you design backend systems that scale, handle edge cases, and become easier to use as they grow?


LearnTube.ai

at LearnTube.ai

2 candid answers
Vinayak Sharan
Posted by Vinayak Sharan
Remote, Mumbai
3 - 6 yrs
₹14L - ₹32L / yr
Python
FastAPI
Docker
Amazon Web Services (AWS)
SQL
+3 more

Role Overview:


As a Backend Developer at LearnTube.ai, you will ship the backbone that powers 2.3 million learners in 64 countries—owning APIs that crunch 1 billion learning events & the AI that supports it with <200 ms latency.


Skip the wait and get noticed faster by completing our AI-powered screening. Click this link to start your quick interview: https://bit.ly/LT_Python. It only takes a few minutes and could be your shortcut to landing the job!


What You'll Do:


At LearnTube, we’re pushing the boundaries of Generative AI to revolutionize how the world learns. As a Backend Engineer, your roles and responsibilities will include:

  • Ship Micro-services – Build FastAPI services that handle ≈ 800 req/s today and will triple within a year (sub-200 ms p95).
  • Power Real-Time Learning – Drive the quiz-scoring & AI-tutor engines that crunch millions of events daily.
  • Design for Scale & Safety – Model data (Postgres, Mongo, Redis, SQS) and craft modular, secure back-end components from scratch.
  • Deploy Globally – Roll out Dockerised services behind NGINX on AWS (EC2, S3, SQS) and GCP (GKE) via Kubernetes.
  • Automate Releases – GitLab CI/CD + blue-green / canary = multiple safe prod deploys each week.
  • Own Reliability – Instrument with Prometheus / Grafana, chase 99.9 % uptime, trim infra spend.
  • Expose Gen-AI at Scale – Publish LLM inference & vector-search endpoints in partnership with the AI team.
  • Ship Fast, Learn Fast – Work with founders, PMs, and designers in weekly ship rooms; take a feature from Figma to prod in < 2 weeks.


What makes you a great fit?


Must-Haves:

  • 3+ yrs Python back-end experience (FastAPI)
  • Strong with Docker & container orchestration
  • Hands-on with GitLab CI/CD, AWS (EC2, S3, SQS) or GCP (GKE / Compute) in production
  • SQL/NoSQL (Postgres, MongoDB) + You’ve built systems from scratch & have solid system-design fundamentals

Nice-to-Haves

  • Kubernetes (k8s) at scale, Terraform
  • Experience with AI/ML inference services (LLMs, vector DBs)
  • Go / Rust for high-perf services
  • Observability: Prometheus, Grafana, OpenTelemetry


About Us: 


At LearnTube, we’re on a mission to make learning accessible, affordable, and engaging for millions of learners globally. Using Generative AI, we transform scattered internet content into dynamic, goal-driven courses with:

  • AI-powered tutors that teach live, solve doubts in real time, and provide instant feedback.
  • Seamless delivery through WhatsApp, mobile apps, and the web, with over 1.4 million learners across 64 countries.


Meet the Founders: 


LearnTube was founded by Shronit Ladhani and Gargi Ruparelia, who bring deep expertise in product development and ed-tech innovation. Shronit, a TEDx speaker, is an advocate for disrupting traditional learning, while Gargi’s focus on scalable AI solutions drives our mission to build an AI-first company that empowers learners to achieve career outcomes. We’re proud to be recognised by Google as a Top 20 AI Startup and are part of their 2024 Startups Accelerator: AI First Program, giving us access to cutting-edge technology, credits, and mentorship from industry leaders.


Why Work With Us? 


At LearnTube, we believe in creating a work environment that’s as transformative as the products we build. Here’s why this role is an incredible opportunity:

  • Cutting-Edge Technology: You’ll work on state-of-the-art generative AI applications, leveraging the latest advancements in LLMs, multimodal AI, and real-time systems.
  • Autonomy and Ownership: Experience unparalleled flexibility and independence in a role where you’ll own high-impact projects from ideation to deployment.
  • Rapid Growth: Accelerate your career by working on impactful projects that pack three years of learning and growth into one.
  • Founder and Advisor Access: Collaborate directly with founders and industry experts, including the CTO of Inflection AI, to build transformative solutions.
  • Team Culture: Join a close-knit team of high-performing engineers and innovators, where every voice matters, and Monday morning meetings are something to look forward to.
  • Mission-Driven Impact: Be part of a company that’s redefining education for millions of learners and making AI accessible to everyone.
Vivanet
Mumbai
10 - 15 yrs
Best in industry
French
Business Analysis
AWS IAM
SQL
Data modeling
+16 more

Location: Mumbai

Type: Contract - 6+ months (can be extended based upon performance)

Mode: Work From Office (Full - Time)


Goals and deliverables


Summary: A new GIT platform will be created in our existing Captive in India (CASPL) and will operate some activities of the Digital Centre of Excellence (DEC). The DEC manages a transverse offer of digital products and delivers IT services through its centres of excellence. Activities encompass the development of reusable components (building blocks) and the development and maintenance of business solutions that leverage multiple areas of expertise. The role requires professionals with technical competencies and noticeable experience on critical services in the context of an investment bank. Working in the DEC requires the ability to collaborate extensively across geographies with other IT professionals and non-IT functions, as well as strong motivation to support the digital transformation of the bank.


Key Responsibilities

• Work with business and technical teams to gather and translate requirements into technical specifications for the development of software applications.

• Collaborate with the development team to ensure effective and efficient integration of back-end systems and front-end interfaces.

• Act as a bridge between stakeholders and the software development team, ensuring clarity and alignment throughout the development lifecycle.

• Comply with the software development life cycle (SDLC), project management methodology, business technology architecture, and risk and production capacity - including development of project documentation of system requirements, estimates of scope and cost.

• Plan, prepare and execute the qualification strategy and tests of the target solutions.

• Ensure quality assurance before allowing transition to the User Acceptance Testing (UAT) phase.

• Develop and manage technical documentation, including use cases, process flows, and functional specifications.

• Analyze and troubleshoot issues during development and post-deployment, offering solutions and optimizations.

• Support the development team in designing solutions that align with best practices, ensuring system security and user access controls.

• Participate in Agile ceremonies such as sprint planning, stand-ups, and retrospectives to maintain development momentum.

• Provide L3 support and troubleshooting of production incidents.


Communication

Key Internal Contacts

• GIT CASPL DEC – Operation manager

• GIT CASPL DEC – Squad leader

• GIT ISAP DEC – Product manager


Legal and Regulatory Responsibilities

• Comply with all applicable legal, regulatory and internal Compliance requirements, including, but not limited to, the local Compliance manual and the Financial Crime Policy. Complete any mandatory training in line with legal, regulatory and internal Compliance requirements.

• Maintain appropriate knowledge to remain fully qualified to undertake the role. Complete all mandatory training as required to attain and maintain competence.

• Refrain from taking any steps which could lead to the removal of certification of fitness and properness to perform the role.

• Undertake all necessary steps to satisfy the annual certification process.

• Comply with all applicable conduct rules as prescribed by the relevant regulator.


ROLE REQUIREMENTS:

• Hands-on experience of coordinating between business and IT teams (across various locations) and acting as primary functional support for development teams.

• Strong ability to interpret business needs and translate them into requirements.

• Knowledge of project cycle methodologies and management.

• Knowledge of the software development life cycle and software testing life cycle.

• Experience in writing test documents and performing functional tests.

• Ability to anticipate risks.

• Ability to identify and determine enhancements that improve business processes and mitigate/reduce risks.

• Strong experience working with Microsoft Office tools and JIRA.

• Proficiency in SQL database queries.

• Ability to prepare datasets for analysis and validation purposes.


Qualifications/Education Required:


Education: Bachelor or better - Computer Science Technology, Physics or Math.


Experience Required: This position requires a minimum of 7 years of relevant experience in software/system analysis supporting the development of enterprise-wide complex information systems in the cyber security space within major organizations, ideally in a banking environment.


Specialist Training Required

Cyber security certification is a plus.

Approved Person Registration


Competencies Required:

• Excellent communication in English, both verbal and written, for translating technical concepts to non-technical stakeholders.

• Strong analytical and problem-solving skills.

• Detail-oriented and organized, with the ability to manage multiple tasks and deadlines.

• Collaborative team player with a proactive approach to problem-solving.

• Proactive style of working, organizational skills

• Ability to multi-task and work independently with minimal supervision

• A passion for continuous learning and improvement in software development methodologies and best practices.


Skills & Knowledge Requirements

FRENCH SPEAKER

• Domain experience in Identity and Access Management (IAM), including knowledge of IAM protocols (OAuth, SAML, LDAP, SSO).

• Knowledge in software development life cycle (SDLC)

• Knowledge in software quality assurance, planning and execution.

• Knowledge of Business Process Modelling Notation (BPMN)

• Exceptional understanding of data modelling (Entity Relationship Diagrams).

• Knowledge of Microsoft Office tools, JIRA (Agile, Kanban).

• Knowledge of relational databases.


Expected skills

IAM protocol - Confirmed

Data modelling - Expert

Software development life cycle - Confirmed

Business process modelling Notation - Confirmed

SAML - Confirmed

Languages

French

SDS softwares

at SDS softwares

2 candid answers
1 recruiter
Tanavee Sharma
Posted by Tanavee Sharma
Remote only
0.6 - 0.8 yrs
₹0.8L - ₹0.9L / yr
Business Analysis
PowerBI
BRD
Tableau
MS-Excel
+5 more

Job Title: Business Analyst (BA)

Job Type: Full-Time | Remote | 5 Days Working

Salary: ₹7,000 – ₹8,000 per month

Experience Required: 6 months to 1 year (Freshers with internship experience can apply)

Joining: Immediate Joiners Only

About the Role:

We are looking for freshers who have strong foundational skills and knowledge in both Business Analysis and manual testing. In this position you will be responsible for manually handling tasks across both business analysis and testing functions.

Key Responsibilities:

  • Gather and analyze business requirements from stakeholders
  • Create documentation such as BRDs, FRDs, user stories, and process flows
  • Perform manual testing of software applications
  • Prepare test cases, test plans, and report bugs clearly
  • Collaborate with development and business teams to ensure product quality and requirement clarity
  • Provide timely updates and reports on progress and findings

Requirements:

  • Must have skills and knowledge in Business Analysis
  • Must be able to manage both roles manually and independently
  • Proficiency in tools related to BA
  • Excellent communication skills in English (spoken and written)
  • Must have a personal laptop and a stable internet connection
  • Must be available to join immediately

Who Should Apply:

  • Freshers with 6 months to 1 year of experience in relevant roles
  • Candidates who are confident in handling both BA and testing responsibilities
  • Individuals looking to build a strong foundation in both domains in a remote, full-time role


Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Mumbai, Bengaluru (Bangalore)
4 - 6 yrs
₹3L - ₹11L / yr
.NET
ASP.NET
C#
Docker
Microservices
+1 more

🚀 Hiring: .NET Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Mumbai and Bangalore

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)



We are looking for a skilled .NET Developer to design and develop scalable microservices and enterprise-grade applications. The role involves building secure REST APIs, writing clean and testable code, working with Docker-based deployments, and collaborating with cross-functional teams.


Key Responsibilities:

  • Develop .NET Core microservices
  • Build and secure REST APIs
  • Write unit & integration tests
  • Deploy applications using Docker
  • Ensure performance optimization and code quality


3 Mandatory Skills

  1. .NET Core / ASP.NET Core Web API
  2. Microservices & Docker
  3. REST API development with Unit Testing





BigThinkCode Technologies
Kumar AGS
Posted by Kumar AGS
Chennai
4 - 6 yrs
₹1L - ₹13L / yr
SQL
Data modeling
Pipeline management
Apache
Google BigQuery

At BigThinkCode, our technology solves complex problems. We are looking for a talented Data Engineer to join our Data team in Chennai.

 

Our ideal candidate will have expert knowledge of software development processes, programming, and problem-solving skills. This is an opportunity to join a growing team and make a substantial impact at BigThinkCode Technologies.

 

Please see our job description below; if interested, apply or reply with your profile to connect and discuss.

 

Company: BigThinkCode Technologies

URL: https://www.bigthinkcode.com/

Work location: Chennai (work from office)

Experience required: 4 - 6 years


Joining time: Immediate – 4 weeks

Work Mode: Work from office (Hybrid)

 

Job Overview:

We are seeking a skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. You will play a pivotal role in optimizing data flow, ensuring scalability, and enabling seamless access to structured/unstructured data across the organization. The ideal candidate will design, build, and optimize scalable data pipelines with strong SQL proficiency, data modelling expertise.

Key Responsibilities:

  • Design, develop, and maintain scalable pipelines to process structured and unstructured data.
  • Optimize and manage SQL queries for performance and efficiency in large-scale datasets.
  • Experience working with data warehouse solutions (e.g., Redshift, BigQuery, Snowflake) for analytics and reporting.
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
  • Experience in Implementing solutions for streaming data (e.g., Apache Kafka, AWS Kinesis) is preferred but not mandatory.
  • Ensure data quality, governance, and security across pipelines and storage systems.
  • Document architectures, processes, and workflows for clarity and reproducibility.

Required Technical Skills:

  • 4 or more years of experience in the Data Engineering field.
  • Expertise in SQL (complex queries, optimization, and database design).
  • Ability to write optimized, production-grade SQL scripts for transformations and data validation.
  • Solid understanding of, and hands-on experience with, building data pipelines and common pipeline patterns.
  • Proficiency in a programming language such as Python or R for scripting, automation, and pipeline development.
  • Hands-on experience with Google BigQuery and Apache Airflow.
  • Experience working on cloud-based platforms such as AWS, GCP, or Azure.
  • Experience working with structured data (RDBMS) and unstructured data (JSON, Parquet, Avro).
  • Familiarity with cloud-based data warehouses (Redshift, BigQuery, Snowflake).
  • Knowledge of version control systems (e.g., Git) and CI/CD practices.

Why Join Us:

  • Collaborative work environment.
  • Exposure to modern tools and scalable application architectures.
  • Medical cover for employee and eligible dependents.
  • Tax-beneficial salary structure.
  • Comprehensive leave policy.
  • Competency development training programs.

 

 

Remote, Noida, Gurugram, Pune, Nagpur, Jaipur, Gandhinagar
8 - 14 yrs
₹12L - ₹18L / yr
skill iconPython
SQL
PySpark
Databricks
Snowflake schema
+6 more

Senior Data Engineer (Databricks, BigQuery, Snowflake)

Experience: 8+ Years in Data Engineering

Location: Remote | Onsite (Noida, Gurgaon, Pune, Nagpur, Jaipur, Gandhinagar)

Budget: Open / Competitive


Job Summary:

We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable data solutions that support advanced analytics and machine learning initiatives. You will lead the development of reliable, high-performance data systems and collaborate closely with data scientists to enable data-driven decision-making.

In this role, we expect a forward-thinking professional who utilizes AI-augmented development tools (such as Cursor, Windsurf, or GitHub Copilot) to increase engineering velocity and maintain high code standards in a modern enterprise environment.


Key Responsibilities:

  • Scalable Pipelines: Design, develop, and optimize end-to-end data pipelines using SQL, Python, and PySpark.
  • ETL/ELT Workflows: Build and maintain workflows to transform raw data into structured, analytics-ready datasets.
  • ML Integration: Partner with data scientists to deploy and integrate machine learning models into production environments.
  • Cloud Infrastructure: Manage and scale data infrastructure within AWS and Azure ecosystems.
  • Data Warehousing: Utilize Databricks and Snowflake for big data processing and enterprise warehousing.
  • Automation & IaC: Implement workflow orchestration using Apache Airflow and manage infrastructure as code via Terraform.
  • Performance Tuning: Optimize data storage, retrieval, and system performance across data warehouse platforms.
  • Governance & Compliance: Ensure data quality and security using tools like Unity Catalog or Hive Metastore.
  • AI-Augmented Development: Integrate AI tools and LLM APIs into data pipelines and use AI IDEs to streamline debugging and documentation.


Technical Requirements:

  • Experience: 8+ years of core Data Engineering experience in large-scale enterprise or consulting environments.
  • Languages: Expert proficiency in SQL and Python for complex data processing.
  • Big Data: Hands-on experience with PySpark and large-scale distributed computing.
  • Architecture: Strong understanding of ETL frameworks, data pipeline architecture, and data warehousing best practices.
  • Cloud Platforms: Deep working knowledge of AWS and Azure.
  • Modern Tooling: Proven experience with Databricks, Snowflake, and Apache Airflow.
  • Infrastructure: Experience with Terraform or similar IaC tools for scalable deployments.
  • AI Competency: Proficiency in using AI IDEs (Cursor/Windsurf) and integrating AI/ML models into production data flows.


Preferred Qualifications:

  • Exposure to data governance and cataloging tools (e.g., Unity Catalog).
  • Knowledge of performance tuning for massive-scale big data systems.
  • Familiarity with real-time data processing frameworks.
  • Experience in digital transformation and sustainability-focused data projects.
Wissen Technology

at Wissen Technology

4 recruiters
Meghana Shinde
Posted by Meghana Shinde
Pune
8 - 12 yrs
Best in industry
Business Analysis
Risk Management
BRD
FRD
SQL

Job Description: Business Analyst (Capital Markets / Investment Management)

Position Summary

We are seeking an experienced Business Analyst with strong techno-functional expertise in Capital Markets, Investment Banking, Asset Management, and Risk Management. The ideal candidate will have hands-on experience across the full trade lifecycle, UAT leadership, risk frameworks, FIX protocol, and digital transformation initiatives. This role requires close collaboration with front-, middle-, and back-office stakeholders, IT teams, and external vendors to deliver critical business and regulatory solutions.

Key Responsibilities

Business Analysis & Requirements Management

· Lead requirements gathering, documentation (BRD, FRD, User Stories), workflow mapping, and gap analysis.

· Conduct JAD sessions with Trading Desks, Portfolio Managers, Risk, Compliance, and Technology teams.

· Translate business requirements into detailed functional specifications and acceptance criteria.

· Manage and prioritize product backlogs using Agile/Scrum methodologies.

Trade Lifecycle & Capital Markets Expertise

· Support end-to-end trade flows across Equities, Derivatives, Fixed Income, Forex, Options, ETFs, Private Equity, and Structured Products.

· Validate front-to-back trade processes including order placement, execution, allocations, settlement, reconciliation, and reporting.

· Work with OMS/EMS platforms, market connectivity, and brokerage systems.
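Validating front-to-back trade flows usually means reconciling one system's records against another's. A toy version of that check, expressed as the kind of SQL a BA would write (the `oms_trades`/`settlements` tables and their columns are invented for illustration):

```python
import sqlite3

# Illustrative reconciliation: find OMS executions with no exactly
# matching settlement record. Table and column names are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE oms_trades (trade_id TEXT, qty INTEGER, price REAL);
CREATE TABLE settlements (trade_id TEXT, qty INTEGER, price REAL);
INSERT INTO oms_trades VALUES ('T1', 100, 10.5), ('T2', 50, 99.0), ('T3', 75, 20.0);
INSERT INTO settlements VALUES ('T1', 100, 10.5), ('T3', 70, 20.0);
""")
breaks = conn.execute("""
    SELECT o.trade_id
    FROM oms_trades o
    LEFT JOIN settlements s
      ON s.trade_id = o.trade_id AND s.qty = o.qty AND s.price = o.price
    WHERE s.trade_id IS NULL
    ORDER BY o.trade_id
""").fetchall()
print(breaks)  # T2 never settled; T3 settled with a quantity break
```

The anti-join pattern (`LEFT JOIN ... WHERE ... IS NULL`) is the standard way to surface breaks for investigation.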

Risk Management (Market, Model, Liquidity, Credit)

· Analyze VaR, stress testing, scenario analysis, exposure calculations, and liquidity metrics (LCR/NSFR).

· Contribute to market risk policy formulation, governance, and regulatory compliance.

· Identify risk hotspots, process gaps, and control weaknesses with actionable remediation plans.

· Support regulatory reporting including Mark-to-Market and Notional Change requirements.
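The VaR analysis mentioned above is often explained with historical simulation. A hedged, nearest-rank sketch (the P&L series is made up; real desks use full-revaluation scenario data and more careful quantile conventions):

```python
# Toy one-day historical-simulation VaR. The quantile convention
# (nearest rank) is one of several used in practice.

def historical_var(pnl, confidence=0.99):
    """Loss threshold exceeded roughly (1 - confidence) of the time."""
    losses = sorted(-p for p in pnl)  # losses as positive numbers, ascending
    idx = min(round(confidence * len(losses)), len(losses) - 1)
    return losses[idx]

daily_pnl = list(range(-50, 50))      # fabricated P&L history
print(historical_var(daily_pnl, 0.99))
```

Stress testing then amounts to re-running the same calculation over shocked scenario sets rather than the observed history.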

UAT, QA & Testing Leadership

· Lead end-to-end UAT cycles for trading, risk, and investment applications.

· Create test plans, test cases, and defect logs; track issues through JIRA until closure.

· Perform regression, functional, and production validation testing.

· Coordinate with QA, development teams, and Front Office for seamless deployment.

FIX Protocol & System Integrations

· Gather and validate FIX requirements for OMS/EMS integration.

· Support FIX message mapping, configuration, certification, and UAT.

· Collaborate with brokers, exchanges, and internal development teams for connectivity and workflow enhancements.

Client Management & Onboarding (Buy-side/Sell-side)

· Manage onboarding for clients such as Hedge Funds, Family Offices, Asset Managers, and Prime Brokers.

· Conduct requirement workshops, product demos, trainings, and post-implementation support.

· Serve as the primary point of contact for issue resolution, escalations, and enhancement discussions.

Project & Stakeholder Management

· Drive project plans, milestones, and sprint activities (Planning, Grooming, Stand-ups, Retrospectives).

· Ensure alignment between business needs and technology delivery.

· Prepare executive-level dashboards, presentations, and risk summaries for senior stakeholders.

Skills & Competencies

Technical Skills

· Tools & Platforms: Bloomberg, Refinitiv, FactSet, BlackRock Aladdin, Robinhood, IRIS, Falcon

· Databases: SQL, Excel (advanced), data reconciliation tools

· Project Tools: JIRA, Monday.com, Confluence, MS Visio, Axure

· Risk Systems: VAR models, stress testing tools, exposure monitoring systems

Core Competencies

· Strong stakeholder management & communication

· Business rules analysis & functional documentation

· UI/UX requirement mapping

· Data migration & system integration

· Analytical thinking & problem-solving

· Cross-functional collaboration

Qualifications

· 9+ years of experience in Capital Markets, Investment Management, and Trading/Risk Systems.

· MBA Finance (preferred) / BBA Finance.

· Certifications:

o NISM – Equity, Derivatives, Options Strategies

o CFI – Fixed Income Fundamentals

o Microsoft – Career Essentials in Business Analysis

o FRM (GARP) – Pursuing

Preferred Experience

· Working on end-to-end trading platform implementations.

· Exposure to Hedge Funds, PMS, AIF, Private Equity, and Wealth Management workflows.

· Knowledge of regulatory frameworks (Basel II–IV, SEBI, Risk Governance).

· Experience authoring policies, SOPs, and process documentation.

Soft Skills

· Excellent verbal and written communication.

· Strong analytical and quantitative capabilities.

· Ability to translate technical concepts to business stakeholders.

· High ownership, deadline orientation, and team collaboration skills.

 

 

Read more
Zethic Technologies

at Zethic Technologies

1 recruiter
Pooja G
Posted by Pooja G
Remote only
18 - 22 yrs
₹18L - ₹22L / yr
skill iconPython
Artificial Intelligence (AI)
Large Language Models (LLM)
RESTful APIs
skill iconDjango
+4 more

Python Developer (AI Integration Focus) – Junior:

Technical Requirements

* Strong fundamentals in Python

* Experience building REST APIs

* Familiarity with:

o FastAPI / Flask / Django

o JSON, async programming basics

* Basic understanding of:

o LLM APIs (Azure OpenAI or equivalent)

o Prompt-based integrations

o Prompt Engineering

* Exposure to:

o Git and CI/CD pipelines

o Azure cloud fundamentals

* Basic database knowledge (SQL / NoSQL)
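The "async programming basics" bullet above typically means knowing how to fan out concurrent awaitables. A minimal, dependency-free pattern (`fetch_profile` is a stand-in for a real I/O call such as a database query or LLM API request):

```python
import asyncio

# Minimal async fan-out: launch several awaitable calls concurrently
# and gather their results. fetch_profile is a hypothetical stand-in
# for real network I/O.

async def fetch_profile(user_id):
    await asyncio.sleep(0)  # placeholder for network latency
    return {"id": user_id, "name": f"user-{user_id}"}

async def main():
    return await asyncio.gather(*(fetch_profile(i) for i in range(3)))

print(asyncio.run(main()))
```

The win over sequential awaits is that the sleeps (real latencies) overlap instead of adding up.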

Core Engineering

* Advanced proficiency in Python

* Strong experience in:

o FastAPI / Django

o Async programming

o Event-driven architectures

o Microservices design

* Experience with:

o Azure/AWS cloud services

o Containerization (Docker, Kubernetes)

o API management / gateway design


AI & Agentic Capabilities

* Strong understanding of:

o LLM ecosystems (Claude, GPT, Gemini)

o LLM integration patterns

o Prompt engineering (few-shot, structured prompting, chaining)

o Tool invocation frameworks

* Experience with:

o Agentic frameworks and orchestration

o Workflow coordination across multiple AI services

o RAG architectures and patterns

o Vector databases

* Familiarity with:

o MCP connectors or contextual integration frameworks
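At the heart of the RAG architectures listed above sits a similarity search over embeddings. A toy retrieval step, with tiny hand-made vectors standing in for what an embedding model and vector database would provide:

```python
import math

# Toy RAG retrieval: rank stored chunks by cosine similarity to a
# query embedding. Chunk names and 3-d vectors are fabricated; real
# systems use model-produced embeddings and a vector DB index.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

chunks = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.3],
    "api rate limits": [0.0, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]
ranked = sorted(chunks, key=lambda c: cosine(query, chunks[c]), reverse=True)
print(ranked[0])  # the top chunk is what gets stuffed into the prompt
```

Everything downstream (prompt assembly, tool invocation, agent orchestration) builds on this ranking step.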


Enterprise Integration

* Experience integrating AI layers with legacy enterprise systems

* Strong understanding of:

o API scalability

o Distributed system resilience


o High-availability architectures

Read more
Bengaluru (Bangalore)
5 - 10 yrs
₹1L - ₹10L / yr
databricks
PySpark
Apache Spark
ETL
CI/CD
+10 more

Profile - Databricks Developer

Experience- 5+ years

Location- Bangalore (On site)

PF & BGV is Mandatory


Job Description: -

* Design, build, and optimize data pipelines and ETL/ELT workflows using Databricks and Apache Spark (PySpark).

* Develop scalable, high-performance data solutions using Spark distributed processing.

* Lead engineering initiatives focused on automation, performance tuning, and platform modernization.

* Implement and manage CI/CD pipelines using Git-based workflows and tools such as GitHub Actions or Jenkins.

* Collaborate with cross-functional teams to translate business needs into technical solutions.

* Ensure data quality, governance, and security across all processes.

* Troubleshoot and optimize Spark jobs, Databricks clusters, and workflows.

* Participate in code reviews and develop reusable engineering frameworks.

* Should have knowledge of utilizing AI tools to improve productivity and support daily engineering activities.

* Strong knowledge and hands-on experience in Databricks Genie, including prompt engineering, workspace usage, and automation.

Required Skills & Experience:

* 5+ years of experience in Data Engineering or related fields.

* Strong hands-on expertise in Databricks (notebooks, Delta Lake, job orchestration).

* Deep knowledge of Apache Spark (PySpark, Spark SQL, optimization techniques).

* Strong proficiency in Python for data processing, automation, and framework development.

* Strong proficiency in SQL, including complex queries, performance tuning, and analytical functions.

* Strong knowledge of Databricks Genie and leveraging it for engineering workflows.

* Strong experience with CI/CD and Git-based development workflows.

* Proficiency in data modeling and ETL/ELT pipeline design.

* Experience with automation frameworks and scheduling tools.

* Solid understanding of distributed systems and big data concepts.
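The "complex queries ... analytical functions" requirement above usually refers to window functions. A classic example, latest record per key via ROW_NUMBER(), shown here against an in-memory SQLite database with an invented schema (the same SQL runs on Spark SQL/Databricks):

```python
import sqlite3

# Window-function sketch: pick the most recent reading per device.
# The events table and its columns are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (device TEXT, ts INTEGER, reading REAL);
INSERT INTO events VALUES ('a', 1, 10.0), ('a', 2, 12.5), ('b', 1, 7.0);
""")
latest = conn.execute("""
    SELECT device, reading FROM (
        SELECT device, reading,
               ROW_NUMBER() OVER (PARTITION BY device ORDER BY ts DESC) AS rn
        FROM events
    ) WHERE rn = 1 ORDER BY device
""").fetchall()
print(latest)
```

The subquery-plus-`rn = 1` filter is the portable way to deduplicate to the newest row, since window functions cannot appear directly in a WHERE clause.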

Read more
Gradera AI Technologies
Sirisha Jonnada
Posted by Sirisha Jonnada
Hyderabad
4 - 7 yrs
₹20L - ₹50L / yr
skill iconPython
SQL
databricks

Role & Responsibilities

 

·      Collect, clean, and analyze large structured and unstructured datasets from multiple internal and external sources

·      Conduct thorough exploratory data analysis (EDA) to understand data distributions, relationships, outliers, and missing value patterns

·      Profile and audit datasets to assess data quality, completeness, consistency, and fitness for modeling

·      Investigate and document data lineage — understanding where data originates, how it flows, and how it transforms across systems

·      Identify and resolve data anomalies, inconsistencies, and integrity issues in collaboration with data engineering teams

·      Develop a deep understanding of the business domain and the underlying data that represents it — including what each field means, how it is captured, and what its limitations are

·      Translate raw, messy, real-world data into clean, well-understood analytical datasets ready for modeling and reporting

·      Apply statistical techniques such as correlation analysis, hypothesis testing, variance analysis, and distribution fitting to extract meaningful signals from noise

·      Build and deploy machine learning models including regression, classification, clustering, NLP, and time-series analysis

·      Design, evaluate, and analyze A/B experiments and controlled tests using causal inference techniques

·      Develop data-driven recommendations backed by rigorous statistical reasoning

·      Write clean, production-ready code in Python or R

·      Collaborate with data engineers to build reliable data pipelines and feature stores

·      Deploy and monitor ML models using MLOps best practices on cloud infrastructure

·      Build dashboards and self-serve analytics tools to support stakeholder decision-making

 

Data Understanding & Analysis Skills

 

·      Strong ability to interrogate unfamiliar datasets and quickly develop a working understanding of their structure, semantics, and quirks

·      Experience working with messy, incomplete, or poorly documented real-world data

·      Skilled in identifying hidden patterns, trends, seasonality, and anomalies through visual and statistical exploration

·      Ability to ask the right questions about data — challenging assumptions, validating sources, and understanding the context in which data was collected

·      Proficiency in data profiling, descriptive statistics, and summary reporting to communicate the shape and health of a dataset

·      Experience creating data dictionaries, documentation, and data quality reports to support team-wide data understanding

·      Comfort working across structured (relational tables), semi-structured (JSON, XML), and unstructured (text, logs, sensor streams) data formats
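The data-profiling and descriptive-statistics skills above can be sketched in a few lines of dependency-free Python (field names and rows are invented; pandas' `describe()` is the usual production tool):

```python
import statistics

# Minimal per-column profile: null counts plus summary statistics
# for a numeric field. The rows and "price" column are fabricated.

def profile(rows, column):
    values = [r[column] for r in rows if r.get(column) is not None]
    return {
        "count": len(values),
        "missing": len(rows) - len(values),
        "mean": statistics.mean(values),
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }

rows = [{"price": 10.0}, {"price": 14.0}, {"price": None}]
print(profile(rows, "price"))
```

Running a profile like this per column is a quick way to communicate the "shape and health" of an unfamiliar dataset before any modeling starts.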

 

Technical Skills Required

 

·      Proficiency in Python (pandas, NumPy, scikit-learn, PyTorch or TensorFlow) and/or R

·      Strong SQL skills with hands-on experience in DB2 and SQL Server

·      Experience with Databricks for large-scale data processing, feature engineering, and model training

·      Familiarity with cloud platforms: Azure or AWS

·      Experience with data warehouses and big data platforms (Databricks, Snowflake, or Redshift)

·      Knowledge of MLOps tools such as MLflow, Kubeflow, or Airflow

·      Experience with streaming data technologies such as Kafka or Spark

·      Solid foundation in probability, statistics, linear algebra, and experimental design

 

Nice to Have

 

·      Experience with deep learning, NLP, computer vision, or Bayesian methods

·      Familiarity with real-time or streaming data pipelines

·      Open-source contributions or published research

Read more
Global MNC serving 40+ Fortune 500 Companies

Global MNC serving 40+ Fortune 500 Companies

Agency job
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹26L / yr
Generative AI
Retrieval Augmented Generation (RAG)
skill iconMachine Learning (ML)
LangGraph
langchain
+11 more

Want to work on exciting GenAI projects for Fortune 500 companies across multiple sectors? Then read on..


About Company:

CSG is a multi-national company having a presence in 20 countries with 1600+ Engineers. Company works with more than 40 Fortune 500 customers such as Sony, Samsung, ABB, Thyssenkrup, Toyota, Mitsubishi and many more.


Job Description:

We are looking for a talented Generative AI Developer to join our dynamic AI/ML team. This position offers an exciting opportunity to leverage cutting-edge Generative AI (GenAI) technologies to drive innovation and solve real-world problems. You will be responsible for developing and optimizing GenAI-based applications, implementing advanced techniques like Retrieval-Augmented Generation (RAG), Retrieval Interleaved Generation (RIG), agentic frameworks, and vector databases. This is a collaborative role where you will work directly with customers and cross-functional teams to design, implement, and optimize AI-driven solutions. Exposure to cloud-native AI platforms such as Amazon Bedrock and Microsoft Azure OpenAI is highly desirable.


Key Responsibilities

Generative AI Application Development:

Design, develop, and deploy GenAI-driven applications to address complex industrial challenges.

Implement Retrieval-Augmented Generation (RAG) and Agentic frameworks


Data Management & Optimization:

Design and optimize document chunking strategies tailored to specific datasets and use cases.

Build, manage, and optimize data embeddings for high-performance similarity searches across vector databases.


Collaboration & Integration:

Work closely with data engineers and scientists to integrate AI solutions into existing pipelines.

Collaborate with cross-functional teams to ensure seamless AI implementation.


Cloud & AI Platform Utilization:

Explore and implement best practices for utilizing cloud-native AI platforms, such as Amazon Bedrock and Azure OpenAI, to enhance solution delivery.

Continuous Learning & Innovation:

Stay updated with the latest trends and emerging technologies in the GenAI and AI/ML fields, ensuring our solutions remain cutting-edge.


Requirements:

The ideal candidate will have strong experience in Generative AI technologies, particularly in the areas of RAG, document chunking, and vector database management. They will be able to quickly adapt to evolving AI frameworks and leverage cloud-native platforms to create efficient, scalable solutions. You will be working in a fast-paced and collaborative environment, where innovation and the ability to learn and grow are key to success.

- 3 to 5 years of overall experience in software development, with 3 years focused on AI/ML.

- Minimum 2 years of experience specifically working with Generative AI (GenAI) technologies.

- Python, PySpark and SQL knowledge is necessary for tasks

- Proven ability to work in a collaborative, fast-paced, and innovative environment.


Technical Skills:

- Generative AI Frameworks & Technologies:

- Expertise in Generative AI frameworks, including prompt engineering, fine-tuning, and few-shot learning.

- Familiarity with frameworks such as T5 (Text-to-Text Transfer Transformer), LangChain, LangGraph, and open-source stacks such as Ollama, Mistral, and DeepSeek.

- Strong knowledge of Retrieval-Augmented Generation (RAG) for combining LLMs with external data retrieval systems.


Data Management:

- Experience in designing chunking strategies for different datasets.

- Expertise in data embedding techniques and experience with vector databases like Pinecone, ChromaDB etc

- Programming & AI/ML Libraries:

- Strong programming skills in Python.

- Experience with AI/ML libraries such as TensorFlow, PyTorch, and Hugging Face Transformers.


Cloud Platforms & Integration:

- Familiarity with cloud services for AI/ML workloads (AWS, Azure).

- Experience with API integration for AI services and building scalable applications.

- Certifications (Optional but Desirable):

- Certification in AI/ML (e.g., TensorFlow, AWS Certified Machine Learning Specialty).

- Certification or coursework in Generative AI or related technologies.

Read more
Thingularity

Thingularity

Agency job
via Thomasmount Consulting by Shirin Shahana
Bengaluru (Bangalore)
4 - 8 yrs
₹18L - ₹20L / yr
skill iconPython
SQL
ETL

Job Summary

We are seeking a skilled Data Engineer with 4+ years of experience in building scalable data pipelines and working with modern data platforms. The ideal candidate should have strong expertise in Python, SQL, and cloud-based data solutions, with hands-on experience in ETL/ELT processes and data warehousing.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Python
  • Develop and optimize ETL/ELT workflows for data ingestion and transformation
  • Work with structured and unstructured data from multiple sources
  • Build and manage data warehouses/data lakes
  • Perform data validation, cleansing, and quality checks
  • Optimize SQL queries and improve data processing performance
  • Collaborate with data analysts, data scientists, and business teams
  • Implement data governance, security, and best practices
  • Monitor pipelines and troubleshoot production issues

Required Skills

  • Strong programming experience in Python (Pandas, NumPy, PySpark preferred)
  • Excellent SQL skills (joins, window functions, performance tuning)
  • Experience with ETL tools like Informatica, Talend, or DBT
  • Hands-on experience with cloud platforms (Azure / AWS / GCP)
  • Experience in data warehousing solutions like Snowflake, Redshift, BigQuery
  • Knowledge of workflow orchestration tools like Apache Airflow
  • Familiarity with version control tools like Git
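A core ELT property implied by the pipeline skills above is idempotency: re-running a batch must not duplicate rows. One common way to get it is an upsert keyed on a natural id, sketched here with SQLite's `ON CONFLICT` (the `customers` table and batch data are invented; Snowflake/BigQuery would use `MERGE`):

```python
import sqlite3

# Idempotent load sketch: the same batch applied twice leaves the
# target table unchanged. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
batch = [(1, "Asha"), (2, "Ravi")]
for _ in range(2):  # simulate an accidental re-run of the same batch
    conn.executemany(
        "INSERT INTO customers VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        batch,
    )
print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])
```

Orchestrators like Airflow retry failed tasks, so every load step they schedule needs this re-run safety.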

Preferred Skills

  • Experience with Big Data technologies (Spark, Hadoop)
  • Knowledge of streaming tools like Kafka
  • Exposure to CI/CD pipelines and DevOps practices
  • Experience in data modeling (Star/Snowflake schema)
  • Understanding of APIs and data integration


Read more
Mumbai, thane, Navi Mumbai
3 - 10 yrs
₹1L - ₹8L / yr
PLC
PLC Scada
SCADA
HMI
Pharmaceutics
+5 more

Engineer – Senior Level

Senior: Ghatkopar

Department: Automation / Programming


About the Opportunity

We are hiring Automation Engineers to work on end-to-end industrial automation projects in the pharma and food processing industries, involving PLC, HMI, and SCADA systems from design to commissioning.

Qualification

Degree or Diploma in:

 Mechanical Engineering

 Electronics Engineering

 Instrumentation Engineering

 Electrical Engineering

Required Skills & Competencies

 Hands-on experience in PLC, HMI, and SCADA programming

 Knowledge of industrial automation in pharma/process industries

 Basic understanding of electrical & instrumentation wiring

 Ability to read and interpret technical drawings and schematics

 Experience in programming languages such as .NET, VB/VB.Net, SQL/T-SQL (preferred)

 Familiarity with AutoCAD Electrical, EPLAN, or similar tools (added advantage)

 Strong problem-solving and analytical skills

 Good communication and interpersonal skills

 Ability to work independently and within a team

 Flexible to travel and work extended hours when required

Key Responsibilities

 Program, test, and commission industrial control systems

 Select appropriate PLC, HMI, and SCADA systems based on customer URS

 Develop I/O lists as per P&ID and project requirements

 Design and implement control logic for automation projects

 Manage project timelines and ensure timely execution

 Coordinate with project managers on scope changes and updates

 Support FAT (Factory Acceptance Testing) and commissioning activities

 Interpret electrical schematics, wiring diagrams, and P&ID drawings

 Assist in troubleshooting electrical and instrumentation systems

 Ensure smooth project execution through effective coordination

Read more
IT Path Solutions
IT Path HR
Posted by IT Path HR
Ahmedabad
2 - 3 yrs
₹4L - ₹5L / yr
skill iconPython
Large Language Models (LLM)
skill iconDjango
skill iconFlask
FastAPI
+11 more

Required Skills:

  • Strong proficiency in Python
  • Experience with Django, Flask, or FastAPI
  • Solid understanding of REST APIs and backend architecture
  • Hands-on experience integrating LLM APIs (e.g., OpenAI, Anthropic) into applications
  • Familiarity with AI/LLM frameworks such as LangChain or LlamaIndex
  • Understanding of Retrieval-Augmented Generation (RAG), embeddings, and semantic search concepts
  • Experience or exposure to vector databases like Pinecone, Weaviate, or FAISS
  • Experience with databases (MySQL, PostgreSQL, MongoDB)
  • Familiarity with Git and version control workflows
  • Understanding of asynchronous programming and performance optimization
  • Basic knowledge of cloud platforms (AWS, GCP, or Azure)
  • Strong problem-solving and analytical skills


Read more
Source One
Deepali Khandelwal
Posted by Deepali Khandelwal
Pune
0 - 1 yrs
₹25000 - ₹35000 / mo
Python
Java
Jasmine (Javascript Testing Framework)
SQL
Agile testing
+4 more

About the Role

We are looking for curious, technically strong, and product-minded Product Engineer Interns to join our team. This internship offers a unique opportunity to work at the intersection of product thinking and software development, giving you hands-on exposure to the full product lifecycle.

As a Product Engineer Intern, you will collaborate with product and engineering teams to understand customer needs, contribute to feature development, and help deliver impactful product solutions. This role is ideal for students or recent graduates who want to build both technical expertise and product understanding in a fast-paced environment.


Key Responsibilities

Product Understanding (Why & What)

  • Assist in conducting customer and market research to understand user pain points and industry trends
  • Support in translating business needs into user stories and functional requirements
  • Help maintain product documentation and feature requirements
  • Assist in tracking product performance metrics and gathering feedback for improvements
  • Participate in brainstorming sessions for product enhancements

Software Development (How)

  • Support development of product features across web, backend, or internal tools
  • Write clean, maintainable, and efficient code under guidance from senior engineers
  • Participate in testing, debugging, and resolving technical issues
  • Contribute to code reviews and technical discussions
  • Help monitor product performance and support issue resolution


Qualifications & Skills

Required

  • Pursuing or recently completed a Bachelor’s degree in Computer Science, IT, Software Engineering, or related field
  • Strong understanding of programming fundamentals, data structures, and algorithms
  • Knowledge of at least one programming language such as Python, Java, JavaScript, or Go
  • Strong problem-solving and analytical thinking skills
  • Good verbal and written communication skills
  • Eagerness to learn in a fast-paced environment
  • Interest in building products that solve real customer problems

Preferred

  • Familiarity with Git/version control
  • Basic understanding of SQL/NoSQL databases
  • Exposure to cloud platforms like Amazon Web Services, Microsoft Azure, or Google Cloud
  • Understanding of Agile/Scrum methodology
  • Personal, academic, or internship projects demonstrating product thinking


Why Join Us?

  • Hands-on Learning: Work on real product features from day one
  • Mentorship: Learn directly from experienced product and engineering leaders
  • Growth: Build skills in both product management and software engineering
  • Impact: Contribute to solutions that directly improve customer experience
  • Collaborative Culture: Work in a learning-focused, innovative environment


Read more
IDEA ELAN

at IDEA ELAN

1 recruiter
RaginiNaidu Kamineni
Posted by RaginiNaidu Kamineni
Remote only
4.5 - 7.5 yrs
₹15L - ₹20L / yr
ASP.NET
SQL
NOSQL Databases
API
Team Management
+2 more

Backend Developer (4.5 – 7.5 Years Experience)


Company Description:

Idea Elan LLC is a product-based company that provides comprehensive software solutions for research facilities in universities and institutions worldwide.

Please visit www.IdeaElan.com for more information.


Key Responsibilities:

● Design and develop high-performance, scalable, and secure backend APIs and services using .NET Core.

● Work with relational (MS-SQL) and NoSQL (Cosmos DB, MongoDB) databases to create optimized data models and ensure data consistency and performance.

● Participate in code reviews and provide constructive feedback.

● Collaborate with front-end developers and other teams to deliver high-quality software.

● Write clean, maintainable, and efficient code while ensuring quality standards.

● Troubleshoot and debug complex issues, optimizing code for maximum performance and scalability.

● Stay updated with the latest trends in backend development and cloud technologies to drive innovation.

● Optimize database performance and ensure data integrity.


Required Experience:

● 4.5 -7.5 years of experience in backend development.

● Strong experience with .NET Core and building RESTful APIs.

● Proficient with MS-SQL and experience working with NoSQL databases like Cosmos DB and MongoDB.

● Hands-on experience with Azure Cloud services (e.g., Azure Functions, Azure Storage, API Management, Azure SQL Database, etc.).

● Understanding of software development principles such as object-oriented programming (OOP), design patterns, and SOLID principles.

● Experience with version control systems such as Git.

● Strong knowledge of asynchronous programming, microservices architecture, and cloud-native application design.

● Familiarity with CI/CD pipelines, containerization (Docker), and deployment automation is a plus.

● Excellent problem-solving and debugging skills.

● Ability to work in an Agile development environment and collaborate with cross-functional teams.

● Good communication and collaboration skills.

Read more
Remote only
3 - 15 yrs
₹8L - ₹12L / yr
FastAPI
skill iconPython
RESTful APIs
SQL
NOSQL Databases
+5 more

Summary:

We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.

Job Type:

Full-Time Contractor (12 months)

Location:

Remote

Experience:

3+ years in backend development

Key Responsibilities:

  • Design, develop, and maintain robust backend services using Python and FastAPI.
  •  Implement and manage Prisma ORM for database operations.
  • Build scalable APIs and integrate with SQL databases and third-party services.
  • Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
  • Collaborate with front-end developers and other team members to deliver high-quality web applications.
  • Ensure application performance, security, and reliability.
  • Participate in code reviews, testing, and deployment processes.

Required Skills:

  • Expertise in Python backend development with strong experience in FastAPI.
  • Solid understanding of RESTful API design and implementation.
  • Proficiency in SQL databases and ORM tools (preferably Prisma)
  • Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
  • Familiarity with CI/CD pipelines and containerization (Docker).
  • Knowledge of cloud architecture best practices.
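FastAPI's approach to the RESTful API design listed above is built on typed request models that reject bad input before handler logic runs. A dependency-free sketch of the same idea with a plain dataclass (the `CreateUser` fields and `handle_create` helper are invented for illustration):

```python
from dataclasses import dataclass

# Hypothetical create-user endpoint body: a typed model validates the
# payload, mirroring what FastAPI/Pydantic do with a 422 response.

@dataclass
class CreateUser:
    email: str
    age: int

    def __post_init__(self):
        if "@" not in self.email:
            raise ValueError("invalid email")
        if self.age < 0:
            raise ValueError("age must be non-negative")

def handle_create(payload):
    user = CreateUser(**payload)  # raises on malformed input
    return {"status": 201, "email": user.email}

print(handle_create({"email": "a@b.com", "age": 30}))
```

In FastAPI proper, the model would be a Pydantic class declared as the route's parameter, and the framework would translate the validation error into an HTTP response automatically.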

Added Advantage:

  • Front-end development knowledge (React, Angular, or similar frameworks).
  • Exposure to AWS/GCP cloud platforms.
  • Experience with NoSQL databases.

Eligibility:

  • Minimum 3 years of professional experience in backend development.
  • Available for full-time engagement.
  • Please do not apply if you are currently engaged in other projects; we require dedicated availability.
Read more
ONEPOS RETAIL SOLUTIONS PVT LTD
Durga Prasad C
Posted by Durga Prasad C
Bengaluru (Bangalore), Tirupati
2 - 5 yrs
₹3L - ₹6L / yr
Model-View-View-Model (MVVM)
skill iconKotlin
skill iconAndroid Development
jetpack
Database migration
+5 more

About the Role

We are looking for a passionate Native Android Developer with strong expertise in Kotlin and modern Android development. The ideal candidate must have hands-on experience with Coroutines & Flow, along with Jetpack components, XML UI, and Room Database.

What You’ll Do

  • Develop and maintain Android applications using Kotlin
  • Build responsive UI using XML layouts
  • Implement modern architecture using Jetpack components (ViewModel, StateFlow/Flow, Navigation)
  • Design and manage local databases using Room
  • Build reactive and scalable apps using Coroutines & Flow
  • Work on offline-first architecture and sync strategies
  • Integrate REST APIs and collaborate with backend teams
  • Optimize app performance for low-memory devices
  • Debug, test, and improve application stability

Must-Have Skills

  • Strong experience in Kotlin & Android SDK
  • Mandatory: Coroutines & Flow (StateFlow / SharedFlow)
  • Hands-on experience with:
  • Jetpack (ViewModel, Navigation, WorkManager)
  • XML UI Design
  • Room Database (Entity, DAO, Migration)
  • Solid understanding of MVVM architecture
  • Experience with REST APIs & Git

Good to Have

  • Experience with Jetpack Compose
  • Knowledge of Firebase (Crashlytics, Analytics)
  • Experience in POS / enterprise applications

Qualification

  • Any Graduate.
  • 2- 4 years of experience in Android development.


Regards,

Durga Prasad

Read more
Vivanet

at Vivanet

1 candid answer
Ashish Uikey
Posted by Ashish Uikey
Remote only
8 - 12 yrs
Best in industry
RBAC
Microsoft Windows Azure
Integration
CI/CD
skill iconGitHub
+5 more

Job Title: Snowflake Platform Administrator

Duration: 6-12 months contract (could be extended upon performance)

Mode: Remote


About the Role

We are looking for a Snowflake Administrator to join our Snowflake Center of Excellence (COE) to manage, secure, and optimize the enterprise Snowflake data platform. The role will focus on platform administration, security governance, and automation while enabling data engineering, analytics, and business teams to effectively leverage Snowflake capabilities.


Key Responsibilities

• Administer and maintain the Snowflake platform, including warehouses, databases, schemas, users, roles, and resource monitors.

• Implement and manage Snowflake security and access governance including RBAC, network policies, and network rules.

• Manage identity and access integration with Azure Active Directory (Azure AD), including role mapping with Azure AD groups.

• Monitor platform performance, usage, and cost to ensure efficient and reliable operations.

• Manage key Snowflake capabilities including data sharing (consumer and provider), cloning, data recovery, integrations (storage/API/notification), and performance optimization.

• Develop automation scripts using SQL and Python for administrative and operational tasks.

• Create and maintain CI/CD workflows using GitHub Actions for Snowflake deployments.

• Collaborate with data engineers, analysts, and architects to ensure secure and scalable data platform usage.

• Stay up to date with Snowflake product releases, new features, and platform best practices, and proactively evaluate their applicability to the organization.

• Contribute to standards, best practices, and governance frameworks within the Snowflake COE.

General Business

•Explore opportunities to leverage AI to improve platform automation and productivity.
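Much of the administration listed above is scripted rather than clicked through in the UI. As a hedged sketch (all object names here are invented, not taken from the posting), a small Python helper can generate the GRANT statements for a read-only functional role; in practice these would be executed via snowflake-connector-python or a GitHub Actions deployment step:

```python
# Sketch: generate Snowflake GRANT statements for a read-only functional
# role. All object names are illustrative; real deployments would run these
# via snowflake-connector-python or a CI/CD pipeline.

def rbac_grants(db, schema, warehouse, role):
    """Return GRANT statements giving `role` read access to one schema."""
    fq_schema = f"{db}.{schema}"
    return [
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON DATABASE {db} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
    ]

for stmt in rbac_grants("ANALYTICS", "REPORTING", "WH_REPORTING", "RPT_READER"):
    print(stmt)
```

The statement set follows the usual Snowflake pattern: USAGE on the warehouse, database, and schema, plus SELECT on current and future tables so newly created objects inherit the access.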


Required Experience & Skills:

•5–8 years of relevant experience in Snowflake administration and platform management.

•Solid understanding of Snowflake architecture, security, features, and performance optimization.

•Experience implementing RBAC, Network Policies, and Network Rules in Snowflake.

•Experience with Snowflake integration with Azure AD for role and access management via AD groups.

•Proficiency in SQL and Python scripting.

•Experience with GitHub and GitHub Actions/Workflow creation.

•Strong analytical and problem-solving skills.

•Functional Domain: FMCH (Fast Moving Consumer Health).


Preferred Additional Skills:

•AI enthusiast and Automation expertise

•Understanding of modern data architectures including data lakes and real-time processing

•Familiarity with BI tools such as Power BI, Tableau, Looker


Education & Languages

•Bachelor’s degree in Computer Science, Information Technology, or a similar quantitative field of study.

•Fluent in English.

•Ability to function effectively within teams of varied cultural backgrounds and areas of expertise.

ZakApps software pvt ltd
SindhuPriyaa Arun
Posted by SindhuPriyaa Arun
Chennai
4 - 8 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Microservices
Infrastructure management
Kubernetes
+1 more
  • Bachelor’s degree in Computer Science, Web Development, or a related field (or equivalent practical experience).
  • Minimum of 4 to 8 years of professional experience in Java development.

  • Strong proficiency in Java and object-oriented programming.
  • Minimum of 4 years of experience in building microservices with Spring Boot.
  • Solid understanding of RESTful APIs and experience with API design and integration.
  • Strong problem-solving skills and the ability to think critically.
NeoGenCode Technologies Pvt Ltd
Gurugram, Vadodara
4 - 10 yrs
₹6L - ₹16L / yr
NodeJS (Node.js)
Python
React.js
NextJs (Next.js)
RESTful APIs
+10 more

Job Title : Full Stack Developer (Crypto Exchange)

Experience : 4+ Years

Location : Gurugram & Vadodara (On-site)


Role Overview :

We are looking for a Full Stack Developer with strong expertise in both backend and frontend development, along with exposure to crypto exchange systems or fintech platforms.

In this role, you will work on building high-performance, real-time trading applications, contributing to core systems like order execution, pricing engines, and wallet integrations.


Key Responsibilities :

  • Design, develop, and maintain scalable backend services and APIs.
  • Build and optimize responsive frontend applications for trading interfaces.
  • Work on real-time systems such as order books, pricing engines, and trade execution.
  • Integrate with blockchain networks, wallets, and third-party APIs.
  • Ensure platform security, performance, and reliability.
  • Collaborate with product, design, and DevOps teams for end-to-end delivery.
  • Participate in system design, architecture discussions, and code reviews.
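To make the order-book responsibility above concrete, here is a heavily simplified price-time priority matching sketch in Python. It is an illustration only (real matching engines add cancellation, fees, fill reporting, and persistence), and all names are invented:

```python
import heapq

# Heavily simplified price-time priority matching for a limit order book.
# Asks are a min-heap keyed on (price, seq); bids use negated prices so the
# highest bid surfaces first. Illustrative only, not production code.

class OrderBook:
    def __init__(self):
        self.asks = []      # (price, seq, qty)
        self.bids = []      # (-price, seq, qty)
        self.seq = 0        # arrival order, for time priority
        self.trades = []    # (price, qty) fills, in execution order

    def submit(self, side, price, qty):
        self.seq += 1
        if side == "buy":
            # Cross against the cheapest asks at or below our limit price.
            while qty and self.asks and self.asks[0][0] <= price:
                ask_price, s, ask_qty = heapq.heappop(self.asks)
                fill = min(qty, ask_qty)
                self.trades.append((ask_price, fill))
                qty -= fill
                if ask_qty > fill:  # partially filled ask stays on the book
                    heapq.heappush(self.asks, (ask_price, s, ask_qty - fill))
            if qty:
                heapq.heappush(self.bids, (-price, self.seq, qty))
        else:
            # Cross against the highest bids at or above our limit price.
            while qty and self.bids and -self.bids[0][0] >= price:
                neg_price, s, bid_qty = heapq.heappop(self.bids)
                fill = min(qty, bid_qty)
                self.trades.append((-neg_price, fill))
                qty -= fill
                if bid_qty > fill:
                    heapq.heappush(self.bids, (neg_price, s, bid_qty - fill))
            if qty:
                heapq.heappush(self.asks, (price, self.seq, qty))

book = OrderBook()
book.submit("sell", 100.0, 5)
book.submit("buy", 101.0, 3)   # crosses the resting ask: 3 units trade at 100.0
```

Trades execute at the resting order's price, and a partially filled order returns to the book with its original sequence number so time priority is preserved.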


Required Skills & Qualifications :

  • 4+ years of experience in Full Stack Development.
  • Strong expertise in :
  • Backend : Node.js and/or Python
  • Frontend : React.js and/or Next.js
  • Experience with REST APIs and microservices architecture.
  • Strong understanding of databases (MongoDB, PostgreSQL, MySQL, etc.).
  • Hands-on experience with Docker and cloud platforms (AWS preferred).
  • Solid understanding of system design, scalability, and performance optimization.


Preferred (Good to Have) :

  • Experience working with a crypto exchange or trading platform.
  • Understanding of blockchain fundamentals (Ethereum, Bitcoin, etc.).
  • Experience with wallet integrations and on-chain transactions.
  • Familiarity with WebSockets and real-time data streaming.
  • Knowledge of security best practices in fintech/crypto systems.

Why Join Us ?

  • Opportunity to work on a high-impact, real-world crypto exchange.
  • Build and scale systems from early-stage to production.
  • Work in a fast-paced, ownership-driven environment.
  • Exposure to cutting-edge blockchain and trading technologies.
Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dharati Thakkar
Pune
5 - 10 yrs
₹21L - ₹30L / yr
Python
Machine Learning (ML)
Generative AI (GenAI)
SQL
Deep Learning
+11 more

JOB DETAILS:

- Job Title: Lead I - Data Science - Python, Machine Learning, Spark 

- Industry: Global Digital Transformation Solutions Provider

- Experience: 5-10 years

- Job Location: Pune

- CTC Range: Best in Industry

 

JD for Data Scientist

Hands-on experience with data analysis tools:

Proficient in using tools such as Python and R for data manipulation, querying, and analysis.

Skilled in utilizing libraries like Pandas, NumPy, and Scikit-Learn to perform in-depth data analysis and modeling.

 

Skilled in machine learning and predictive analytics:

Expertise in building, training, and deploying machine learning models using frameworks such as TensorFlow and PyTorch.

Capable of performing tasks like regression, classification, clustering, and recommendation, leading to data-driven predictions and insights.
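As a minimal illustration of the regression work described above, one-variable ordinary least squares can be written out by hand in Python; an actual project would use Scikit-Learn, TensorFlow, or PyTorch rather than this sketch:

```python
# One-variable ordinary least squares, the simplest instance of the
# regression tasks described above. Pure Python, for illustration only.

def fit_ols(xs, ys):
    """Return (slope, intercept) minimising squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is covariance(x, y) divided by variance(x).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

slope, intercept = fit_ols([1, 2, 3, 4], [3, 5, 7, 9])   # data follows y = 2x + 1
print(slope, intercept)   # prints 2.0 1.0
```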

 

Expertise in big data technologies:

Proficient in handling large datasets using big data tools such as Spark.

Skilled in employing distributed computing and parallel processing techniques to ensure efficient data processing, storage, and analysis, enabling enterprise-level solutions and informed decision-making.

 

Skills: Python, SQL, Machine Learning, and Deep Learning, with mandatory expertise in Generative AI.

 

Must-Haves

5–9 years of relevant experience in Python, SQL, Machine Learning, and Deep Learning, with mandatory expertise in Generative AI

 

Notice Period: Immediate joiners only

Location: Pune

Bell Techlogix
Pemmraju VenkatVandita
Posted by Pemmraju VenkatVandita
Hyderabad
5 - 10 yrs
₹15L - ₹20L / yr
Generative AI
Microsoft Windows Azure
Python
SQL
Windows Azure
+1 more

The AI Data Engineer will be responsible for designing, building, and operating scalable data pipelines and curated data assets that power machine learning, generative AI, and intelligent automation solutions in an SLA-driven managed services environment. This role focuses on data ingestion, transformation, governance, and operational reliability across cloud and hybrid environments, enabling use cases such as knowledge retrieval (RAG), conversational AI, predictive analytics, and AI-assisted service management. The ideal candidate combines strong data engineering fundamentals with an understanding of AI workload requirements, including quality, lineage, privacy, and performance.

 

Key Responsibilities 

•Design, build, and operate production-grade data pipelines that support AI/ML and generative AI workloads in managed services environments 

•Develop curated, analytics-ready datasets and data products to enable model training, grounding, feature generation, and AI search/retrieval 

•Implement data ingestion patterns for structured and unstructured sources (APIs, databases, files, event streams, documents) 

•Build and maintain transformation workflows with strong testing and validation 

•Enable Retrieval-Augmented Generation (RAG) by preparing document corpora, chunking strategies, metadata enrichment, and vector indexing patterns 

•Integrate data pipelines with application services 

•Support ITSM and enterprise workflow data needs, including ServiceNow data integration, CMDB/incident data quality improvements, and automation enablement 

•Implement observability for data pipelines (monitoring, alerting, SLAs/SLOs) and perform root cause analysis for pipeline failures or data quality incidents 

•Apply data governance and security best practices 

•Collaborate with ML Engineers, DevOps/SRE, and solution architects to operationalize end-to-end AI solutions 

•Contribute to reusable patterns, templates, and standards within the Bell Techlogix AI Center of Excellence 
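The RAG enablement work above usually begins with something as simple as splitting documents into overlapping chunks with offset metadata. A hedged sketch, with invented chunk-size parameters (production pipelines tune these and typically split on sentence or token boundaries rather than raw characters):

```python
# Fixed-size chunking with overlap for RAG document preparation. The size
# and overlap values are illustrative assumptions, not a recommendation.

def chunk_text(text, size=500, overlap=50):
    """Split text into overlapping chunks, each carrying offset metadata."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks, start = [], 0
    while start < len(text):
        end = min(start + size, len(text))
        chunks.append({"start": start, "end": end, "text": text[start:end]})
        if end == len(text):
            break
        start = end - overlap   # step back so adjacent chunks share context
    return chunks

parts = chunk_text("x" * 1200, size=500, overlap=50)
print(len(parts))   # prints 3: spans 0-500, 450-950, 900-1200
```

The offset metadata travels with each chunk into the vector index, so retrieval results can be traced back to their exact location in the source document.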

 

Required Qualifications 

•Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent practical experience 

•5+ years of experience in data engineering, analytics engineering, or platform data operations 

•Strong proficiency in SQL and Python; experience with data modeling and dimensional concepts 

•Hands-on experience with Azure data services (e.g., Data Factory, Synapse, Databricks, Storage, Key Vault) or equivalent cloud tooling 

•Experience building reliable pipelines with scheduling, dependency management, and automated testing/validation 

•Experience supporting production data platforms with incident management, troubleshooting, and root cause analysis 

•Understanding of data security, privacy, and governance principles in enterprise environments 

 

Preferred Qualifications 

•Experience enabling AI/ML workloads: feature engineering, training data preparation, and integration with Azure Machine Learning 

•Experience with unstructured data processing for generative AI 

•Familiarity with vector databases or vector search and RAG patterns 

•Experience with event streaming and messaging 

•Familiarity with ServiceNow data model and integration patterns (Table API, export, CMDB/ITSM reporting) 

•Relevant certifications (Microsoft Azure Data Engineer, Azure AI Engineer, Databricks) 

Remote only
4 - 8 yrs
Best in industry
PHP
JavaScript
Artificial Intelligence (AI)
Architecture
SQL

Team: Support Operations — Technical Solutions

Level: IC3 (4–7 years of relevant experience)

Location: India (Remote), IST time zone, with overlap with US East/Central teams

Reports To: Technical Manager

Manages: Not a people-manager role, but a lead role with real technical authority

Employment Type: Full-time

 

ABOUT DELTEK 

Deltek is the leading global provider of software and solutions for project-based businesses — serving government contractors, professional services firms, and architecture & engineering companies. Our products help customers manage the full project lifecycle, from winning work and planning resources to executing delivery and getting paid. 

The Support Operations Technical Solutions team sits inside Deltek's Customer Success organization. We build and maintain the internal tooling, integrations, and AI-powered workflows that allow Deltek's support and customer success teams to operate at scale — intelligent case routing, knowledge-base agents, data pipelines between Salesforce, Gainsight, and Oracle Service Cloud, and automation that removes manual work from high-volume support processes. 

 

THE ROLE 

We are looking for a Senior System Engineer to take technical ownership of our most complex solutions. This is not a management role — it is a senior individual contributor role with real architectural authority and a multiplier effect on the team around you. 

You own problems end-to-end. You design the solution before writing the first line, consider downstream impacts before committing to an approach, and hold the technical bar for the work your team delivers. You are the person a junior engineer turns to when they're stuck, and the person a business stakeholder trusts to tell them whether an idea is feasible and what it will cost to maintain. 

In your first year, you can expect to: 

  • Own the end-to-end design and delivery of major integrations and AI-enabled components from architecture through deployment and post-launch stability 
  • Lead solution design for the team's most complex problems using PHP, JavaScript, Workato, APIs, and Web Services 
  • Evaluate technology and platform tradeoffs and make defensible, documented recommendations that balance short-term delivery with long-term maintainability 
  • Apply AI, automation, and agentic architectures to business problems at production scale — not as experiments, but as shipped systems 
  • Anticipate performance, operational, and security risks before they reach production; design with those constraints in mind from day one 
  • Set engineering standards and review the work of IC1/IC2 engineers, making them better through structured feedback and clear design expectations 
  • Partner directly with CS operations leadership and cross-functional stakeholders to translate ambiguous business needs into concrete technical strategies 

 

This role suits an engineer who is past proving they can build things, and is now focused on building the right things in the right way — and helping others do the same. 

 

WHAT WE'RE LOOKING FOR 

Must-Have Technical Skills 

  • PHP and JavaScript: Production depth: You have designed and shipped non-trivial systems in these languages. You understand performance characteristics, know where the footguns are, and write code you'd be comfortable having reviewed by a senior peer. 
  • Integration architecture: You have designed system-to-system integrations — not just consumed APIs. You understand data flow, transformation logic, error handling, retry strategies, and idempotency. 
  • AI / LLM applied experience: You have built or led the build of AI-assisted workflows, LLM-based tools, or agentic systems in an operational or product context. You know the difference between a demo and a production-grade AI system. 
  • Relational databases: Query and schema design: You write optimized SQL, design schemas with long-term maintainability in mind, and understand when a query will cause production problems before it does. 
  • Full-stack troubleshooting at depth: You can diagnose complex, multi-layer issues — across front-end, API, back-end, and database — and trace the root cause without being handed a reproduction case. 
  • Technical tradeoff analysis: When evaluating tools, platforms, or approaches, you can articulate the tradeoffs clearly — not just pick what you know best — and document the rationale in a way that holds up six months later. 
  • Agile technical leadership: You have led technical workstreams in a sprint-based environment: broken down epics, written meaningful acceptance criteria, and been accountable for team delivery quality. 
  • Documentation and design artifacts: You produce architecture diagrams, solution designs, and technical decision records that others can act on — not just notes for yourself. 
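The integration-architecture expectations above (error handling, retry strategies, idempotency) can be sketched in a few lines of Python. Everything here is illustrative: the transport function, key scheme, and retry parameters are assumptions, not any actual Deltek implementation:

```python
import time

# Retry-with-backoff around an integration call, paired with an idempotency
# key so a retried request cannot double-apply. Transport, key scheme, and
# delays are all invented for illustration.

_processed = set()

def apply_once(key, payload):
    """Stand-in for the downstream system: a repeated key is a no-op."""
    if key in _processed:
        return "duplicate-ignored"
    _processed.add(key)
    return "applied"

def send_with_retry(key, payload, transport, attempts=3):
    delay = 0.01
    for attempt in range(1, attempts + 1):
        try:
            return transport(key, payload)
        except ConnectionError:
            if attempt == attempts:
                raise               # out of retries: surface the failure
            time.sleep(delay)
            delay *= 2              # exponential backoff

calls = {"n": 0}
def flaky(key, payload):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("transient failure")
    return apply_once(key, payload)

result = send_with_retry("case-123", {"status": "closed"}, flaky)
print(result)   # prints applied, after one transient failure and one retry
```

The idempotency key is what makes the retry safe: even if a response is lost and the same operation is resent, the downstream system applies it at most once.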

Must-Have Leadership & Soft Skills 

  • Technical mentorship: You actively make the engineers around you better. Code reviews are teaching opportunities, not gatekeeping. Design reviews are conversations, not approvals. 
  • Stakeholder communication: You can translate a technical constraint into a business impact, and a business requirement into a technical specification. You don't hide behind jargon or over-simplify to avoid hard conversations. 
  • Ownership under ambiguity: When a problem is poorly defined, you ask the right questions to define it — then own the answer. You don't wait for complete requirements before starting to think. 
  • Proactive risk management: You raise issues before they become incidents. You've learned from production failures and carry those lessons into design decisions. 
  • Business context awareness: You understand how the systems you build affect end users and business operations. You've made engineering decisions informed by that context, not just by technical preference. 

Nice-to-Have Skills 

Prioritized by relevance to this team's current and near-term roadmap: 

Oracle Service Cloud 

Workato / iPaaS 

Salesforce 

Gainsight 

Agentic AI / LLM Ops 

Snowflake 

Microsoft Power BI 

Microsoft Power Apps 

Cloud-native development 

 

Experience designing agentic AI systems — not just integrating LLM APIs — is highly relevant to where this team is going. Candidates who have shipped multi-step agent architectures with tool-calling, memory, and guardrails will stand out. 

 

RESPONSIBILITIES 

Design & Architecture 

  • Own end-to-end technical solution design — from requirements through architecture, implementation, and post-launch stability — for the team's most complex initiatives 
  • Lead solution design using PHP, JavaScript, Workato, APIs, and Web Services; ensure solutions are scalable, maintainable, and aligned with established governance standards 
  • Evaluate tradeoffs across tools, platforms, and architectural patterns; produce documented recommendations that account for both short-term delivery needs and long-term operational cost 
  • Anticipate downstream impacts, performance bottlenecks, and operational risk during the design phase — not as an afterthought in retrospect 
  • Author and maintain Architecture Decision Records (ADRs) and technical design documents for all major solution components 

AI, Automation & Integration 

  • Apply AI, automation, and agentic architectures to complex business problems at production scale — designing for reliability, observability, and graceful failure 
  • Lead the integration of AI-enabled components (LLM workflows, intelligent routing, agentic tools) into the team's operational platform 
  • Design and oversee integrations between Deltek's CS platforms (Oracle Service Cloud, Salesforce, Gainsight) and internal data systems, ensuring data integrity, performance, and auditability 
  • Evaluate new AI frameworks, LLM providers, and automation platforms; provide grounded, implementation-level recommendations rather than theoretical assessments 

Technical Leadership & Mentoring 

  • Serve as the primary technical reviewer for IC1/IC2 engineers — conducting structured code and design reviews that build capability, not just ship code 
  • Break down complex initiatives into well-scoped workstreams that junior engineers can execute with confidence and appropriate independence 
  • Establish and enforce engineering standards: code quality, documentation, testing coverage, deployment practices, and incident response 
  • Identify skill gaps in the team and work with the manager to address them through pairing, documentation, or structured learning 

Stakeholder & Cross-functional Engagement 

  • Translate ambiguous business and operational requirements from CS leadership into concrete technical strategies with clear milestones and measurable outcomes 
  • Engage directly with senior stakeholders — CS operations leads, product owners, IT — to align on priorities, surface risks, and manage technical expectations 
  • Represent the technical perspective of the team in cross-functional planning and architecture discussions 

Operate & Improve 

  • Own post-launch stability of solutions you design: monitor, respond to incidents, and drive root-cause resolution — not just resolution 
  • Drive continuous improvement of the team's delivery practices: identify process friction, propose solutions, and follow through on implementation 
  • Stay current on AI, automation, and integration technology evolution; bring relevant advances back to the team with a concrete point of view on applicability 

 

QUALIFICATIONS 

  • Education: Bachelor's degree in Computer Science, Electrical or Electronics Engineering, or a related technical discipline. Equivalent demonstrated experience considered. 
  • Experience: 4–7 years of hands-on experience in software engineering, systems integration, or closely related work, with at least 2 years at a level where you have owned technical design decisions — not just implemented them. 
  • Coding evidence: A portfolio, GitHub profile, architecture document, or production system you can speak to in depth. At IC3, we expect you to be able to walk through a non-trivial design decision you made and defend the tradeoffs. 
  • AI / ML: Practical, production-level experience with LLMs or AI tooling — not just prompt engineering or personal experimentation. Familiarity with frameworks such as LangChain, OpenAI APIs, or similar platforms is a strong plus. 
  • Collaboration model: Comfortable working as a technical authority in a distributed team. The role requires regular IST overlap with US East/Central stakeholders (approximately 6:30 PM – 10:30 PM IST for at least part of the week). 
  • Language: Strong written and spoken English. At IC3, much of your influence operates through written design documents, async reviews, and stakeholder communications. Precision in writing matters. 

 

 

WHAT TO EXPECT WORKING HERE 

  • Technical authority with real impact — your design decisions ship to production and affect how thousands of Deltek customers experience support 
  • Exposure to production AI/agentic systems and direct involvement in shaping where the team's AI roadmap goes next 
  • A team where senior engineers are trusted to lead, not managed step-by-step — you will have autonomy commensurate with your accountability 
  • Structured growth path: IC3 engineers who demonstrate architectural leadership and cross-functional influence have a clear track toward Staff or Associate Director scope 
  • Regular 1:1s, design review forums, and a manager who will invest in your growth rather than just your output 


Remote only
1 - 4 yrs
Best in industry
PHP
JavaScript
AI Coding Tools
Artificial Intelligence (AI)
Large Language Models (LLM) tuning
+1 more

 

Team: Support Operations — Technical Solutions

Level: IC2 (1–3 years of relevant experience)

Location: India (Remote) — IST time zone, with overlap with US East/Central teams

Reports To: Tech Manager

Employment Type: Full-time

 

ABOUT DELTEK 

Deltek is the leading global provider of software and solutions for project-based businesses, serving government contractors, professional services firms, and architecture & engineering companies. Our products help customers manage the full project lifecycle — from winning work and planning resources to executing delivery and getting paid. 

The Support Operations Technical Solutions team sits inside Deltek's Customer Success organization. We build and maintain the internal tooling, integrations, and AI-powered workflows that enable Deltek's support and customer success teams to operate at scale — think intelligent case routing, knowledge-base agents, data pipelines between Salesforce, Gainsight, and Oracle Service Cloud, and automation that removes manual work from high-volume support processes. 

THE ROLE 

We are looking for a System Engineer (IC2) to join our Technical Solutions team based in India. This is a hands-on engineering role: you will build, integrate, and support the systems that power our customer-facing and internal support operations.

In your first year, you can expect to: 

  • Build and maintain integrations between support platforms (Oracle Service Cloud, Salesforce, Gainsight) using PHP, JavaScript, and Workato 
  • Contribute to AI-assisted workflow automation — including LLM-based tools and intelligent routing solutions already in production 
  • Write and optimize SQL queries against our operational data stores to power dashboards, reports, and automated triggers 
  • Troubleshoot issues across the full stack (front-end, API layer, back-end logic, and database) and document root-cause findings 
  • Work in a sprint-based environment alongside engineers, CS operations leads, and product stakeholders across the US and India  

This role is well-suited for someone who is early in their career but already has real project or production experience. You will work with guidance from senior engineers while taking genuine ownership of defined workstreams. The expectation is not that you know everything on day one — it is that you are technically curious, structured in your thinking, and driven to ship things that work. 

 

WHAT WE'RE LOOKING FOR 

Must-Have Technical Skills 

  • PHP and JavaScript: Hands-on experience building or maintaining web applications, APIs, or internal tools. You have written code that went somewhere beyond your laptop. 
  • REST/SOAP APIs and Web Services: You understand how system-to-system data flows work and have built or consumed integrations in a real context. 
  • Relational databases and SQL: You can write optimized queries, understand joins and indexes, and are comfortable reading a schema you didn't design. 
  • Full-stack troubleshooting: When something breaks, you know how to methodically trace the issue across front-end, back-end, and database layers — not just escalate it. 
  • Documentation: You can translate what you built into clear written artifacts — requirements, workflow diagrams, solution designs — that a non-engineer can follow. 
  • Agile/sprint delivery: You have worked in a structured sprint environment and are comfortable with ceremonies, tickets, and incremental delivery. 
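The SQL expectation above (joins, indexes, reading a schema you did not design) can be illustrated with an in-memory SQLite session; the schema and data are invented for the example:

```python
import sqlite3

# A join plus a supporting index, the kind of SQL this role exercises daily.
# Schema and data are invented for illustration.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE cases (id INTEGER PRIMARY KEY, account_id INTEGER, status TEXT);
    -- Index supports the join and filter on cases.account_id.
    CREATE INDEX idx_cases_account ON cases(account_id);
    INSERT INTO accounts VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO cases VALUES (10, 1, 'open'), (11, 1, 'closed'), (12, 2, 'open');
""")
rows = conn.execute("""
    SELECT a.name, COUNT(*) AS open_cases
    FROM cases c
    JOIN accounts a ON a.id = c.account_id
    WHERE c.status = 'open'
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)   # prints [('Acme', 1), ('Globex', 1)]
```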

Must-Have Soft Skills 

  • Root-cause orientation: You don't patch symptoms and move on. You want to understand why something broke before deciding how to fix it. 
  • Self-driven with good judgment: You can manage your own time on a defined problem, identify when you're stuck and need input, and flag risks before they become blockers. 
  • Clear communicator across audiences: You can explain a technical problem to a non-technical stakeholder and a design decision to a senior engineer — in writing and in a call. 
  • Collaborative: You work well with people you've never met in person, across time zones, and with stakeholders who don't share your technical background. 

Nice-to-Have Skills 

The following are not required for the role, but candidates with depth in any of these areas will stand out. Listed in rough order of relevance to this team's current work: 

 

Oracle Service Cloud 

Workato / iPaaS 

Salesforce 

Gainsight 

AI / LLM integration 

Snowflake 

Microsoft Power BI 

Microsoft Power Apps 

Cloud-native development 

 

Experience with AI tools (GitHub Copilot, LLM APIs, automation agents) used in an operational or product context — not just personal experimentation — is a genuine plus for this team.

 

RESPONSIBILITIES 

At the IC2 level, you will primarily execute within defined frameworks and grow your independent scope over time. The following reflects what you will own and contribute to: 

 

Build & Integrate 

  • Build and maintain AI-enabled workflows, platform integrations, and internal tools using PHP, JavaScript, Workato, and Web Services 
  • Develop prototypes and proofs of concept; contribute to production deployments under senior guidance 
  • Implement and test integrations between Deltek's support platforms and internal data systems 

Analyse & Solve 

  • Break down defined problems into actionable tasks; identify risks, dependencies, and edge cases before they surface in production 
  • Troubleshoot complex issues across the full stack and document root cause findings clearly 
  • Investigate stakeholder-reported issues to identify whether the problem is technical, process-related, or both 

 

Operate & Improve 

  • Follow established governance, architecture, and deployment processes; raise improvement suggestions through proper channels 
  • Write and maintain documentation for systems, workflows, business rules, and solution designs 
  • Participate actively in sprint ceremonies; manage your own tasks and flag blockers early 
  • Demonstrate continuous learning in AI, automation, and integration technologies — this space moves fast and curiosity is part of the job 

 

QUALIFICATIONS 

  • Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related technical discipline. Equivalent practical experience considered. 
  • Experience: 1–3 years of hands-on experience in software engineering, systems integration, or a closely related field. Internship and co-op experience counts if it involved real production systems. 
  • Coding: Demonstrable PHP and/or JavaScript experience — a portfolio, GitHub profile, or code sample you can speak to will strengthen your application. 
  • Collaboration model: Comfortable working remotely with distributed teams. The role requires regular overlap with US East/Central time zones (approximately 6:30 PM – 10:30 PM IST for at least part of the week). 
  • Language: Strong written and spoken English is essential — much of the collaboration with stakeholders and senior engineers happens asynchronously in writing. 

 

WHAT TO EXPECT WORKING HERE 

  • A small, technically-focused team where your work is visible and your contributions are directly tied to outcomes customers feel 
  • Exposure to production AI/LLM systems, not just theoretical discussions about AI 
  • A culture that values root-cause thinking and good documentation over heroics and quick fixes 
  • Growth path: engineers who demonstrate technical depth and ownership at IC2 have a clear track toward IC3 (mid-level) scope within 18–24 months 
  • Regular 1:1s and structured feedback — this team invests in making you better, not just keeping you busy 


ACTOSOFT
priyanka sharma
Posted by priyanka sharma
Surat
0 - 1 yrs
₹1.2L - ₹2.5L / yr
DotNetNuke
SQL
ASP.NET MVC

Actosoft is a software development company that offers complete IT solutions.

We are a collective of focused, energetic, talented, and hardworking professionals who believe in getting things done at the highest level. Our team aims to innovate, be authentic and grow in everything that we do.

The ideal candidate should be familiar with the complete software design life cycle. They should have experience in designing, coding, testing, and consistently managing applications, and should be comfortable coding in multiple languages and testing their code to maintain high quality.

Job details:

 

Job Location: Actosoft, Gajera Rd, beside Avalon Business Hub, Katargam, Surat, Gujarat 395004

Experience: 0 to 1 year

Salary: ₹10,000 to ₹20,000 per month

Job Type: Full-time – Work from Office

Working Schedule:

 

9:00 am to 6:00 pm (Monday to Friday)

Alternate Saturdays Off.

Job Responsibilities:

 

●       Design, code, test, and manage various applications

●       Collaborate with the engineering team and product team to establish the best products

●       Follow outlined standards of quality related to coding and systems

 

●       Develop automated tests and conduct performance tuning

●       Create and support documentation for all new applications

 

●       Willing to work as a team member

Qualifications:

 

●       Bachelor's degree in Computer Science or relevant field, like MCA, BCA, or BE

●       Experience developing web-based applications in C#, HTML, VBScript/ASP, and .NET

●       Experience working with MS SQL Server and MySQL

●       Knowledge of practices and procedures for the full software design life cycle

●       Experience in working with an agile development company

 

Required Skills:

●       .NET Framework

●       C#

●       Microsoft SQL Server

●       JavaScript

●       jQuery

●       ASP.NET MVC

●       ASP.NET Web API

●       HTML

●       WCF Services

●       PL/SQL

●       Angular

●       Entity Framework

●       CSS

●       Ajax

●       XML

 

Perks and Benefits:

1. Evaluation for Bonus and Promotion every year.

2. Incredible opportunity to diversify your skills by working with experts on unique projects.

 

Website:

http://www.actosoft.in/

Industry

  • Computer Software

Employment Type

Full-time


 

Read more
Applix

at Applix

Eman Khan
Posted by Eman Khan
Hyderabad
5 - 8 yrs
Up to ₹18L / yr (varies)
QAD
SQL
Enterprise Resource Planning (ERP)
Troubleshooting

Job Summary

We are looking for a strong QAD Developer to support a US-based client from our Applix offshore delivery center. The role requires a self-driven engineer who can independently handle QAD-related customizations, enhancements, implementation support, troubleshooting, and ongoing production support.

The ideal candidate should be comfortable working directly with functional stakeholders, understanding business requirements, converting them into technical solutions, and supporting deployment and stabilization activities with minimal supervision.


Shift:

  • Second shift / US overlap
  • Regular working hours will extend up to 11:30 PM IST on certain business days.


Required Skills

  • Strong hands-on experience in QAD ERP development and customization
  • Good understanding of QAD technical architecture
  • Experience in custom development, reports, forms, interfaces, and enhancements
  • Good understanding of manufacturing/business process flows in ERP environments
  • Ability to troubleshoot production issues independently
  • Strong SQL knowledge for data analysis, backend troubleshooting, and query handling
  • Experience supporting implementations, rollouts, or enhancement projects
  • Good communication skills and ability to interact with US-based teams


Preferred Skills

  • Experience in manufacturing industry environments
  • Exposure to integrations, EDI, or external system interfaces
  • Experience supporting QAD implementations or upgrades
  • Familiarity with change management, release processes, and production support practices

 

Key Traits

  • Self-sufficient and proactive
  • Able to work with minimal supervision
  • Strong ownership mindset
  • Comfortable in a client-facing offshore support model
  • Able to handle second-shift working hours consistently
  • Excellent verbal and written communication skills, with the ability to clearly explain technical issues, progress, risks, and dependencies to US-based client teams
  • Proactive ownership mindset, with the ability to independently drive QAD customizations, issue resolution, and implementation tasks from analysis through closure with minimal supervision


Key Responsibilities

  • Develop, customize, and support QAD ERP solutions based on business requirements
  • Handle QAD-related enhancements, bug fixes, and implementation activities
  • Work on forms, reports, custom programs, interfaces, and data handling within the QAD environment
  • Analyze functional requirements and convert them into technical design and development tasks
  • Support issue investigation, root cause analysis, and defect resolution in production and non-production environments
  • Collaborate with client stakeholders, functional teams, and internal delivery teams during requirement clarification, development, testing, and deployment
  • Perform unit testing and support SIT/UAT cycles
  • Assist in data migration, configuration support, and deployment activities as needed
  • Maintain proper technical documentation for customizations, fixes, and implementation changes
  • Work independently during offshore support hours and provide timely progress and issue updates
Read more
Applix

at Applix

Ariba Khan
Posted by Ariba Khan
Hyderabad
4 - 7 yrs
Up to ₹18L / yr (varies)
SQL
Relational Database (RDBMS)
Database Design
Troubleshooting

Job Summary

We are looking for a strong SQL Developer to support a US-based client from our Applix offshore delivery center. This role requires a self-sufficient engineer who can independently manage SQL development, database troubleshooting, data fixes, query optimization, backend support for application changes, and support customizations and implementations tied to business needs.

The ideal candidate should be comfortable working closely with application teams and business stakeholders to understand data flows, support development needs, and resolve production issues with minimal supervision.


Shift:

  • Second shift / US overlap
  • Regular working hours will extend up to 11:30 PM IST on certain business days.


Required Skills

  • Strong hands-on experience in SQL development
  • Strong experience with stored procedures, views, functions, joins, indexing, and performance tuning
  • Good experience in data analysis, troubleshooting, and backend support
  • Ability to write efficient, scalable, and maintainable SQL code
  • Experience supporting production issues and implementing fixes independently
  • Good understanding of database design principles and data integrity
  • Ability to work with application teams on customization and implementation needs
  • Strong communication and problem-solving skills
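The tuning workflow implied by the skills above can be sketched with Python's built-in `sqlite3` (a minimal sketch only: the client's actual RDBMS is not named in the posting, and the `orders` table, index name, and data are invented for illustration):

```python
import sqlite3

# Set up a small table and an index on the column we filter by.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 45.0)],
)
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# EXPLAIN QUERY PLAN shows whether the optimizer uses the index;
# a full-table SCAN where a SEARCH was expected is a classic tuning red flag.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM orders WHERE customer = ?",
    ("acme",),
).fetchall()
print(plan[0][-1])  # detail text, e.g. a SEARCH using idx_orders_customer
```

The same habit transfers to production engines: read the plan before and after adding an index, rather than guessing.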


Preferred Skills

  • Experience supporting ERP applications, preferably manufacturing-related systems
  • Experience with data migration, ETL, reporting, or interface support
  • Exposure to QAD or similar ERP environments
  • Experience in a client-facing offshore support model

 

Key Traits

  • Self-sufficient and dependable
  • Strong analytical mindset
  • Able to independently own issues from analysis to closure
  • Comfortable working in extended overlap with US teams
  • Able to manage priorities with minimal supervision
  • Excellent verbal and written communication skills, with the ability to clearly document findings, explain data/database issues, and provide timely updates to US-based client teams
  • Strong ownership and proactive follow-through, with the ability to independently analyze, troubleshoot, optimize, and close SQL/data-related issues without constant direction


Key Responsibilities

  • Develop, maintain, and optimize SQL queries, stored procedures, functions, views, and backend database objects
  • Support application customizations and implementations through database development and data-level troubleshooting
  • Analyze and resolve production issues related to data, performance, and SQL logic
  • Perform query tuning and performance optimization for existing and new database objects
  • Support data extraction, transformation, validation, and migration activities
  • Work closely with QAD/application teams to support enhancements, integrations, and issue resolution
  • Assist in deployment, testing, and stabilization of new changes
  • Perform root cause analysis for database and data-related issues
  • Maintain technical documentation for database changes, fixes, and support activities
  • Provide reliable offshore support during second shift with timely communication and status updates
Read more
Applix

at Applix

Ariba Khan
Posted by Ariba Khan
Bangalore, India
3 - 9 yrs
₹20L - ₹37L / yr
Time series
SQL
Neural networks
Snow flake schema
ETL

Responsibilities:

Use quantitative methods such as business simulations, data mining, modeling, and advanced statistical techniques to solve problems. The Data Scientist contributes by serving as a technical lead for analytics initiatives of low‑to‑medium complexity or business impact and supporting high‑profile, enterprise initiatives such as the Engineered Value Chain. 


In this role, you will act as an individual contributor on analytic teams, partnering on cross‑functional projects, and guiding technical delivery. You will also mentor procurement professionals on the technical approaches used to solve problems presented by business units, service organizations, dealers, or customers. 

 

Job duties/responsibilities include but are not limited to:

  • Lead and deliver analytics initiatives by defining analytical approaches, building models, and translating insights into business actions for procurement and enterprise stakeholders. 
  • Develop, train, validate, and monitor predictive models using a broad set of machine learning/statistical methods to support targeted business outcomes. 
  • Design and implement ETL/data pipelines and integrate data sources to create safe, trusted datasets for reporting and analytics (including Snowflake and SQL-based workflows). 
  • Build executive-ready dashboards and decision tools (e.g., Power BI) that enable data‑driven leadership decisions. 
  • Apply data modelling best practices (conceptual, logical, physical models) and support integration/transformation patterns to analytics environments and warehouses. 
  • Partner cross‑functionally (Procurement, Digital/IT, Finance, operations stakeholders) to deploy analytics solutions into production and ensure adoption. 
  • Operate with strong data governance and operational rigor, including troubleshooting data issues, managing access/user needs, and supporting reliable analytics operations. 
  • Use modern engineering practices (e.g., GitLab/DevOps toolchains) to improve repeatability, scalability, and maintainability of analytics solutions. 

 

Must Have Skills

  • Strong AI/ML background across model development and validation, including methods such as time series, clustering, tree-based algorithms, generalized linear models, or neural networks.                           
  • Strong SQL + Snowflake proficiency for ETL, transformation, and analytics-ready datasets. 
  • Experience with cloud solutions, solution integration, IT operations, and data governance. 
  • Proficiency in Python programming language.
  • Proficiency in Prompt Engineering.
  • Experience with front-end technologies such as HTML, CSS, and JavaScript.
  • Experience with back-end technologies such as Django, Flask, or Node.js.
  • Solid grasp of database technologies such as MySQL, PostgreSQL, or MongoDB.
  • Creation of CI/CD pipelines for ML algorithms, training, and prediction workflows
  • Proficiency in Machine Learning Operations.
  • Work with business teams and data scientists to ensure value realization and ROI from operationalized models
  • Strong understanding of statistical analysis and machine learning algorithms.
  • Containerization and packaging of ML models
  • Experience with data visualization tools such as Power BI.
  • Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
  • Experience with Large Language Models (LLMs) and Natural Language Processing (NLP) technologies.

 

Good To Have Skills:

  • Experience with cloud-first and agile methodologies.

 

 

Required Skill:

  • Degree in Computer Science, Business, Mathematics, Economics, Statistics, Engineering, or related field.  
Read more
Hashone Career
Madhavan I
Posted by Madhavan I
Bengaluru (Bangalore)
3 - 8 yrs
₹20L - ₹28L / yr
SQL
skill iconPython
AtScale

Summary:

Data Engineer/Analytics Engineer with experience in semantic layer modeling using AtScale, building scalable data pipelines, and delivering high-performance analytics solutions on cloud platforms.




 Responsibilities

• Build and maintain ETL/ELT pipelines for large-scale data

• Develop semantic models, cubes, and metrics in AtScale

• Optimize query performance and BI dashboards

• Integrate data platforms (Snowflake, Databricks, BigQuery)

• Collaborate with analysts and business teams




 Skills

• SQL, Python/Scala

• Data modeling (star schema, OLAP)

• AtScale (semantic layer)

• Spark, dbt, Airflow

• BI tools (Tableau, Power BI, Looker)

• AWS / GCP / Azure
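The star-schema modeling listed above can be sketched with an in-memory SQLite database (a sketch only: the table names and data are illustrative, and a real warehouse would live in Snowflake, Databricks, or BigQuery behind the AtScale semantic layer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes, one row per product.
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "laptop"), (2, "phone"), (3, "laptop")])

# Fact table: one row per sale, keyed to the dimension.
conn.execute("CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER, revenue REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 2, 2000.0), (2, 1, 600.0), (3, 1, 1200.0)])

# A typical OLAP-style rollup: revenue by category via a fact-to-dimension join.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('laptop', 3200.0), ('phone', 600.0)]
```

A semantic layer such as AtScale essentially publishes metrics like this rollup once, so every BI tool sees the same definition.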



 Experience

• 3–8+ years in data/analytics engineering

• Experience with enterprise data platforms and BI systems

Read more
ARDEM Incorporated
Remote only
8 - 12 yrs
₹9L - ₹12L / yr
Project delivery
Software Development
Project Management
Team Management
skill icon.NET
+10 more

Senior Project Owner / Project Manager Technology


Department - Technology / Software Development

Work Mode - Work From Home (WFH), Full Time

Experience - Minimum 10 Years (Development Background)

Time Zone - Candidate should be comfortable working in US time zone overlap and attending client calls accordingly.


ROLE SUMMARY

We are looking for a seasoned Senior Project Owner / Project Manager with a strong development foundation to lead our technology initiatives. This role bridges client management and technical execution: you will own end-to-end delivery of multiple concurrent projects while supporting a high-performing remote team.


KEY RESPONSIBILITIES

Project & Delivery Management

  • Own and manage multiple concurrent technology projects from initiation to production release
  • Define project scope, timelines, milestones, and resource allocation plans
  • Distribute tasks effectively across a team of developers, QA, and support engineers
  • Track assigned work daily, follow up on progress, and proactively remove blockers
  • Ensure all projects meet deadlines and quality benchmarks without compromise
  • Participate actively in production activities and take full accountability for live deployments


US Client Management

  • Serve as the Technology single point of contact for all assigned US clients
  • Attend and lead client calls focused on ARDEM technical solutions, including discussions with prospective or existing clients (US time zone overlap required)
  • Resolve client queries, manage escalations, and ensure high client satisfaction
  • Showcase company-developed applications and software demos confidently to clients
  • Translate complex client requirements into clear technical deliverables for the team


Team Leadership

  • Lead, mentor, and performance-manage a distributed remote team of technical members
  • Foster accountability, ownership, and a high-delivery culture within the team
  • Conduct sprint planning, stand-ups, retrospectives, and performance reviews
  • Identify skill gaps and work with HR/training teams to bridge them


Process & Operations

  • Deeply understand ARDEM's internal processes and align project execution accordingly
  • Ensure development standards and best practices are followed across all projects
  • Manage crisis situations with composure, identify root causes and drive swift resolution
  • Coordinate with cross-functional teams including HR, Operations, Training, and QA
  • Maintain project documentation, status reports, and risk registers


REQUIRED EXPERIENCE

  • 10+ years of total experience in software development and project management
  • 5–7 years of hands-on coding experience in one or more technologies listed below
  • 2–3 years in a team management or tech lead role overseeing 5+ members
  • Proven experience managing multiple simultaneous projects in a remote/WFH environment
  • Prior experience working with US-based clients and a strong understanding of US work culture and expectations


TECHNICAL SKILLS

  • Python: scripting, automation, data processing, backend services
  • JavaScript / Node.js: server-side development, REST APIs, async workflows
  • .NET Core: enterprise application development and service integration
  • SQL Databases: query optimization, schema design, stored procedures
  • Familiarity with CI/CD pipelines, Git workflows, and deployment processes
  • Ability to review code, understand architectural decisions, and guide the team technically


SKILLS & COMPETENCIES

  • Exceptional verbal and written communication skills in English; client-facing confidence is a must
  • Strong crisis management and conflict resolution ability under tight deadlines
  • Highly organized with a structured approach to planning, prioritization, and execution
  • Self-driven and accountable; capable of operating independently in a remote environment
  • Strong presentation skills; able to demo software to non-technical stakeholders
  • Empathetic leadership style with the ability to motivate and align diverse team members


QUALIFICATIONS

  • Bachelor's or Master's degree in Computer Science
  • PMP Certification: Preferred (candidates without PMP must demonstrate equivalent project management rigor)
  • Agile / Scrum certifications (CSM, PMI-ACP) are an added advantage


LOCATION PREFERENCE

  • Candidates must be based in a Tier-1 city: Mumbai, Delhi NCR, Bengaluru, Hyderabad, Chennai, Pune, or Kolkata
  • This is a full-time Work From Home role: reliable internet, a dedicated workspace, and availability during US business hours are mandatory


ABOUT ARDEM

ARDEM Incorporated is a leading Business Process Outsourcing (BPO) and Automation company serving US-based clients across diverse industries. Our Technology Team builds and maintains in-house applications that power data processing pipelines, automation workflows, internal platforms, and domain-specific training modules, all engineered to deliver operational excellence at scale. To our clients, we provide cloud-based platforms to assist in their day-to-day business analytics. Our cloud services focus on finance, logistics, and utility management.

Read more
BigThinkCode Technologies
Divya Mohandass
Posted by Divya Mohandass
Chennai
4 - 6 yrs
₹7L - ₹16L / yr
SQL
Data engineering
Google BigQuery
Google Cloud Platform (GCP)
Data modeling
+1 more

About the role:

We are looking for a skilled Data Engineer with hands-on expertise in Dagster orchestration, or in GCP with BigQuery and Apache Airflow, along with modern data pipeline development and architecture implementation. The ideal candidate will design, build, and optimize scalable data pipelines, with strong SQL proficiency and data-modelling expertise.


Key Responsibilities

• Design, develop, and maintain scalable data pipelines using Dagster.

• Build and manage Dagster components such as:
  o Ops / Assets
  o Schedules
  o Sensors
  o Jobs
  o Resource definitions

• Implement and maintain Medallion Architecture (Bronze, Silver, Gold layers).

• Write optimized and production-grade SQL scripts for transformations and data validation.

• Expertise in GCP, BigQuery, and Apache Airflow is a must if not already familiar with Dagster orchestration.
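The Medallion flow above can be sketched end to end with the standard library (a sketch only: in practice the Bronze/Silver/Gold layers would be BigQuery datasets orchestrated by Dagster or Airflow, and the data and cleaning rules here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Bronze: raw ingested events, duplicates and nulls included.
conn.execute("CREATE TABLE bronze_events (event_id TEXT, amount REAL)")
conn.executemany("INSERT INTO bronze_events VALUES (?, ?)",
                 [("e1", 10.0), ("e1", 10.0), ("e2", None), ("e3", 5.0)])

# Silver: deduplicated, validated records.
conn.execute("""
    CREATE TABLE silver_events AS
    SELECT DISTINCT event_id, amount
    FROM bronze_events
    WHERE amount IS NOT NULL
""")

# Gold: a business-level aggregate ready for BI consumption.
total = conn.execute("SELECT SUM(amount) FROM silver_events").fetchone()[0]
n = conn.execute("SELECT COUNT(*) FROM silver_events").fetchone()[0]
print(n, total)  # 2 clean rows survive, totalling 15.0
```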


Must Have

• 4+ years of experience in Data Engineering.

• Strong hands-on experience with Dagster (optional) and workflow orchestration.

• Strong hands-on experience with GCP, BigQuery, and Apache Airflow.

• Solid understanding of data pipeline design patterns.

• Experience implementing Medallion Architecture.

• Advanced SQL skills (complex joins, CTEs, performance tuning).

• Experience working with GCP cloud data platform.


Why Join Us:

• Collaborative work environment.

• Exposure to modern tools and scalable application architectures.

• Medical cover for employee and eligible dependents.

• Tax beneficial salary structure.

• Comprehensive leave policy

• Competency development training programs.

Read more
Searce Inc

at Searce Inc

3 recruiters
Karthika Senthilkumar
Posted by Karthika Senthilkumar
Coimbatore
7 - 10 yrs
Best in industry
Data engineering
skill iconPython
SQL
Google Cloud Platform (GCP)

Who are we ?


Searce means ‘a fine sieve’ & indicates ‘to refine, to analyze, to improve’. It signifies our way of working: To improve to the finest degree of excellence, ‘solving for better’ every time. Searcians are passionate improvers & solvers who love to question the status quo.


The primary purpose of all of us, at Searce, is driving intelligent, impactful & futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, everyday.


Tech Superpowers


End-to-End Ecosystem Thinker: You build modular, reusable data products across ingestion, transformation (ETL/ELT), and consumption layers. You ensure the entire data lifecycle is governed, scalable, and optimized for high-velocity delivery.


The MDS Architect: You reimagine business with the Modern Data Stack (MDS) to deliver Data Mesh implementations and real value. You treat every dataset as a measurable "Data Product" with a clear focus on ROI and time-to-insight.


Distributed Compute & Scale Savant: You craft resilient architectures that survive petabyte-scale volume and data skew without "breaking the bank". You prove your designs with cost-performance benchmarks, not just slideware.


AI-Ready Orchestrator: You engineer the bridge between structured data and unstructured/vector stores. By mastering pipelines for RAG models and GenAI, you turn raw data into the fuel for intelligent, automated workflows.


The Quality Craftsman (Builder @ Heart): You are an outcome-focused leader who lives in the code. From embedding GDPR/PII privacy-by-design to optimizing SQL, Python, and Spark daily, you ensure integrity is baked into every table.


Experience & Relevance


Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.


Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.


AI-Native Workflow: You don't just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.


Architectural Portfolio: Evidence of leading 2-3 large-scale transformations, including platform migrations, data lakehouse builds, or real-time analytics architectures.


Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling, and ensure the squad adopts modern engineering standards like CI/CD for data.


Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO's business vision into a Lead Engineer's technical specification without losing anything in translation.


The "Solver" Mindset: A track record of solving "impossible" data problems, whether it's fixing massive data skew, optimizing spiraling cloud costs, or architecting 99.9%-available data services.



Read more
Searce Inc

at Searce Inc

3 recruiters
Vaivashhya VN
Posted by Vaivashhya VN
Coimbatore
7 - 10 yrs
Best in industry
Data engineering
Data migration
Datawarehousing
ETL
SQL
+6 more

Who are we ?


Searce means ‘a fine sieve’ & indicates ‘to refine, to analyze, to improve’. It signifies our way of working: To improve to the finest degree of excellence, ‘solving for better’ every time. Searcians are passionate improvers & solvers who love to question the status quo.


The primary purpose of all of us, at Searce, is driving intelligent, impactful & futuristic business outcomes using new-age technology. This purpose is driven passionately by HAPPIER people who aim to become better, everyday.


Tech Superpowers


End-to-End Ecosystem Thinker: You build modular, reusable data products across ingestion, transformation (ETL/ELT), and consumption layers. You ensure the entire data lifecycle is governed, scalable, and optimized for high-velocity delivery.


The MDS Architect: You reimagine business with the Modern Data Stack (MDS) to deliver Data Mesh implementations and real value. You treat every dataset as a measurable "Data Product" with a clear focus on ROI and time-to-insight.


Distributed Compute & Scale Savant: You craft resilient architectures that survive petabyte-scale volume and data skew without "breaking the bank". You prove your designs with cost-performance benchmarks, not just slideware.


AI-Ready Orchestrator: You engineer the bridge between structured data and unstructured/vector stores. By mastering pipelines for RAG models and GenAI, you turn raw data into the fuel for intelligent, automated workflows.


The Quality Craftsman (Builder @ Heart): You are an outcome-focused leader who lives in the code. From embedding GDPR/PII privacy-by-design to optimizing SQL, Python, and Spark daily, you ensure integrity is baked into every table.


Experience & Relevance


Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.


Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.


AI-Native Workflow: You don't just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.


Architectural Portfolio: Evidence of leading 2-3 large-scale transformations, including platform migrations, data lakehouse builds, or real-time analytics architectures.


Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO's business vision into a Lead Engineer's technical specification without losing anything in translation.


The "Solver" Mindset: A track record of solving "impossible" data problems, whether it's fixing massive data skew, optimizing spiraling cloud costs, or architecting 99.9%-available data services.

Read more
Arcis India
Sarita Jena
Posted by Sarita Jena
Mumbai
6 - 8 yrs
₹12L - ₹20L / yr
skill iconJava
skill iconSpring Boot
Quarkus
Microservices
Webservices
+17 more

6+ years of hands-on development experience and in-depth knowledge of Java, Spring Boot, and Quarkus; front-end technologies like Angular and React JS are nice to have

● Excellent Engineering skills in designing and implementing scalable solutions

● Good knowledge of CI/CD Pipeline with strong focus on TDD

● Strong communication skills and ownership

● Exposure to Cloud, Kubernetes, Docker, Microservices is highly desired.

● Experience in working on public cloud environments like AWS, Azure, GCP w.r.t. solutions development, deployment & adoption of cloud-based technology components like IaaS / PaaS offerings

● Proficiency in PL/SQL and Database development.

● Strong in J2EE and OOP design patterns.

Read more
NovacisDigital
Chennai
3 - 8 yrs
₹5L - ₹16L / yr
Relational Database (RDBMS)
Microsoft SQL Server
SQL
dynamic SQL
Stored Procedures
+2 more

Senior Software Engineer – SQL Server / T-SQL

Chennai | IIT Madras Research Park | Full-Time

 

About Novacis Digital

Novacis Digital is a product-first technology company building AI-driven platforms and large-scale data systems. Our products process complex, high-volume data to power real-time analytics and GenAI-driven experiences.

We don’t see SQL as “just a database layer” - we treat it as a core compute engine. If you love writing efficient SQL and solving performance problems, this is the role for you.

 

What You Will Do

·      Design and build complex T-SQL stored procedures involving Dynamic SQL, along with views, functions, and triggers

·      Implement flexible, metadata-driven query frameworks using sp_executesql and parameterized Dynamic SQL

·      Engineer high-performance, set-based queries using CTEs, window functions, temp tables and table variables

·      Optimize queries using execution plans, statistics and DMVs

·      Refactor inefficient queries and redesign schemas for performance and scalability

·      Solve real-world challenges related to locks, blocking, deadlocks and transaction isolation

·      Collaborate with application engineers to build reliable, high-performance data access layers
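The metadata-driven pattern above is T-SQL (`sp_executesql`), but the same discipline (identifiers taken only from a whitelist, values passed only as bound parameters) can be sketched in Python with SQLite; the `orders` table, column names, and helper are invented for illustration:

```python
import sqlite3

ALLOWED_FILTER_COLUMNS = {"status", "region"}  # metadata that drives the dynamic part

def build_query(filters):
    """Compose dynamic SQL safely: identifiers from a whitelist, values as parameters."""
    clauses, params = [], []
    for col, value in filters.items():
        if col not in ALLOWED_FILTER_COLUMNS:
            raise ValueError(f"column not allowed in filter: {col}")
        clauses.append(f"{col} = ?")  # identifier validated above, value parameterized
        params.append(value)
    sql = "SELECT id FROM orders"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "open", "EU"), (2, "closed", "EU"), (3, "open", "US")])

sql, params = build_query({"status": "open", "region": "EU"})
rows = conn.execute(sql, params).fetchall()
print(rows)  # [(1,)]
```

In T-SQL the same split shows up as the SQL text handed to `sp_executesql` plus its parameter list, which is what keeps dynamic SQL both flexible and injection-safe.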

 

What We’re Looking For

We’re looking for true SQL engineers — people who think in execution flow, logic and data behavior rather than just syntax.

 

You should have:

·      4+ years of deep hands-on experience with Microsoft SQL Server & T-SQL

·      Strong expertise in:

o  Stored Procedures (with Dynamic SQL)

o  Views

o  Functions

o  Triggers

·      Strong experience with:

o  Dynamic SQL best practices and secure execution patterns

o  Indexing strategies and query plan optimization

o  Handling parameter sniffing and plan instability

·      Strong knowledge of:

o  Temp tables vs table variables

o  Cardinality estimation

o  Cost-based optimization concepts


Nice to Have

·      Exposure to GenAI data pipelines or analytical architectures

·      Exposure to Graph, Vector, and NoSQL databases

 

How We Work

·      We write production-grade T-SQL

·      We value performance, clarity, and correctness

·      We invest heavily in query readability and maintainability

·      Engineering quality is non-negotiable

 

Apply Now

If you enjoy designing complex Dynamic SQL-powered stored procedures and tuning systems at scale, we’d like to talk.

Read more
Remote only
3 - 6 yrs
₹10L - ₹28L / yr
Python
Selenium
AWS
TestNG
SQL
+2 more

Location: PAN India

💼 Employment Type: Full-Time / Contract

👨‍💻 Experience: 3–6 Years


🔍 Job Overview


We are looking for a talented Automation Test Engineer with strong expertise in Python-based automation, Selenium, and API testing. The ideal candidate will be responsible for building scalable automation frameworks and ensuring high-quality delivery across applications and cloud environments.


🔑 Key Responsibilities

Develop and maintain automation scripts using Python, Selenium, TestNG / Pytest

Perform API testing for RESTful services

Work with AWS services like S3 & API Gateway (basic level)

Conduct database validations using SQL & NoSQL

Integrate automation with CI/CD pipelines (Jenkins, Docker)

Write and maintain test cases, reports, and documentation

Collaborate with cross-functional teams in Agile environments

Debug and resolve automation issues and defects
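A minimal pytest-style sketch of the API-validation responsibility above; the `get_user` endpoint and its responses are stand-ins (a real suite would call the service over HTTP, e.g. with `requests`, and be discovered and run by pytest):

```python
# Stand-in for an HTTP client; a real test would hit the service and parse JSON.
def get_user(user_id):
    fake_backend = {1: {"id": 1, "name": "asha", "active": True}}
    if user_id in fake_backend:
        return 200, fake_backend[user_id]
    return 404, {"error": "not found"}

def test_get_user_ok():
    status, body = get_user(1)
    assert status == 200
    assert body["name"] == "asha" and body["active"] is True

def test_get_user_missing():
    status, body = get_user(999)
    assert status == 404
    assert "error" in body

# pytest would discover the test_* functions; invoked directly here for illustration.
test_get_user_ok()
test_get_user_missing()
print("all checks passed")
```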

🛠 Required Skills

Strong experience in Selenium, TestNG / Pytest (Intermediate–Expert)

Proficiency in Python scripting

Experience in RESTful API testing

Knowledge of SQL & NoSQL databases

Hands-on experience with Git (Basic–Intermediate)

Experience with CI/CD tools (Jenkins, Docker)

Basic understanding of AWS (S3, API Gateway)

Scripting knowledge in Shell / Groovy

⭐ Good to Have

Experience in automation framework design

Exposure to cloud-based testing environments



Read more
Pune
3 - 10 yrs
₹1L - ₹10L / yr
skill iconJava
J2EE
API
Java Developer
agile
+15 more

We have an immediate requirement for a Java Developer role in the Pune location. Please find the details below:

Role: Java Developer

Experience: 3–4 Years (Mandatory)

Location: Pune

Joining: Immediate joiners only


Key Responsibilities:

  • Develop and maintain scalable and robust J2EE applications
  • Follow and implement coding standards within the project
  • Integrate with third-party APIs and services
  • Work in an Agile environment to design and implement new features
  • Support team members in resolving technical issues
  • Debug and resolve production issues (code/infrastructure)
  • Communicate effectively with team members and product management

Mandatory Skills:

  • Strong knowledge of Java and JEE internals (Class Loading, Memory Management, Transaction Management, etc.)
  • Expertise in OOPs/OOAD concepts and design patterns
  • Hands-on experience with Spring Framework and Web Services
  • Basic knowledge of JavaScript, jQuery, AJAX, and DOM
  • Good understanding of SQL, relational databases, and ORM (Hibernate/DAO)
  • Strong problem-solving skills and communication abilities

Important Note:

  • Interview is scheduled for Monday
  • Selected candidates are expected to join by Tuesday or Wednesday
Read more
Searce Inc

at Searce Inc

3 recruiters
Srishti Dani
Posted by Srishti Dani
Mumbai, Pune, Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Data migration
Datawarehousing
ETL
SQL
Google Cloud Platform (GCP)
+7 more

Lead Data Engineer


What are we looking for

real solver?

Solver? Absolutely. But not the usual kind. We're searching for the architects of the audacious & the pioneers of the possible. If you're the type to dismantle assumptions, re-engineer ‘best practices,’ and build solutions that make the future possible NOW, then you're speaking our language.


Your Responsibilities

What you will wake up to solve.

  • Lead Technical Design & Data Architecture: Architect and lead the end-to-end development of scalable, cloud-native data platforms. You’ll guide the squad on critical architectural decisions—choosing between Batch vs. Streaming or ETL vs. ELT—while remaining 100% hands-on, contributing high-quality, production-grade code.
  • Build High-Velocity Data Pipelines: Drive the implementation of robust data transports and ingestion frameworks using Python, SQL, and Spark. You will build integration layers that connect heterogeneous sources (SaaS, RDBMS, NoSQL) into unified, high-availability environments like BigQuery, Snowflake, or Redshift.
  • Mentor & Elevate the Squad: Foster a culture of technical excellence by mentoring and inspiring a team of data analysts and engineers. Lead deep-dive code reviews, promote best-practice data modeling (Star/Snowflake schema), and ensure the squad adopts modern engineering standards like CI/CD for data.
  • Drive AI-Ready Data Strategy: Be the expert in designing data foundations optimized for AI and Machine Learning. You will champion the use of GCP (Dataflow, Pub/Sub, BigQuery) and AWS (Lambda, Glue, EMR) to create "clean room" environments that fuel advanced analytics and generative AI models.
  • Partner with Clients as a Technical DRI: Act as the Directly Responsible Individual for client success. Translate ambiguous business questions into elegant data services, manage project deliverables using Agile methodologies, and ensure that the data provided is accurate, consistent, and mission-critical.
  • Troubleshoot & Optimize for Scale: Own the reliability of the reporting layer. You will proactively monitor pipelines, troubleshoot complex transformation bottlenecks, and propose ways to improve platform performance and cost-efficiency.
  • Innovate and Build Reusable IP: Spearhead the creation of reusable data frameworks, custom operators, and transformation libraries that accelerate future projects and establish Searce’s unique technical advantage in the market.


Welcome to Searce


The AI-Native tech consultancy that's rewriting the rules.

Searce is an AI-native, engineering-led, modern tech consultancy that empowers clients to futurify their business by delivering intelligent, impactful, real business outcomes. Searce solvers co-innovate with clients as their trusted transformational partners ensuring sustained competitive advantage. Searce clients realize smarter, faster, better business outcomes delivered by AI-native Searce solver squads. 


Functional Skills 

the solver personas.

  • The Data Architect: This persona deconstructs ambiguous business goals into scalable, elegant data blueprints. They don't just move data; they design the foundation—from schema design to partitioning strategies—that allows data scientists and analysts to thrive, foreseeing technical bottlenecks and making pragmatic trade-offs.
  • The Player-Coach: As a hands-on leader, this persona leads from the front by writing exemplary, production-grade SQL and Python while simultaneously mentoring and elevating the skills of the squad. Their success is measured by the team's ability to deliver high-quality, maintainable code and their growth as engineers.
  • The Pragmatic Innovator: This individual balances a passion for modern data tech (like Generative AI and Real-time Streaming) with a sharp focus on business outcomes. They champion new tools where they add real value but are disciplined enough to choose stable, cost-effective solutions to meet deadlines and deliver robust products.
  • The Client-Facing Technologist: This persona acts as the crucial technical bridge between the data squad and the client. They build trust by listening actively, explaining complex data concepts (like data latency or idempotency) in simple terms, and demonstrating how engineering decisions align with the client’s strategic goals.
  • The Quality Craftsman: This individual possesses an unwavering commitment to data integrity and treats data engineering as a craft. They are the guardian of the reporting layer, advocating for robust testing, data validation frameworks, and clean, modular code to ensure the long-term reliability of the data platform.


Experience & Relevance 

  • Engineering Depth: 7-10 years of professional experience in end-to-end data product development. You have a portfolio that proves your ability to build complex, high-velocity pipelines for both Batch and Streaming workloads.
  • Cloud-Native Fluency: Deep, hands-on experience designing and deploying scalable data solutions on at least one major cloud platform (AWS, GCP, or Azure). You are comfortable navigating the nuances of EMR, BigQuery, or Synapse at scale.
  • AI-Native Workflow: You don’t just build for AI; you build with AI. You must be proficient in using AI coding assistants (e.g., GitHub Copilot) to accelerate your delivery and have a track record of building the data foundations required for Generative AI.
  • Architectural Portfolio: Evidence of leading 2-3 large-scale transformations—including platform migrations, data lakehouse builds, or real-time analytics architectures.
  • Client-Facing Acumen: You have direct experience in a consultative, client-facing role. You can confidently translate a CEO’s business vision into a Lead Engineer’s technical specification without losing anything in translation.


Join the ‘real solvers’

ready to futurify?

If you are excited by the possibilities of what an AI-native engineering-led, modern tech consultancy can do to futurify businesses, apply here and experience the ‘Art of the possible’. Don’t Just Send a Resume. Send a Statement.


Read more
Risosu Consulting LLP
Remote only
2 - 4 yrs
₹6L - ₹9L / yr
skill iconData Analytics
Artificial Intelligence (AI)
skill iconMachine Learning (ML)
skill iconPython
API
+1 more

Job Title: Data Analyst (AI/ML Exposure)

Experience: 1–3 Years

Location: Mumbai

Job Description:

We are looking for a Data Analyst with strong experience in data handling, analysis, and visualization, along with exposure to AI/ML concepts. The role involves working with structured and unstructured data (SQL, CSV, JSON), building data pipelines, performing EDA, and deriving actionable insights. Candidates should have hands-on experience with Python (Pandas, NumPy), data visualization tools, and basic knowledge of NLP/LLMs. Exposure to APIs, data-driven applications, and client interaction will be an added advantage.

Skills Required: Python, SQL, Data Analysis, EDA, Visualization, APIs

Apply: Share your resume or connect with us.


Read more
Appiness Interactive
Chennai
6 - 12 yrs
₹10L - ₹24L / yr
skill iconPython
PowerBI
SQL
databricks
Data Warehouse (DWH)
+1 more

Overview


We are looking for a highly skilled Lead Data Engineer with strong expertise in Data Warehousing & Analytics to join our team. The ideal candidate will have extensive experience in designing and managing data solutions, advanced SQL proficiency, and hands-on expertise in Python and Power BI.


Skills : Python, Databricks, SQL


Key Responsibilities:


  • Design, develop, and maintain scalable data warehouse solutions.
  • Write and optimize complex SQL queries for data extraction, transformation, and reporting.
  • Develop and automate data pipelines using Python.
  • Work with AWS cloud services for data storage, processing, and analytics.
  • Collaborate with cross-functional teams to provide data-driven insights and solutions.
  • Ensure data integrity, security, and performance optimization.
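The SQL-plus-Python pipeline work described above can be sketched with an in-memory SQLite database; the table, columns, and sample data below are invented for illustration and are not from the posting:

```python
import sqlite3

# Illustrative only: a tiny in-memory "warehouse" table and a reporting query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# A simple extraction/aggregation query of the sort the role describes.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 150.0), ('south', 75.0)]
```

Real warehouse work would run comparable queries against Redshift, BigQuery, or another engine, but the extract-aggregate-report shape is the same.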



Required Skills & Experience:


  • 6–10 years of experience in Data Warehousing & Analytics.
  • Strong hands-on experience with Databricks.
  • Strong proficiency in writing complex SQL queries with deep understanding of query optimization, stored procedures, and indexing.
  • Hands-on experience with Python for data processing and automation.
  • Experience working with AWS cloud services.
  • Hands-on experience with reporting tools like Power BI or Tableau.
  • Ability to work independently and collaborate with teams across different time zones.


Read more
CAW.Tech

at CAW.Tech

5 recruiters
Ranjana Singh
Posted by Ranjana Singh
Hyderabad
4 - 6 yrs
Best in industry
skill iconPHP
skill iconLaravel
Object Oriented Programming (OOPs)
MVC Framework
Design patterns
+4 more

We are looking for a Staff Engineer - PHP to join one of our engineering teams at our office in Hyderabad.


What would you do?

  • Design, build, and maintain backend systems and APIs from requirements to production.
  • Own feature development, bug fixes, and performance optimizations.
  • Ensure code quality, security, testing, and production readiness.
  • Collaborate with frontend, product, and QA teams for smooth delivery.
  • Diagnose and resolve production issues and drive long-term fixes.
  • Contribute to technical discussions and continuously improve engineering practices.


Who Should Apply?

  • 4–6 years of hands-on experience in backend development using PHP.
  • Strong proficiency with Laravel or similar PHP frameworks, following OOP, MVC, and design patterns.
  • Solid experience in RESTful API development and third-party integrations.
  • Strong understanding of SQL databases (MySQL/PostgreSQL); NoSQL exposure is a plus.
  • Comfortable with Git-based workflows and collaborative development.
  • Working knowledge of HTML, CSS, and JavaScript fundamentals.
  • Experience with performance optimization, security best practices, and debugging.
  • Nice to have: exposure to Docker, CI/CD pipelines, cloud platforms, and automated testing.


Read more
NeoGenCode Technologies Pvt Ltd
Gurugram, Vadodara
2 - 10 yrs
₹3L - ₹12L / yr
Manual testing
Test Automation (QA)
Crypto Exchange
Selenium
cypress
+8 more

Job Title : Senior QA Engineer (Crypto Exchange Platform)

Experience : 2+ Years

Location : Gurugram & Vadodara

Employment Type : Full-Time


About the Company :

We are a fast-growing crypto exchange platform building secure, scalable, and high-performance trading systems with real-time data and wallet infrastructure.


Role Overview :

We are looking for a Senior QA Engineer to ensure the quality, reliability, and security of our platform. You’ll work on web, mobile, and backend systems, focusing on APIs, trading engines, and real-time systems in a fast-paced agile environment.


Mandatory Skills :

2+ years in QA with strong manual & automation testing, experience in Selenium/Cypress/Playwright, API testing (Postman/REST Assured), CI/CD (Jenkins/GitHub Actions), SQL, and real-time/WebSocket testing.


Key Responsibilities :

  • Create and execute test plans, cases, and strategies
  • Perform functional, regression, integration & API testing
  • Build and maintain automation frameworks
  • Test trading systems, wallets, and real-time data (WebSockets)
  • Track bugs using Jira and collaborate with teams
  • Integrate testing into CI/CD pipelines
  • Ensure performance, stability, and security


Required Skills :

  • Strong experience in automation + functional testing
  • Hands-on with Selenium/Cypress/Playwright
  • Good knowledge of API testing & microservices
  • Experience with CI/CD tools
  • Strong SQL & database validation skills
  • Understanding of Agile & SDLC


Good to Have :

  • Experience in crypto/fintech/trading platforms
  • Knowledge of blockchain, wallets, smart contracts
  • Performance testing (JMeter, K6)
  • Basic security testing knowledge


What We’re Looking For :

  • Strong problem-solving skills
  • Attention to detail
  • Ability to work in a fast-paced environment
  • Good communication & ownership mindset
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore)
4 - 10 yrs
₹10L - ₹30L / yr
skill iconPython
SQL
Spark
skill iconAmazon Web Services (AWS)
Amazon S3
+13 more

Job Title : AWS Data Engineer

Experience : 4+ Years

Location : Bengaluru (HSR – Hybrid, 3 Days WFO)

Notice Period : Immediate Joiner


💡 Role Overview :

We are looking for a skilled AWS Data Engineer to design, build, and scale modern data platforms. The role involves working with AWS-native services, Python, Spark, and DBT to deliver secure, scalable, and high-performance data solutions in an Agile environment.


🔥 Mandatory Skills :

Python, SQL, Spark, AWS (S3, Glue, EMR, Redshift, Athena, Lambda), DBT, ETL/ELT pipeline development, Airflow/Step Functions, Data Lake (Parquet/ORC/Iceberg), Terraform & CI/CD, Data Governance & Security


🚀 Key Responsibilities :

  • Design, build, and optimize ETL/ELT pipelines using Python, DBT, and AWS services
  • Develop and manage scalable data lakes on S3 using formats like Parquet, ORC, and Iceberg
  • Build end-to-end data solutions using Glue, EMR, Lambda, Redshift, and Athena
  • Implement data governance, security, and metadata management using Glue Data Catalog, Lake Formation, IAM, and KMS
  • Orchestrate workflows using Airflow, Step Functions, or AWS-native tools
  • Ensure reliability and automation via CloudWatch, CloudTrail, CodePipeline, and Terraform
  • Collaborate with data analysts and data scientists to deliver actionable insights
  • Work in an Agile environment to deliver high-quality data solutions

✅ Mandatory Skills :

  • Strong Python (including AWS SDKs), SQL, Spark
  • Hands-on experience with AWS data stack (S3, Glue, EMR, Redshift, Athena, Lambda)
  • Experience with DBT and ETL/ELT pipeline development
  • Workflow orchestration using Airflow / Step Functions
  • Knowledge of data lake formats (Parquet, ORC, Iceberg)
  • Exposure to DevOps practices (Terraform, CI/CD)
  • Strong understanding of data governance and security best practices
  • Minimum 4–7 years in Data Engineering (3+ years on AWS)

➕ Good to Have :

  • Understanding of Data Mesh architecture
  • Experience with platforms like Data.World
  • Exposure to Hadoop / HDFS ecosystems

🤝 What We’re Looking For :

  • Strong problem-solving and analytical skills
  • Ability to work in a collaborative, cross-functional environment
  • Good communication and stakeholder management skills
  • Self-driven and adaptable to fast-paced environments

📝 Interview Process :

  1. Online Assessment
  2. Technical Interview
  3. Fitment Round
  4. Client Round
Read more
Remote only
3 - 5 yrs
₹15L - ₹18L / yr
SQL
skill iconPython
Linux/Unix
Large Language Models (LLM) tuning
skill iconMachine Learning (ML)
+1 more

Python Developer (Performance Optimization Focus)

Experience: 3–5 Years

Location: Remote (India-based candidates only)

Employment Type: Full-time


Role Overview

We are seeking a Python Developer with a strong focus on performance optimization and system efficiency. In this role, you will identify bottlenecks, enhance system performance, and contribute to building scalable, high-performance applications in a Linux-based environment.


Key Responsibilities

  • Analyze and troubleshoot performance bottlenecks in applications and systems
  • Optimize code, database queries, and architecture for scalability and speed
  • Design, develop, test, and maintain robust Python applications
  • Work with large datasets and improve data processing efficiency
  • Collaborate with cross-functional teams to improve system reliability and performance
  • Monitor system performance and implement proactive improvements
  • Write clean, maintainable, and efficient code following best practices


Required Skills & Qualifications

  • 3–5 years of hands-on experience in Python development
  • Strong expertise in performance tuning and optimization techniques
  • Experience with debugging and profiling tools
  • Solid understanding of data structures and algorithms
  • Experience with REST APIs and backend development
  • Strong analytical and problem-solving skills


Linux & System Knowledge (Must-Have)

  • Comfortable working in Linux/Unix environments
  • Command-line proficiency, including:
      • File editing (vi, nano)
      • File permissions (chmod, chown)
      • File downloads (wget, curl)
      • Basic file and directory operations
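The command-line tasks listed above amount to everyday operations like these; the paths and permission modes are illustrative, not from the posting:

```shell
# Downloads would normally use wget or curl, e.g.: wget -q <url> -O /tmp/app.log
printf 'line one\nline two\n' > /tmp/app.log   # stand-in for a downloaded file
chmod 640 /tmp/app.log                         # owner rw, group r, others none
chown "$USER" /tmp/app.log                     # set the owner (sudo may be needed for other users)
ls -l /tmp/app.log                             # verify the mode shows -rw-r-----
```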


Basic Python Knowledge (Interview Scope)

  • Writing simple scripts and reusable functions
  • String manipulation and data handling
  • Example task: Count words in a file/string efficiently
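A minimal solution to the example task above might look like this; the function name and sample text are ours, not part of the posting:

```python
from collections import Counter

def count_words(text: str) -> Counter:
    """Count occurrences of each whitespace-separated word, case-insensitively."""
    return Counter(text.lower().split())

counts = count_words("the quick brown fox jumps over the lazy dog")
print(counts["the"])  # 2
```

For a file, the same function works on `open(path).read()`; for very large files, counting line by line into one shared `Counter` avoids loading everything into memory.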


Good to Have

  • Familiarity with AI/ML concepts or tools
  • Experience optimizing data-intensive or distributed systems
  • Exposure to cloud platforms (AWS, GCP, Azure)


Why Join Us

  • Work on performance-critical systems with real-world impact
  • Fully remote work environment
  • Opportunity to work with modern, scalable technologies
  • Collaborative, growth-focused team culture


Read more
WeAssemble
Meghal Majithia
Posted by Meghal Majithia
Mumbai
3 - 6 yrs
₹5L - ₹8L / yr
Selenium
Playwright
SQL
Test Automation (QA)

We are looking for a highly skilled QA Automation Engineer with at least 3 years of experience to join our dynamic team in Mumbai. The ideal candidate should be proactive, detail-oriented, and ready to hit the ground running.


Company Name: WeAssemble

Website: www.weassemble.team

Location: One International Centre, Prabhadevi, Mumbai

Working Days: Monday–Friday (Sat & Sun fixed off)

*Key Responsibilities:*

* Design, develop, and execute automated test scripts using industry-standard tools and frameworks.

* Collaborate with developers, business analysts, and product managers to ensure product quality.

* Conduct functional, non-functional, API, and integration testing.

* Implement and maintain automation frameworks.

* Contribute to continuous improvement in QA processes.

*Required Skills & Experience:*

* Strong experience in Playwright with JavaScript.

* API Testing Automation (Postman, REST Assured, or equivalent).

* Hands-on experience with CI/CD pipelines (Jenkins, GitHub Actions, GitLab, or similar).

* Solid understanding of software QA methodologies, tools, and processes.

* Ability to identify, log, and track bugs effectively.

* Strong problem-solving and analytical skills.

*Good to Have:*

* Knowledge of performance testing tools.

* Familiarity with cloud platforms (AWS, Azure, or GCP).

Read more
Cglia Solutions LLP
Rajana Harika
Posted by Rajana Harika
Hyderabad
1 - 3 yrs
₹2.4L - ₹4L / yr
Linux administration
skill iconAmazon Web Services (AWS)
skill iconDocker
SQL
PL/SQL
+2 more

Job Title: Application Support Engineer
Experience: 1–3 Years

Qualification: B.Tech (Computer Science / IT or related field)

Shift Timing: 5:00 PM – 2:00 AM (Late Evening Shift)

Location: Hyderabad 


Job Summary


We are seeking a proactive and detail-oriented Application Support Engineer with 1–3 years of experience in Linux/Windows environments, application servers, and monitoring tools. The candidate will be responsible for ensuring the stability, performance, and availability of applications, along with providing L2/L3 support in a fast-paced production environment.

Key Responsibilities :

  • Provide application support and incident management for production systems.
  • Monitor system performance using hardware/software monitoring and trending tools.
  • Troubleshoot issues in Linux and Windows environments.
  • Manage and support Apache and Tomcat servers.
  • Analyze logs and debug application/system issues.
  • Work on SQL/Oracle databases for query execution, troubleshooting, and performance tuning.
  • Handle deployments and support CI/CD pipelines using tools like Docker and Jenkins.
  • Ensure SLA adherence and timely resolution of incidents and service requests.
  • Coordinate with development, infrastructure, and database teams for issue resolution.
  • Maintain documentation for incidents, processes, and knowledge base articles.
  • Support SaaS applications hosted in data center environments. 

Required Skills :

  • Strong knowledge of Linux and Windows OS administration
  • Experience with Apache and Tomcat servers
  • Hands-on experience with monitoring and alerting tools
  • Good understanding of log analysis and troubleshooting techniques
  • Working knowledge of SQL / Oracle databases
  • Familiarity with Docker and Jenkins (CI/CD pipelines)
  • Understanding of ITIL processes (Incident, Problem, Change Management)
  • Knowledge of SaaS applications and data center operations


Preferred Skills :

  • Experience with automation/scripting (Shell, Python, etc.)
  • Exposure to cloud platforms (AWS/Azure/GCP) is a plus
  • Basic networking knowledge


Soft Skills :

  • Strong analytical and problem-solving abilities
  • Good communication skills
  • Ability to work night shifts and handle production support
  • Team player with a proactive attitude

Read more