SQL Jobs in Bangalore (Bengaluru)


Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

OneFin

Posted by Shona Shaju
Bengaluru (Bangalore)
0.5 - 2 yrs
₹4L - ₹5L / yr
RESTful APIs
SQL
Python
MS-Excel
Communication Skills

We are looking for an integration engineer to assist our rapidly growing customer base. As part of our integration team, you will be the primary point of contact for all integrations. You would be responsible for helping our clients integrate with OneFin APIs, configuring our system for clients and providing ongoing help to them to resolve any issues.


Responsibilities

  1. Understand and explain APIs to clients. Help clients integrate OneFin APIs. Research and identify solutions to issues during integration.
  2. Escalate unresolved issues to appropriate internal teams (e.g., software developers).
  3. Become a product expert for clients.
  4. Configure OneFin system for customized usage by clients. Identify and write internal and external technical articles or knowledge-base entries, like typical troubleshooting steps, workarounds, or best practices, how-to guides etc.
  5. Automate solution of common issues using Python.
  6. Help live clients resolve issues and coordinate with the development team for issue resolution.


Requirements and Qualifications:

  1. Strong verbal and written communication skills.
  2. Experience in writing code in Python.
  3. Understanding of web-based systems.
  4. Proficient in reading and writing JSON.
  5. Experience with SQL databases.
  6. Experience working with REST APIs.
  7. Excellent analytical skills, passion for pinning down technical issues, and solving problems.
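For the JSON and REST API requirements above, a minimal Python sketch of the kind of payload handling an integration engineer does daily. The endpoint shape, field names, and values here are invented for illustration, not OneFin's actual API.

```python
import json

# Hypothetical response body from a lending API; field names are illustrative.
raw = '{"loan_id": "LN-1042", "status": "APPROVED", "emis": [{"due": "2024-07-05", "amount": 12500.0}]}'

def summarize_loan(payload: str) -> str:
    """Parse a JSON API response and extract the fields you would
    verify with a client during integration testing."""
    data = json.loads(payload)
    n_emis = len(data.get("emis", []))  # tolerate a missing "emis" key
    return f"{data['loan_id']}: {data['status']} ({n_emis} EMI scheduled)"

print(summarize_loan(raw))  # LN-1042: APPROVED (1 EMI scheduled)
```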


https://forms.gle/4tEbPAwW7uis9PPX7

A real-time Customer Data Platform and cross-channel marketing automation company that delivers superior experiences, resulting in increased revenue for some of the largest enterprises in the world.

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹14L / yr
Java
Spring Boot
Apache Kafka
SQL
PostgreSQL

Key Responsibilities:

  • Design and develop backend components and sub-systems for large-scale platforms under guidance from senior engineers.
  • Contribute to building and evolving the next-generation customer data platform.
  • Write clean, efficient, and well-tested code with a focus on scalability and performance.
  • Explore and experiment with modern technologies, especially open-source frameworks, and build small prototypes or proof-of-concepts.
  • Use AI-assisted development tools to accelerate coding, testing, debugging, and learning while adhering to engineering best practices.
  • Participate in code reviews, design discussions, and continuous improvement of the platform.

Qualifications:

  • 0–2 years of experience (or strong academic/project background) in backend development with Java.
  • Good fundamentals in algorithms, data structures, and basic performance optimizations.
  • Bachelor’s or Master’s degree in Computer Science or IT (B.E / B.Tech / M.Tech / M.S) from premier institutes.

Technical Skill Set:

  • Strong aptitude and analytical skills with emphasis on problem solving and clean coding.
  • Working knowledge of SQL and NoSQL databases.
  • Familiarity with unit testing frameworks and writing testable code is a plus.
  • Basic understanding of distributed systems, messaging, or streaming platforms is a bonus.

AI-Assisted Engineering (LLM-Era Skills):

  • Familiarity with modern AI coding tools such as Cursor, Claude Code, Codex, Windsurf, Opencode, or similar.
  • Ability to use AI tools for code generation, refactoring, test creation, and learning new systems responsibly.
  • Willingness to learn how to combine human judgment with AI assistance for high-quality engineering outcomes.

Soft Skills & Nice to Have

  • Appreciation for technology and its ability to create real business value, especially in data and marketing platforms.
  • Clear written and verbal communication skills.
  • Strong ownership mindset and ability to execute in fast-paced environments.
  • Prior internship or startup experience is a plus.
Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 10 yrs
₹11L - ₹15L / yr
C#
.NET
ASP.NET
SQL
ADO.NET

Role & Responsibilities:

Develop and deliver defect-free, web-based applications using C#, ASP.NET, and Oracle as per the specifications provided by the Business Analysts.


  • Read and understand the functional and technical specifications, and gain a complete understanding of the work before commencing it.
  • Design, develop, and unit test applications in accordance with established standards.
  • Adhere to high-quality development principles while delivering solutions on time.
  • Adhere to the quality management standards established in the organization.
  • Understand the SDLC process defined in the organization and follow it without deviation.
  • Provide third-level support for tickets raised by business users.
  • Analyze and resolve problems in technical and application logic.
  • Ensure high application performance by writing efficient code.


Ideal Candidate:

  • Strong .NET Senior Software Engineer Profile
  • Must have 5+ years of hands-on development experience with C#.NET, ASP.NET, ADO.NET.
  • Must have 3+ years of experience in web application development using HTML, CSS, and JavaScript/jQuery.
  • Must have strong experience writing complex SQL queries, stored procedures, and functions using Oracle / SQL Server.
  • Must have experience in designing, developing, and unit testing applications with SDLC compliance
  • Experience with AJAX, Crystal Reports, and front-end validations using JavaScript/jQuery.
  • ME/MTech (CS) or BE/BTech (CS).
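The SQL requirement above in a runnable miniature. This uses Python's built-in sqlite3 rather than the Oracle / SQL Server the posting names, and an invented orders table; the aggregation-plus-HAVING shape is the kind of logic the role would wrap in stored procedures.

```python
import sqlite3

# In-memory database so the example runs without a server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 40.0);
""")

# A "complex query" in miniature: aggregation plus a HAVING filter.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > 100
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 200.0)]
```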
Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
6 - 8 yrs
₹11L - ₹15L / yr
Angular
Software Development
JavaScript
TypeScript
HTML/CSS

Role & Responsibilities:

  • Design, develop, and unit test applications in accordance with established standards.
  • Prepare reports, manuals, and other documentation on the status, operation, and maintenance of software.
  • Analyze and resolve technical and application problems.
  • Adhere to high-quality development principles while delivering solutions on time.
  • Provide third-level support to business users.
  • Comply with process and quality management standards.
  • Understand and implement the SDLC process.


Ideal Candidate:

  • Strong Senior Angular Developer Profiles.
  • Must have 6+ years of experience in frontend development, with at least 4+ years in Angular 8+.
  • Must have strong proficiency in JavaScript, TypeScript, HTML5, and CSS3.
  • Must have strong test-driven development experience and proficiency in unit testing frameworks such as Jasmine, Karma, NUnit, Selenium.
  • Must have strong experience in database technologies (MySQL / SQL Server / Oracle)
  • Considering candidates from South India only.
  • Must have 2+ years of experience with Web APIs, Entity Framework, and LINQ queries.
  • Experience in .NET Core framework, OOP, and C# APIs.
  • Candidates from product companies preferred.
  • B.Tech./M.Tech in Computer Science (or related field).
learners point.org

Posted by Partha Sarathy
Bengaluru (Bangalore)
1 - 8 yrs
₹4L - ₹9.6L / yr
Power BI Desktop
DAX
Time Intelligence
Data Modelling
Power Query

Power BI Analyst – EdTech (UAE Market)

📍 Location: Bangalore (Onsite)

🕔 Working Days: 5 Days

🏢 Industry: EdTech – Professional Training & Certification Programs

🌍 Market Focus: UAE


About Us – Learners Point


Learners Point Academy is a leading professional training institute in the UAE, empowering working professionals and organizations through globally recognised certification programs such as CMA, PMP, ACCA, CIA, and other corporate training solutions.


With a strong presence in the UAE market, we specialise in career-focused education, enterprise workforce development, and high-impact learning solutions designed to drive measurable professional growth.

As we expand our analytics capabilities, we are looking for a skilled Power BI Analyst to support business intelligence and data-driven decision-making across our Professional Training Programs.


Role Overview


The Power BI Analyst will be responsible for transforming business, learner, and sales data into actionable dashboards and reports that enhance performance tracking, learner engagement, and revenue optimisation.


Key Responsibilities


  • Design, develop, and maintain interactive dashboards using Microsoft Power BI
  • Develop advanced reports using DAX, data modelling, and Power Query
  • Analyze training program performance (enrollments, retention, completion rates, revenue)
  • Build KPI dashboards for:
  • Sales & Reactivation Team
  • Academic & Training Team
  • Leadership & Management
  • Extract and manage data using SQL from databases and CRM systems
  • Automate reporting processes and ensure data accuracy
  • Translate business requirements into technical BI solutions
  • Present insights through clear and compelling data storytelling
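The program-performance analysis described above boils down to KPI arithmetic like the following. This is a simplified pure-Python sketch with invented enrollment records; in practice the data would come from the CRM/LMS via SQL and be surfaced as a Power BI measure.

```python
from collections import defaultdict

# Invented enrollment records for illustration.
enrollments = [
    {"program": "CMA", "completed": True},
    {"program": "CMA", "completed": False},
    {"program": "PMP", "completed": True},
    {"program": "PMP", "completed": True},
]

def completion_rates(records):
    """Completion rate per program: completed / enrolled."""
    total = defaultdict(int)
    done = defaultdict(int)
    for r in records:
        total[r["program"]] += 1
        done[r["program"]] += r["completed"]  # True counts as 1
    return {p: done[p] / total[p] for p in total}

print(completion_rates(enrollments))  # {'CMA': 0.5, 'PMP': 1.0}
```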


Required Technical Skills


  • Strong experience in Power BI (Desktop & Service)
  • Proficiency in:
  • DAX (Measures, Time Intelligence)
  • Data Modeling (Star & Snowflake Schema)
  • Power Query (ETL)
  • Good knowledge of SQL
  • Advanced Excel (Pivot Tables, Power Pivot, Lookup Functions)
  • Experience integrating data from CRM, LMS, or ERP systems


Industry-Specific Requirements (EdTech Focus)


  • Understanding of:
  • Learner engagement metrics
  • Course completion & drop-off analysis
  • Revenue per program
  • Student retention analytics
  • Experience working with Professional Certification Programs is an added advantage
  • Familiarity with UAE market reporting standards preferred


Preferred Skills


  • Exposure to Azure Data Services
  • Dashboard design best practices
  • Ability to manage large datasets
  • Strong analytical mindset with business understanding


Soft Skills


  • Strong communication & stakeholder management skills
  • Business-oriented thinking
  • Problem-solving mindset
  • Attention to detail


Experience & Qualification


  • Bachelor’s Degree in Computer Science, Data Analytics, Statistics, or related field
  • 2–8 years of experience as a Power BI / Data Analyst (EdTech preferred)
  • UAE or GCC market exposure is a plus


Wissen Technology

Posted by Janane Mohanasankaran
Pune, Mumbai, Bengaluru (Bangalore)
3 - 12 yrs
Best in industry
Python
pandas
Object Oriented Programming (OOPs)
SQL

JOB DESCRIPTION:


Location: Pune, Mumbai, Bangalore

Mode of Work: 3 days from office


* Python: Strong expertise in data workflows and automation

* Pandas: For detailed data analysis and validation

* SQL: Querying and performing operations on Delta tables

* AWS Cloud: Compute and storage services

* OOP concepts
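As a rough sketch of the validation step in such a data workflow: plain dicts stand in for the pandas DataFrames the role would actually use, and the records and rules are invented.

```python
# Invented trade records; two are deliberately bad.
rows = [
    {"trade_id": "T1", "qty": 100, "price": 10.5},
    {"trade_id": "T2", "qty": -5, "price": 9.8},   # bad: negative quantity
    {"trade_id": None, "qty": 20, "price": 11.0},  # bad: missing key
]

def validate(rows):
    """Split rows into clean and rejected, with a reason for each reject."""
    clean, rejects = [], []
    for r in rows:
        if not r.get("trade_id"):
            rejects.append((r, "missing trade_id"))
        elif r["qty"] <= 0:
            rejects.append((r, "non-positive qty"))
        else:
            clean.append(r)
    return clean, rejects

clean, rejects = validate(rows)
print(len(clean), len(rejects))  # 1 2
```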

Bengaluru (Bangalore)
5 - 8 yrs
₹30L - ₹40L / yr
Backend Development
Python
Java
SQL

Strong Senior Backend Engineer profiles

Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems

Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks: FastAPI / Django (Python), Spring (Java), or Express (Node.js).

Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework

Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization

Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices

Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)

Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D

Mandatory (Education) – Candidates from Tier-1 engineering institutes (IITs, BITS, etc.) are highly preferred.

Mirorin

Posted by Indrani Dutta
Bengaluru (Bangalore)
4 - 10 yrs
₹6L - ₹15L / yr
Data Analytics
Data Science
SQL
Tableau
OpenAI

Job Description: Data Analyst

 

About Miror

Miror is India’s leading FemTech platform transforming how women experience peri-menopause and menopause. In just a year, we’ve built India’s largest menopause-focused WhatsApp community, partnered with the National Health Mission and the Indian Menopause Society, and launched category-defining nutraceutical products and digital health services. Our app blends science and technology—offering personalized care pathways, symptom tracking, diagnostic links, games, AI-powered chat, expert consultations, and more. We're proud recipients of the Innovation in Menopause Care award at the Global Women’s Health Innovation Conference 2024 and are rapidly scaling toward our $1B+ vision. Learn more: miror.in

 

Role Overview

We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.

You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.

 

Key Responsibilities

  • Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).
  • Build dashboards and reports to track KPIs, user behaviour, and marketing performance.
  • Collaborate with product, marketing, and customer teams to uncover actionable insights.
  • Support experiments, A/B testing, and cohort analysis to drive growth and retention.
  • Assist in documenting and communicating findings to technical and non-technical teams.
  • Work with the data team to enhance personalization and AI features (optional).

 

Required Qualifications

  • Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.
  • 2–4 years of experience in data analysis or business intelligence.
  • Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).
  • Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.).
  • Ability to translate complex data into simple visual stories and clear recommendations.
  • Strong attention to detail and a mindset for experimentation.

 

Preferred (Not Mandatory)

  • Exposure to GenAI, LLMs (e.g., OpenAI, Hugging Face), or NLP concepts.
  • Experience working with healthcare, wellness, or e-commerce datasets.
  • Familiarity with REST APIs, JSON structures, or chatbot systems.
  • Interest in building tools that impact women’s health and wellness.

 

Why Join Us?

  • Be part of a high-growth startup tackling a real need in women’s healthcare.
  • Work with a passionate, purpose-driven team.
  • Opportunity to grow into GenAI/ML-focused roles as we scale.
  • Competitive salary and career progression.

 

 


Optimo Capital

Posted by Shantanu Palwe
Bengaluru (Bangalore)
0 - 3 yrs
₹20000 - ₹30000 / mo
MS-Excel
SQL
Python
pandas
Data Analytics

Job Description: Data Analyst Intern


Location: On-site, Bangalore

Duration: 6 months (Full-time)


About us:


  • Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).
  • Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making). As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.


What we offer:


  • Join our dynamic startup team and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit features analytics, collections, and portfolio management.
  • The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.
  • This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment.
  • We believe that the freedom and accountability to make decisions in analytics and technology brings out the best in you and helps us build the best for the company.
  • This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.


What we look for:


  • We are looking for individuals with a strong analytical mindset, high levels of initiative / ownership, ability to drive tasks independently, clear communication and comfort working across teams.
  • We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team.
  • We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems.


Key Responsibilities:

  • Conduct analytical deep-dives such as funnel analysis, cohort tracking, branch-wise performance reviews, TAT analysis, portfolio diagnostics, and credit risk analytics that lead to clear actions.
  • Work closely with stakeholders to convert business questions into measurable analyses and clearly communicated outputs.
  • Support digital underwriting initiatives, including assisting in the development and analysis of underwriting APIs that enable decisioning on borrower eligibility (“whom to lend”) and exposure sizing (“how much to lend”).
  • Develop and maintain periodic MIS and KPI reporting for key business functions (e.g., pipeline, disbursals, TAT, conversion, collections performance, portfolio trends).
  • Use Python (pandas, numpy) to clean, transform, and analyse datasets; automate recurring reports and data workflows.
  • Perform basic scripting to support data validation, extraction, and lightweight automation.
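Cohort tracking, mentioned above, reduces to grouping customers by signup period and counting who remains active in later periods. A toy sketch with invented events:

```python
from collections import defaultdict

# Each record is (customer_id, signup_cohort, active_month); data is invented.
events = [
    ("c1", "2024-01", "2024-01"), ("c1", "2024-01", "2024-02"),
    ("c2", "2024-01", "2024-01"),
    ("c3", "2024-02", "2024-02"), ("c3", "2024-02", "2024-03"),
]

def retention(events):
    """For each signup cohort, count distinct customers active in each month."""
    table = defaultdict(set)
    for cust, cohort, month in events:
        table[(cohort, month)].add(cust)  # sets deduplicate repeat activity
    return {k: len(v) for k, v in table.items()}

print(retention(events))
# {('2024-01', '2024-01'): 2, ('2024-01', '2024-02'): 1,
#  ('2024-02', '2024-02'): 1, ('2024-02', '2024-03'): 1}
```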


Required Skills and Qualifications:

  • Strong proficiency in Excel, including pivots, lookup functions, data cleaning, and structured analysis.
  • Strong working knowledge of SQL, including joins, aggregations, CTEs, and window functions.
  • Proficiency in Python for data analysis (pandas, numpy); ability to write clean, maintainable scripts/notebooks.
  • Strong logical reasoning and attention to detail, including the ability to identify errors and validate results rigorously.
  • Ability to work with ambiguous requirements and imperfect datasets while maintaining output quality.
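The SQL skills listed above (joins, aggregations, CTEs, window functions) in a runnable miniature, using Python's bundled sqlite3 and an invented disbursals table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE disbursals (branch TEXT, month TEXT, amount REAL);
    INSERT INTO disbursals VALUES
        ('HSR', '2024-01', 10.0), ('HSR', '2024-02', 14.0),
        ('JPN', '2024-01', 8.0),  ('JPN', '2024-02', 6.0);
""")

# CTE + window function: running disbursal total per branch.
rows = conn.execute("""
    WITH monthly AS (
        SELECT branch, month, SUM(amount) AS amt
        FROM disbursals GROUP BY branch, month
    )
    SELECT branch, month,
           SUM(amt) OVER (PARTITION BY branch ORDER BY month) AS running_total
    FROM monthly ORDER BY branch, month
""").fetchall()
print(rows)
# [('HSR', '2024-01', 10.0), ('HSR', '2024-02', 24.0),
#  ('JPN', '2024-01', 8.0), ('JPN', '2024-02', 14.0)]
```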



Preferred (Good to Have):

  • REST APIs: A fundamental understanding of APIs and previous experience or projects related to API development/integrations.
  • Familiarity with basic AWS tools/services: S3, Lambda, EC2, Glue jobs.
  • Experience with Git and basic engineering practices.
  • Any experience with the lending/finance industry.
Quantiphi

Posted by Nikita Sinha
Bengaluru (Bangalore)
7 - 20 yrs
Up to ₹40L / yr (varies)
Amazon Web Services (AWS)
PySpark
SQL

We are hiring an Associate Technical Architect with strong expertise in AWS-based Data Platforms to design scalable data lakes, warehouses, and enterprise data pipelines while working with global teams.


Key Responsibilities

  • Design and implement scalable data warehouse, data lake, and lakehouse architectures on AWS
  • Build resilient and modular data pipelines using native AWS services
  • Architect cloud-based data platforms and evaluate service trade-offs
  • Optimize large-scale data processing and query performance
  • Collaborate with global cross-functional teams (Engineering, QA, PMs, Stakeholders)
  • Communicate technical roadmap, risks, and mitigation strategies

Must-Have Skills

  • 8+ years of experience in AWS Data Engineering / Data Architecture
  • Hands-on experience with AWS services:
  • Amazon S3
  • AWS Glue
  • AWS Lambda
  • Amazon EMR
  • AWS Kinesis (Streams & Firehose)
  • AWS Step Functions / MWAA
  • Amazon Redshift (Spectrum & Serverless)
  • Amazon Athena
  • Amazon RDS
  • AWS Lake Formation
  • AWS DMS, EventBridge, SNS, SQS
  • Strong programming skills in Python & PySpark
  • Advanced SQL with query optimization & performance tuning
  • Deep understanding of:
  • MPP databases
  • Partitioning & indexing strategies
  • Data modeling (Dimensional, Normalized, Lakehouse)
  • Experience building resilient ETL/data pipelines
  • Knowledge of AWS fundamentals:
  • Security
  • Networking
  • Disaster Recovery
  • Scalability & resilience
  • Experience with on-prem → AWS migrations
  • AWS Certification (Solution Architect Associate / Data Engineer Associate)
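The indexing and partitioning strategies listed above can be seen at toy scale: the same query flips from a full table scan to an index search once an index exists. This sketch uses Python's bundled sqlite3 with invented table and index names; on Redshift or Athena the analogue would be sort keys or partition columns.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_date TEXT, payload TEXT)")

def plan(sql):
    """Return the query-plan detail string for a statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][-1]

query = "SELECT * FROM events WHERE event_date = '2024-06-01'"
before = plan(query)  # a full-table scan: no index to use yet
conn.execute("CREATE INDEX idx_events_date ON events(event_date)")
after = plan(query)   # now an index search via idx_events_date
print(before)
print(after)
```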

Good-to-Have Skills

  • Domain experience: FSI / Retail / CPG
  • Data governance & virtualization tools:
  • Collibra
  • Denodo
  • QuickSight / Power BI / Tableau
  • Exposure to:
  • Terraform (IaC)
  • CI/CD pipelines
  • SSIS
  • Apache NiFi, Hive, HDFS, Sqoop
  • Data Mesh architecture
  • Experience with NoSQL databases:
  • DynamoDB
  • MongoDB
  • DocumentDB

Soft Skills

  • Strong problem-solving and analytical mindset
  • Excellent communication and stakeholder management skills
  • Ability to translate technical concepts into business outcomes
  • Experience working with distributed/global teams
Quantiphi

Posted by Nikita Sinha
Bengaluru (Bangalore)
3 - 6 yrs
Up to ₹28L / yr (varies)
Amazon Web Services (AWS)
Data engineering
PySpark
SQL
Data migration

As a Senior Data Engineer, you will be responsible for building and delivering a Lakehouse-based data pipeline. This is a hands-on role focused on implementing real-time and batch data ingestion, processing, and delivery workflows, while ensuring strong monitoring, observability, and data quality across the entire pipeline.

Must-Have Skills

  • 3+ years of hands-on experience building large-scale data pipelines
  • Strong experience with Spark Streaming, AWS Glue, and EMR for real-time and batch processing
  • Proficiency in PySpark/Python, including building Kafka producers for data ingestion
  • Experience working with Confluent Kafka and Spark Streaming for ingestion from on-premise sources
  • Solid understanding of AWS services including:
  1. S3
  2. Redshift
  3. Glue
  4. CloudWatch
  5. Secrets Manager
  • Experience working with Medallion Architecture and hybrid data destinations (e.g., Redshift + on-prem Oracle)
  • Ability to implement monitoring dashboards and observability using tools like CloudWatch or Datadog
  • Strong SQL skills for data validation and job-level metrics development
  • Experience building alerting mechanisms for pipeline failures and performance issues
  • Strong collaboration and communication skills
  • Proven ownership mindset — driving deliverables from design to deployment
  • Experience mentoring junior engineers, conducting code reviews, and guiding best practices
  • AWS Certified Data Engineer – Associate (preferred/required)
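The monitoring and alerting requirements above amount to job-level checks like this simplified sketch. Metric names and thresholds are invented; in production the results would feed CloudWatch alarms or Datadog monitors rather than a returned list.

```python
def check_job_metrics(metrics, max_lag_s=300, min_rows=1):
    """Return a list of alert strings for a batch/streaming job run."""
    alerts = []
    if metrics["rows_written"] < min_rows:
        alerts.append(f"rows_written={metrics['rows_written']} below minimum {min_rows}")
    if metrics["consumer_lag_s"] > max_lag_s:
        alerts.append(f"consumer lag {metrics['consumer_lag_s']}s exceeds {max_lag_s}s")
    return alerts

healthy = {"rows_written": 120_000, "consumer_lag_s": 12}
stalled = {"rows_written": 0, "consumer_lag_s": 900}
print(check_job_metrics(healthy))  # []
print(check_job_metrics(stalled))  # two alerts: no rows, excessive lag
```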

Good-to-Have Skills

  • Experience with orchestration tools such as Apache Airflow or AWS Step Functions
  • Exposure to Big Data ecosystem tools:
  1. Sqoop
  2. HDFS
  3. Hive
  4. NiFi
  • Exposure to Terraform for infrastructure automation
  • Familiarity with CI/CD pipelines for data workflows
TalentXO

Posted by Tabbasum Shaikh
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Backend Development
Python
Java
SQL


Role & Responsibilities

As a Founding Engineer, you'll join the engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.

This role is ideal for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems require creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

Key Responsibilities

  • Build core platform features: Develop robust APIs, services, and integrations that power billing automation and revenue recognition capabilities.
  • Work across the full stack: Contribute to backend services and frontend interfaces to ensure seamless user experiences.
  • Implement critical integrations: Connect the platform with external systems including CRMs, data warehouses, ERPs, and payment processors.
  • Optimize for scale: Design systems that handle complex pricing models, high-volume usage data, and real-time financial calculations.
  • Drive quality and best practices: Write clean, maintainable code and participate in code reviews and architectural discussions.
  • Solve complex problems: Debug issues across the stack and collaborate with cross-functional teams to address evolving client needs.
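As a flavor of the "complex pricing models" mentioned above, a simplified sketch of tiered usage-based billing; the tier boundaries and rates are invented, not the platform's actual pricing.

```python
# (upper bound of tier, rate per unit); the last tier is unbounded.
TIERS = [(10_000, 0.00), (100_000, 0.002), (float("inf"), 0.001)]

def usage_charge(units: int) -> float:
    """Charge for a billing period: each tier prices only the units
    that fall within it (graduated, not flat, pricing)."""
    charge, lower = 0.0, 0
    for upper, rate in TIERS:
        in_tier = max(0, min(units, upper) - lower)
        charge += in_tier * rate
        lower = upper
    return round(charge, 2)

print(usage_charge(5_000))    # 0.0 (entirely inside the free tier)
print(usage_charge(150_000))  # 230.0 (90k units at 0.002 + 50k at 0.001)
```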

The Impact You'll Make

  • Power business growth: Enable fast-growing B2B companies to scale billing and revenue operations efficiently.
  • Build critical financial infrastructure: Contribute to systems handling high-value transactions with accuracy and compliance.
  • Shape product direction: Join during a scaling phase where your contributions directly impact product evolution and customer success.
  • Accelerate your expertise: Gain deep exposure to financial systems, B2B SaaS operations, and enterprise-grade software development.
  • Drive the future of B2B commerce: Help build infrastructure supporting next-generation pricing models, from usage-based to value-based billing.

Ideal Candidate Profile

Experience

  • 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems.
  • Strong backend development experience using one or more frameworks: FastAPI / Django (Python), Spring (Java), or Express (Node.js).
  • Deep understanding of relevant libraries, tools, and best practices within the chosen backend framework.
  • Strong experience with databases (SQL & NoSQL), including efficient data modeling and performance optimization.
  • Proven experience designing, building, and maintaining APIs, services, and backend systems with solid system design and clean code practices.

Domain

  • Experience with financial systems, billing platforms, or fintech applications is highly preferred.

Company Background

  • Experience working in product companies or startups (preferably Series A to Series D).

Education

  • Candidates from Tier 1 engineering institutes (IITs, BITS, etc.) are highly preferred.



CloudThat

Posted by Shubhangi Shrivastava
Bengaluru (Bangalore)
3 - 6 yrs
₹7L - ₹10L / yr
HTML/CSS
Python
Java
SQL
C++

About CloudThat:-

At CloudThat, we are driven by our mission to empower professionals and businesses to harness the full potential of cloud technologies. As a leader in cloud training and consulting services in India, our core values guide every decision we make and every customer interaction we have.


Role Overview:-

We are looking for a passionate and experienced Technical Trainer to join our expert team and help drive knowledge adoption across our customers, partners, and internal teams.


Key Responsibilities:

• Deliver high-quality, engaging technical training sessions both in-person and virtually to customers, partners, and internal teams.

• Design and develop training content, labs, and assessments based on business and technology requirements.

• Collaborate with internal and external SMEs to draft course proposals aligned with customer needs and current market trends.

• Assist in training and onboarding of other trainers and subject matter experts to ensure quality delivery of training programs.

• Create immersive lab-based sessions using diagrams, real-world scenarios, videos, and interactive exercises.

• Develop instructor guides, certification frameworks, learner assessments, and delivery aids to support end-to-end training delivery.

• Integrate hands-on project-based learning into courses to simulate practical environments and deepen understanding.

• Support the interpersonal and facilitation aspects of training, fostering an inclusive, engaging, and productive learning environment.


Skills & Qualifications:

• Experience developing content for professional certifications or enterprise skilling programs.

• Familiarity with emerging technology areas such as cloud computing, AI/ML, DevOps, or data engineering.


Technical Competencies:

  • Expertise in languages such as C, C++, Python, and Java
  • Understanding of algorithms and data structures
  • Expertise in SQL

Or apply directly: https://cloudthat.keka.com/careers/jobdetails/95441


Read more
AI Industry

Agency job via Peak Hire Solutions by Dhara Thakkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Gurugram
5 - 17 yrs
₹34L - ₹45L / yr
Dremio
Data engineering
Business Intelligence (BI)
Tableau
PowerBI
+51 more

Review Criteria:

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Role & Responsibilities:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Read more
Bengaluru (Bangalore)
4 - 6 yrs
₹8L - ₹14L / yr
React.js
Node.js
HTML/CSS
JavaScript
SQL
+2 more


Key Responsibilities :


- Develop backend services using Node.js, including API orchestration and integration with AI/ML services.


- Implement frontend redaction features using Redact.js, integrated into React.js dashboards.


- Collaborate with AI/ML engineers to embed intelligent feedback and behavioral analysis.


- Build secure, multi-tenant systems with role-based access control (RBAC).


- Optimize performance for real-time audio analysis and transcript synchronization.


- Participate in agile grooming sessions and contribute to architectural decisions.


Required Skills :


- Experience with Redact.js or similar annotation/redaction libraries.


- Strong understanding of RESTful APIs, React.js, and Material-UI.


- Familiarity with Azure services, SQL, and authentication protocols (SSO, JWT).


- Experience with secure session management and data protection standards.
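The JWT-based authentication mentioned above can be illustrated with a minimal sketch using only the Python standard library. This is not a production implementation — real services would use a maintained library (e.g. PyJWT) and SSO-provider-issued keys — and the token claims and secret here are invented for the example:

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(data: str) -> bytes:
    # JWTs use unpadded base64url; restore padding before decoding.
    return base64.urlsafe_b64decode(data + "=" * (-len(data) % 4))

def make_token(claims: dict, secret: bytes) -> str:
    """Build an HS256-signed JWT (header.payload.signature)."""
    enc = lambda obj: base64.urlsafe_b64encode(json.dumps(obj).encode()).rstrip(b"=").decode()
    signing_input = enc({"alg": "HS256", "typ": "JWT"}) + "." + enc(claims)
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + base64.urlsafe_b64encode(sig).rstrip(b"=").decode()

def verify_hs256(token: str, secret: bytes) -> dict:
    """Verify the signature and return the claims, or raise ValueError."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))

token = make_token({"sub": "user-42", "role": "agent"}, b"demo-secret")
print(verify_hs256(token, b"demo-secret")["sub"])  # → user-42
```

The constant-time `hmac.compare_digest` matters here: comparing signatures with `==` can leak timing information.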


Preferred Qualifications :


- Exposure to AI/ML workflows and Python-based services.


- Experience with Livekit or similar real-time communication platforms.


- Familiarity with Power BI and accessibility standards (WCAG).


Soft Skills :


- Problem-solving mindset and adaptability.


- Ability to work independently and meet tight deadlines.

Read more
Technology Industry

Agency job via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹53L - ₹70L / yr
Java
Microservices
CI/CD
MySQL
Scripting
+5 more

JOB DETAILS:

* Job Title: Engineering Manager

* Industry: Technology

* Salary: Best in Industry

* Experience: 9-12 years

* Location: Bengaluru

* Education: B.Tech in computer science or related field from Tier 1, Tier 2 colleges


Role & Responsibilities

We are seeking a visionary and decisive Engineering Manager to join our dynamic team. In this role, you will lead and inspire a talented team of software engineers, driving innovation and excellence in product development efforts. This is an exciting opportunity to influence and shape the future of our engineering organization.

 

Key Responsibilities-

As an Engineering Manager, you will be responsible for managing the overall software development life cycle of one product. You will work and manage a cross-functional team consisting of Backend Engineers, Frontend Engineers, QA, SDET, Product Managers, Product Designers, Technical Project Managers, Data Scientists, etc.

  • Responsible for mapping business objectives to an optimum engineering structure, including correct estimation of resource allocation.
  • Responsible for key technical and product decisions. Provide direction and mentorship to the team. Set up best practices for engineering.
  • Work closely with the Product Manager and help them in getting relevant inputs from the engineering team.
  • Plan and track the development and release schedules, proactively assess and mitigate risks. Prepare for contingencies and provide visible leadership in crisis.
  • Conduct regular 1:1s for performance feedback and lead their appraisals.
  • Responsible for driving good coding practices in the team like good quality code, documentation, timely bug fixing, etc.
  • Report on the status of development, quality, operations, and system performance to management.
  • Create and maintain an open and transparent environment that values speed and innovation and motivates engineers to build innovative and effective systems rapidly.


Ideal Candidate

  • Strong Engineering Manager / Technical Leadership Profile
  • Must have 9+ years of experience in software engineering with experience building complex, large-scale products
  • Must have 2+ years of experience as an Engineering Manager / Tech Lead with people management responsibilities
  • Strong technical foundation with hands-on experience in Java (or equivalent compiled language), scripting languages, web technologies, and databases (SQL/NoSQL)
  • Proven ability to solve large-scale technical problems and guide teams on architecture, design, quality, and best practices
  • Experience in leading cross-functional teams, planning and tracking delivery, mentoring engineers, conducting performance reviews, and driving engineering excellence
  • Must have strong experience working with Product Managers, UX designers, QA, and other cross-functional partners
  • Excellent communication and interpersonal skills to influence technical direction and stakeholder decisions
  • (Company): Product companies
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in computer science or related field from Tier 1, Tier 2 colleges
Read more
Deqode

Posted by Purvisha Bhavsar
Pune, Delhi, Kolkata, Bengaluru (Bangalore), Kochi (Cochin), Hosur, Trivandrum
7 - 9 yrs
₹5.5L - ₹20L / yr
.NET
Amazon Web Services (AWS)
C#
React.js
SQL

Job Description -

Profile: .Net Full Stack Lead

Experience Required: 7–12 Years

Location: Pune, Bangalore, Chennai, Coimbatore, Delhi, Hosur, Hyderabad, Kochi, Kolkata, Trivandrum

Work Mode: Hybrid

Shift: Normal Shift

Key Responsibilities:

  • Design, develop, and deploy scalable microservices using .NET Core and C#
  • Build and maintain serverless applications using AWS services (Lambda, SQS, SNS)
  • Develop RESTful APIs and integrate them with front-end applications
  • Work with both SQL and NoSQL databases to optimize data storage and retrieval
  • Implement Entity Framework for efficient database operations and ORM
  • Lead technical discussions and provide architectural guidance to the team
  • Write clean, maintainable, and testable code following best practices
  • Collaborate with cross-functional teams to deliver high-quality solutions
  • Participate in code reviews and mentor junior developers
  • Troubleshoot and resolve production issues in a timely manner

Required Skills & Qualifications:

  • 7–12 years of hands-on experience in .NET development
  • Strong proficiency in .NET Framework, .NET Core, and C#
  • Proven expertise with AWS services (Lambda, SQS, SNS)
  • Solid understanding of SQL and NoSQL databases (SQL Server, MongoDB, DynamoDB, etc.)
  • Experience building and deploying Microservices architecture
  • Proficiency in Entity Framework or EF Core
  • Strong knowledge of RESTful API design and development
  • Experience with React or Angular is good to have
  • Understanding of CI/CD pipelines and DevOps practices
  • Strong debugging, performance optimization, and problem-solving skills
  • Experience with design patterns, SOLID principles, and best coding practices
  • Excellent communication and team leadership skills


Read more
Truetech Solutions

Agency job via TrueTech Solutions by Meimozhi Balu
Bengaluru (Bangalore), Kochi (Cochin)
4 - 15 yrs
₹10L - ₹25L / yr
.NET
ASP.NET
Amazon Web Services (AWS)
Amazon EC2
AWS Lambda
+2 more

• Minimum 4+ years of experience

• Experience in designing, developing, and maintaining backend services using C# 12 and .NET 8 or .NET 9

• Experience in building and operating cloud native and serverless applications on AWS

• Experience in developing and integrating services using AWS Lambda, API Gateway, DynamoDB, EventBridge, CloudWatch, SQS, SNS, Kinesis, Secrets Manager, S3 storage, serverless architectural models, etc.

• Experience in integrating services using the AWS SDK

• Should be cognizant of the OMS paradigms including Inventory Management, Inventory publish, supply feed processing, control mechanisms, ATP publish, Order Orchestration, workflow set up and customizations, integrations with tax, AVS, payment engines, sourcing algorithms and managing reservations with back orders, schedule mechanisms, flash sales management etc.

• Should have decent end-to-end knowledge of the various commerce subsystems, which include Storefront, core commerce back end, post-purchase processing, OMS, store/warehouse management processes, and supply chain and logistics processes. This is to ascertain the candidate's know-how of the overall retail landscape of any customer.

• Strong knowledge of querying Oracle DB and SQL Server

• Able to read, write, and manage PL/SQL procedures in Oracle.

• Strong debugging, performance tuning, and problem-solving skills

• Experience with event-driven and microservices architectures

Read more
Technology Industry

Agency job via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
9 - 12 yrs
₹50L - ₹70L / yr
Java
Microservices
CI/CD
MySQL
MySQL DBA
+9 more

Job Details

Job Title: Staff Engineer

Industry: Technology

Domain - Information technology (IT)

Experience Required: 9-12 years

Employment Type: Full Time

Job Location: Bengaluru

CTC Range: Best in Industry

 

Role & Responsibilities

As a Staff Engineer at company, you will play a critical role in defining and driving our backend architecture as we scale globally. You’ll own key systems that handle high volumes of data and transactions, ensuring performance, reliability, and maintainability across distributed environments.

 

Key Responsibilities-

  • Own one or more core applications end-to-end, ensuring reliability, performance, and scalability.
  • Lead the design, architecture, and development of complex, distributed systems, frameworks, and libraries aligned with company’s technical strategy.
  • Drive engineering operational excellence by defining robust roadmaps for system reliability, observability, and performance improvements.
  • Analyze and optimize existing systems for latency, throughput, and efficiency, ensuring they perform at scale.
  • Collaborate cross-functionally with Product, Data, and Infrastructure teams to translate business requirements into technical deliverables.
  • Mentor and guide engineers, fostering a culture of technical excellence, ownership, and continuous learning.
  • Establish and uphold coding standards, conduct design and code reviews, and promote best practices across teams.
  • Stay ahead of the curve on emerging technologies, frameworks, and patterns to strengthen company’s technology foundation.
  • Contribute to hiring by identifying and attracting top-tier engineering talent.

 

Ideal Candidate

  • Strong staff engineer profile
  • Must have 9+ years in backend engineering with Java, Spring/Spring Boot, and microservices, building large and scalable systems
  • Must have been SDE-3 / Tech Lead / Lead SE for at least 2.5 years
  • Strong in DSA, system design, design patterns, and problem-solving
  • Proven experience building scalable, reliable, high-performance distributed systems
  • Hands-on with SQL/NoSQL databases, REST/gRPC APIs, concurrency & async processing
  • Experience in AWS/GCP, CI/CD pipelines, and observability/monitoring
  • Excellent ability to explain complex technical concepts to varied stakeholders
  • Product companies (B2B SAAS preferred)
  • Must have stayed for at least 2 years with each of the previous companies
  • (Education): B.Tech in computer science from Tier 1, Tier 2 colleges


Read more
Technology Industry

Agency job via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
2 - 5 yrs
₹4L - ₹5L / yr
DevOps
Windows Azure
CI/CD
MySQL
Python
+12 more

JOB DETAILS:

* Job Title: DevOps Engineer (Azure)

* Industry: Technology

* Salary: Best in Industry

* Experience: 2-5 years

* Location: Bengaluru, Koramangala

Review Criteria

  • Strong Azure DevOps Engineer Profiles.
  • Must have minimum 2+ years of hands-on experience as an Azure DevOps Engineer with strong exposure to Azure DevOps Services (Repos, Pipelines, Boards, Artifacts).
  • Must have strong experience in designing and maintaining YAML-based CI/CD pipelines, including end-to-end automation of build, test, and deployment workflows.
  • Must have hands-on scripting and automation experience using Bash, Python, and/or PowerShell
  • Must have working knowledge of databases such as Microsoft SQL Server, PostgreSQL, or Oracle Database
  • Must have experience with monitoring, alerting, and incident management using tools like Grafana, Prometheus, Datadog, or CloudWatch, including troubleshooting and root cause analysis

 

Preferred

  • Knowledge of containerisation and orchestration tools such as Docker and Kubernetes.
  • Knowledge of Infrastructure as Code and configuration management tools such as Terraform and Ansible.
  • Preferred (Education) – BE/BTech / ME/MTech in Computer Science or related discipline

 

Role & Responsibilities

  • Build and maintain Azure DevOps YAML-based CI/CD pipelines for build, test, and deployments.
  • Manage Azure DevOps Repos, Pipelines, Boards, and Artifacts.
  • Implement Git branching strategies and automate release workflows.
  • Develop scripts using Bash, Python, or PowerShell for DevOps automation.
  • Monitor systems using Grafana, Prometheus, Datadog, or CloudWatch and handle incidents.
  • Collaborate with dev and QA teams in an Agile/Scrum environment.
  • Maintain documentation, runbooks, and participate in root cause analysis.

 

Ideal Candidate

  • 2–5 years of experience as an Azure DevOps Engineer.
  • Strong hands-on experience with Azure DevOps CI/CD (YAML) and Git.
  • Experience with Microsoft Azure (OCI/AWS exposure is a plus).
  • Working knowledge of SQL Server, PostgreSQL, or Oracle.
  • Good scripting, troubleshooting, and communication skills.
  • Bonus: Docker, Kubernetes, Terraform, Ansible experience.
  • Comfortable with work-from-office (Koramangala, Bangalore).


Read more
Auxo AI

Posted by Kritika Dhingra
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
2 - 8 yrs
₹10L - ₹30L / yr
Amazon Web Services (AWS)
Data Transformation Tool (DBT)
SQL
Python
Spark
+1 more

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-7 years of prior experience in data engineering, with a strong background in working on modern data platforms. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.


Location : Bangalore, Hyderabad, Mumbai, and Gurgaon


Responsibilities:

· Designing, building, and operating scalable on-premises or cloud data architecture

· Analyzing business requirements and translating them into technical specifications

· Design, develop, and implement data engineering solutions using DBT on cloud platforms (Snowflake, Databricks)

· Design, develop, and maintain scalable data pipelines and ETL processes

· Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.

· Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness

· Implement data governance and security best practices to ensure compliance and data integrity

· Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring

· Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.


Requirements


· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

· Overall 3+ years of prior experience in data engineering, with a focus on designing and building data pipelines

· Experience of working with DBT to implement end-to-end data engineering processes on Snowflake and Databricks

· Comprehensive understanding of the Snowflake and Databricks ecosystem

· Strong programming skills in SQL, and in Python or PySpark.

· Experience with data modeling, ETL processes, and data warehousing concepts.

· Familiarity with implementing CI/CD processes or other orchestration tools is a plus.
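As a rough illustration of the DBT-style layered modeling this role involves (raw source → staging → mart), here is a minimal sketch using Python's built-in sqlite3 as a stand-in for a warehouse like Snowflake or Databricks; the table and column names are invented for the example:

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, 'acme', 120.0, 'paid'),
        (2, 'acme', 80.0, 'refunded'),
        (3, 'globex', 50.0, 'paid');

    -- Staging layer: cleaned, filtered source data (a DBT "staging" model).
    CREATE VIEW stg_orders AS
        SELECT order_id, customer, amount FROM raw_orders WHERE status = 'paid';

    -- Mart layer: aggregated, analytics-ready relation (a DBT "mart" model).
    CREATE VIEW mart_revenue_by_customer AS
        SELECT customer, SUM(amount) AS revenue, COUNT(*) AS orders
        FROM stg_orders GROUP BY customer;
""")

for row in conn.execute("SELECT * FROM mart_revenue_by_customer ORDER BY customer"):
    print(row)  # → ('acme', 120.0, 1) then ('globex', 50.0, 1)
```

In a real DBT project each layer would be its own SQL model file with tests and documentation; the point here is only the raw → staging → mart layering.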


Read more
Technology Industry

Agency job via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 8 yrs
₹26L - ₹35L / yr
Python
Java
SQL
FastAPI
Django
+5 more

Review Criteria

  • Strong Senior Backend Engineer profiles
  • Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
  • Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js).
  • Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
  • Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
  • Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
  • Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
  • (Company) – Must have worked in product companies / startups, preferably Series A to Series D
  • (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred

 

Role & Responsibilities

As a Founding Engineer at company, you'll join our engineering team during an exciting growth phase, contributing to a platform that handles complex financial operations for B2B companies. You'll work on building scalable systems that automate billing, usage metering, revenue recognition, and financial reporting—directly impacting how businesses manage their revenue operations.

This role is perfect for someone who thrives in a dynamic startup environment where requirements evolve quickly and problems need creative solutions. You'll work on diverse technical challenges, from API development to external integrations, while collaborating with senior engineers, product managers, and customer success teams.

 

Key Responsibilities-

  • Build core platform features: Develop robust APIs, services, and integrations that power company’s billing automation and revenue recognition capabilities
  • Work across the full stack: Contribute to both backend services and frontend interfaces, ensuring seamless user experiences
  • Implement critical integrations: Connect company with external systems including CRMs, data warehouses, ERPs, and payment processors
  • Optimize for scale: Build systems that handle complex pricing models, high-volume usage data, and real-time financial calculations
  • Drive quality and best practices: Write clean, maintainable code while participating in code reviews and architectural discussions
  • Solve complex problems: Debug issues across the stack and work closely with teams to address evolving client needs

 

The Impact You'll Make-

  • Power business growth: Your code will directly enable billing and revenue operations for fast-growing B2B companies, helping them scale without operational bottlenecks
  • Build critical financial infrastructure: Contribute to systems handling millions in transactions while ensuring accurate, compliant revenue recognition
  • Shape product direction: Join during our scaling phase where your contributions immediately impact product evolution and customer success
  • Accelerate your expertise: Gain deep knowledge in financial systems, B2B SaaS operations, and enterprise software while working with industry veterans
  • Drive the future of B2B commerce: Help create infrastructure powering next-generation pricing models from usage-based to value-based billing.

 

 

Read more
Bengaluru (Bangalore)
5 - 8 yrs
₹27L - ₹40L / yr
Python
Java
SQL

Strong Senior Backend Engineer profiles

Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems

Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js).

Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework

Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization

Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices

Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)

Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D

Mandatory (Education) – Candidates from top engineering institutes (IITs, BITS, or equivalent Tier-1 colleges) are preferred

Read more
Estuate Software

Posted by Ariba Khan
Remote, Bengaluru (Bangalore)
8 - 12 yrs
Up to ₹30L / yr (varies)
SQL
confluence
Business Analysis
User Research

About the company:

At Estuate, more than 400 uniquely talented people work together to provide the world with next-generation product engineering and IT enterprise services. We help companies reimagine their business for the digital age.

Incorporated in 2005 in Milpitas (CA), we have grown to become a global organization with a truly global vision. At Estuate, we bring together talent, experience, and technology to meet our customer’s needs. Our ‘Extreme Service’ culture helps us deliver extraordinary results.


Our key to success:

We are an ISO-certified organization present across four distinct global geographies. We cater to industry verticals such as BFSI, Healthcare & Pharma, Retail & E-Commerce, and ISVs/Startups, and have over 2,000 projects in our portfolio.

Our solution-oriented mindset fuels our offerings, including Platform Engineering, Business Apps, and Enterprise Security & GRC.


Our culture of oneness

At Estuate, we are committed to fostering an inclusive workplace that welcomes people from diverse social circumstances. Our diverse culture shapes our success stories. Our values unite us. And, our curiosity inspires our creativity. Now, if that sounds like the place you’d like to be, we look forward to hearing more from you.


Requirements:

Technical skills

  • 8+ years of experience as a Business, System, or Functional Analyst;
  • Proficient in writing User Stories, Use Cases, Functional and Non-Functional requirements, system diagrams, wireframes;
  • Experience working with RESTful APIs (writing requirements, API usage);
  • Experience with microservices architecture;
  • Experience working with Agile methodologies (Scrum, Kanban);
  • Knowledge of SQL;
  • Knowledge of UML, BPMN;
  • Understanding of key UX/UI practices and processes;
  • Understanding of the software development lifecycle;
  • Understanding of the architecture of web-based applications;
  • English Upper-Intermediate or higher.

 

Soft Skills

  • Excellent communication and presentation skills;
  • Proactiveness;
  • Organized, detail-oriented with ability to keep overall solution in mind;
  • Comfort working in a fast-paced environment, running concurrent projects, and managing BA work with multiple stakeholders;
  • Good time-management skills and the ability to handle multitasking.


Good to haves

  • Experience in enterprise software development or the finance domain;
  • Experience in delivery of desktop and web applications;
  • Experience with successful system integration projects.

 

Responsibilities:

  • Participation in discovery phases and workshops with Customer, covering key business and product requirements;
  • Manage project scope, requirements management and their impact on existing requirements, defining dependencies on other teams;
  • Creating business requirements, user stories, mockups, functional specifications and technical requirements (incl. flow diagrams, data mappings, examples);
  • Close collaboration with development team (requirements presentation, backlog grooming, requirements change management, technical solution design together with Tech Lead, etc.);
  • Regular communication with internal (Product, Account management, Business teams) and external stakeholders (Partners, Customers);
  • Preparing UAT scenarios, validation cases;
  • User Acceptance Testing;
  • Demo for internal stakeholders;
  • Creating documentation (user guides, technical guides, presentations).

Project Description:

Wireless Standard POS (Point-of-Sale) is our retail management solution for the telecom market.

It provides thousands of retailers with the features and functionality they need to run their businesses effectively, with full visibility and control over every aspect of sales and operations. It is simple to learn and easy to use, and as the operation grows, more features can be added.


Our system can optimize and simplify all processes related to retail in this business area.

Few things that our product can do:

  • Robust Online Reporting
  • Repair Management Software
  • 3rd Party Integrations
  • Customer Connect Marketing
  • Time and Attendance
  • Carrier Commission Reconciliation

 

As a Business Analyst/System Analyst, you will be the liaison between the lines of business and the development team, have the opportunity to work on a very complex product with a microservice architecture (50+ services for now), and communicate with the Product, QA, Developer, Architecture, and Customer Support teams to help improve product quality.


Read more
Euphoric Thought Technologies
Bengaluru (Bangalore)
3 - 4 yrs
₹6L - ₹14L / yr
SQL
Python

Job Description:

Summary

The Data Engineer will be responsible for designing, developing, and maintaining the data infrastructure. They must have experience with SQL and Python.

Roles & Responsibilities:

● Collaborate with product, business, and engineering stakeholders to understand key metrics, data needs, and reporting pain points.

● Design, build, and maintain clean, scalable, and reliable data models using DBT.

● Write performant SQL and Python code to transform raw data into structured marts and reporting layers.

● Create dashboards using Tableau or similar tools.

● Work closely with data platform engineers, architects, and analysts to ensure data pipelines are resilient, well-governed, and high quality.

● Define and maintain source-of-truth metrics and documentation in the analytics layer.

● Partner with product engineering teams to understand new features and ensure appropriate instrumentation and event collection.

● Drive reporting outcomes by building dashboards or working with BI teams to ensure timely delivery of insights.

● Help scale our analytics engineering practice by contributing to internal tooling, frameworks, and best practices.

Who You Are:

Experience: 3 to 4 years of experience in analytics/data engineering, with strong hands-on expertise in DBT, SQL, Python, and dashboarding tools.

● Experience working with modern data stacks (e.g., Snowflake, BigQuery, Redshift, Airflow).

● Strong data modeling skills (dimensional, star/snowflake schema, data vault, etc.).

● Excellent communication and stakeholder management skills.

● Ability to work independently and drive business outcomes through data.

● Exposure to product instrumentation and working with event-driven data is a plus.

● Prior experience in a fast-paced, product-led company is preferred.

Read more
Euphoric Thought Technologies
Remote, Bengaluru (Bangalore)
3 - 4 yrs
₹11L - ₹13L / yr
skill iconPython
SQL

We are seeking a Data Engineer with 3–4 years of relevant experience to join our team. The ideal candidate should have strong expertise in Python and SQL and be available to join immediately.

Location: Bangalore

Experience: 3–4 Years

Joining: Immediate Joiner preferred

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and data models
  • Extract, transform, and load (ETL) data from multiple sources
  • Write efficient and optimized SQL queries for data analysis and reporting
  • Develop data processing scripts and automation using Python
  • Ensure data quality, integrity, and performance across systems
  • Collaborate with cross-functional teams to support business and analytics needs
  • Troubleshoot data-related issues and optimize existing processes
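The extract-transform-load responsibilities above can be sketched end to end. This is a minimal illustration, not the team's actual pipeline: the CSV columns and the quality rule are hypothetical, and an in-memory SQLite database stands in for the real target.

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (a StringIO stands in for a real file).
source = io.StringIO(
    "user_id,signup_date,country\n1,2024-01-05,IN\n2,2024-01-06,in\n3,,IN\n"
)
rows = list(csv.DictReader(source))

# Transform: normalise country codes and drop rows failing a quality check.
clean = [
    {"user_id": int(r["user_id"]),
     "signup_date": r["signup_date"],
     "country": r["country"].upper()}
    for r in rows
    if r["signup_date"]  # basic data-quality rule: signup_date must be present
]

# Load: insert the validated rows into the relational target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, signup_date TEXT, country TEXT)")
conn.executemany("INSERT INTO users VALUES (:user_id, :signup_date, :country)", clean)
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 2
```

Row 3 is rejected by the quality rule, and row 2's lowercase country code is normalised before loading, which is exactly the "ensure data quality and integrity" step in miniature.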

Required Skills & Qualifications:

  • 3–4 years of hands-on experience as a Data Engineer or similar role
  • Strong proficiency in Python and SQL
  • Experience working with relational databases and large datasets
  • Good understanding of data warehousing and ETL concepts
  • Strong analytical and problem-solving skills
  • Ability to work independently and in a team-oriented environment

Preferred:

  • Experience with cloud platforms or data tools (added advantage)
  • Exposure to performance tuning and data optimization





Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹12L / yr
skill iconData Analytics
SQL

Must have strong SQL skills (queries, optimization, stored procedures, triggers), including hands-on experience automating processes through SQL.

Looking for candidates with 2+ years of experience working on large datasets (1 crore records or more), comfortable handling challenges and breaking down complex data problems.

Must have advanced Excel skills.

Should have 3+ years of relevant experience.

Should have reporting and dashboard-creation experience.

Should have database development and maintenance experience.

Must have strong communication skills for client interactions.

Should be able to work independently.

Willingness to work from client location.
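One concrete slice of the SQL optimization work this role calls for is checking a query plan before and after adding an index. The sketch below is illustrative only, using a hypothetical transactions table in SQLite; the same workflow applies in any engine that exposes query plans.

```python
import sqlite3

# Build a toy table with enough rows that index choice matters.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns (account, amount) VALUES (?, ?)",
    [(f"acct{i % 1000}", float(i)) for i in range(10000)],
)

query = "SELECT SUM(amount) FROM txns WHERE account = ?"

# Inspect the plan before and after creating an index on the filter column.
before = conn.execute("EXPLAIN QUERY PLAN " + query, ("acct7",)).fetchall()
conn.execute("CREATE INDEX idx_txns_account ON txns (account)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, ("acct7",)).fetchall()

plan_before = " ".join(str(r) for r in before)
plan_after = " ".join(str(r) for r in after)
print(plan_before)  # reports a full table scan
print(plan_after)   # reports a lookup via idx_txns_account
```

Before the index, SQLite scans every row; afterwards it searches via idx_txns_account, which is the difference between seconds and milliseconds at crore-scale row counts.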

Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 7 yrs
₹14L - ₹20L / yr
skill iconPython
Mainframe
skill iconC#
SDET
Test Automation (QA)
+37 more

Job Details

Job Title: Java Full Stack Developer 

Industry: Global digital transformation solutions provider

- Domain: Information technology (IT)

Experience Required: 5-7 years

Working Mode: 3 days in office, Hybrid model.

Job Location: Bangalore

CTC Range: Best in Industry


Job Description:

SDET (Software Development Engineer in Test)


Job Responsibilities:

• Test Automation:
  • Develop, maintain, and execute automated test scripts using test automation frameworks.
  • Design and implement testing tools and frameworks to support automated testing.

• Software Development:
  • Participate in the design and development of software components to improve testability.
  • Write code actively, contribute to the development of tools, and work closely with developers to debug complex issues.

• Quality Assurance:
  • Collaborate with the development team to understand software features and technical implementations.
  • Develop quality assurance standards and ensure adherence to best testing practices.

• Integration Testing:
  • Conduct integration and functional testing to ensure that components work as expected individually and when combined.

• Performance and Scalability Testing:
  • Perform performance and scalability testing to identify bottlenecks and optimize application performance.

• Test Planning and Execution:
  • Create detailed, comprehensive, and well-structured test plans and test cases.
  • Execute manual and/or automated tests and analyze results to ensure product quality.

• Bug Tracking and Resolution:
  • Identify, document, and track software defects using bug tracking tools.
  • Verify fixes and work closely with developers to resolve issues.

• Continuous Improvement:
  • Stay updated on emerging tools and technologies relevant to the SDET role.
  • Constantly look for ways to improve testing processes and frameworks.


Skills and Qualifications:

• Strong programming skills, particularly in languages such as COBOL, JCL, Java, C#, Python, or JavaScript.
• Strong experience in Mainframe environments.
• Experience with test automation tools and frameworks like Selenium, JUnit, TestNG, or Cucumber.
• Excellent problem-solving skills and attention to detail.
• Familiarity with CI/CD tools and practices, such as Jenkins, Git, Docker, etc.
• Good understanding of web technologies and databases is often beneficial.
• Strong communication skills for interfacing with cross-functional teams.


Qualifications

• 5+ years of experience as a software developer, QA engineer, or SDET.
• 5+ years of hands-on experience with Java or Selenium.
• 5+ years of hands-on experience with Mainframe environments.
• 4+ years designing, implementing, and running test cases.
• 4+ years working with test processes, methodologies, tools, and technology.
• 4+ years performing functional and UI testing and quality reporting.
• 3+ years of technical QA management experience leading onshore and offshore resources.
• Passion for driving best practices in the testing space.
• Thorough understanding of functional, stress, performance, and various forms of regression testing, as well as mobile testing.
• Knowledge of software engineering practices and agile approaches.
• Experience building or improving test automation frameworks.
• Proficiency in CI/CD integration and pipeline development in Jenkins, Spinnaker, or other similar tools.
• Proficiency in UI automation (Serenity/Selenium, Robot, Watir).
• Experience in Gherkin (BDD/TDD).
• Ability to quickly tackle and diagnose issues within the quality assurance environment and communicate that knowledge to a varied audience of technical and non-technical partners.
• Strong desire to establish and improve product quality.
• Willingness to take challenges head on while being part of a team.
• Ability to work under tight deadlines and within a team environment.
• Experience in test automation using UFT and Selenium.
• UFT/Selenium experience in building object repositories, standard and custom checkpoints, parameterization, reusable functions, recovery scenarios, descriptive programming, and API testing.
• Knowledge of VBScript, C#, Java, HTML, and SQL.
• Experience using Git or other version control systems.
• Experience developing, supporting, and/or testing web applications.
• Understanding of the need for testing of security requirements.
• Ability to understand APIs in JSON and XML formats, with experience using API testing tools like Postman, Swagger, or SoapUI.
• Excellent communication, collaboration, reporting, analytical, and problem-solving skills.
• Solid understanding of the release cycle and QA/testing methodologies.
• ISTQB certification is a plus.
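The automated-testing side of this role follows the same shape in every framework the posting names (JUnit, TestNG, pytest): a unit under test plus assertions run by a test runner. Here is a deliberately tiny illustration using Python's standard-library unittest; the function under test is hypothetical, not part of any product mentioned above.

```python
import unittest

def normalise_amount(raw: str) -> float:
    """Toy unit under test: parse a currency string like '1,234.50' into a float."""
    return float(raw.replace(",", ""))

class NormaliseAmountTest(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(normalise_amount("42"), 42.0)

    def test_thousands_separator(self):
        self.assertEqual(normalise_amount("1,234.50"), 1234.50)

# Run the suite programmatically, as a CI pipeline step would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormaliseAmountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

In a CI/CD setup (Jenkins, etc.) the runner's exit status is what gates the pipeline, which is why the suite result, not individual prints, is the artifact that matters.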


Skills: Python, Mainframe, C#

Notice period - 0 to 15days only

Read more
Sim Gems Group

at Sim Gems Group

4 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 10 yrs
Upto ₹25L / yr (Varies)
skill iconPython
Odoo (OpenERP)
SQL
skill iconKubernetes
Data Structures

Employment Type: Full-time, Permanent

Location: Near Bommasandra Metro Station, Bangalore (Work from Office – 5 days/week)

Notice Period: 15 days or less preferred


About the Company:

SimStar Asia Ltd is a joint vendor of the SimGems and StarGems Group — a Hong Kong–based multinational organization engaged in the global business of conflict-free, high-value diamonds.

SimStar maintains the highest standards of integrity. Any candidate found engaging in unfair practices at any stage of the interview process will be disqualified and blacklisted.


Experience Required

  • 4+ years of relevant professional experience.

Key Responsibilities

  • Hands-on backend development using Python (mandatory).
  • Write optimized and complex SQL queries; perform query tuning and performance optimization.
  • Work extensively with the Odoo framework, including development and deployment.
  • Manage deployments using Docker and/or Kubernetes.
  • Develop frontend components using OWL.js or any modern JavaScript framework.
  • Design scalable systems with a strong foundation in Data Structures, Algorithms, and System Design.
  • Handle API integrations and data exchange between systems.
  • Participate in technical discussions and architecture decisions.

Interview Expectations

  • Candidates must be comfortable writing live code during interviews.
  • SQL queries and optimization scenarios will be part of the technical assessment.

Must-Have Skills

  • Python backend development
  • Advanced SQL
  • Odoo Framework & Deployment
  • Docker / Kubernetes
  • JavaScript frontend (OWL.js preferred)
  • System Design fundamentals
  • API integration experience
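Since the interview expectations above include live coding grounded in data structures, here is the kind of warm-up exercise that commonly comes up: an LRU cache. This is purely illustrative interview-style code, not part of the job specification, built on the standard library's OrderedDict.

```python
from collections import OrderedDict

class LRUCache:
    """Fixed-capacity cache that evicts the least recently used entry."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # touch "a" so "b" becomes the eviction candidate
cache.put("c", 3)      # capacity exceeded: evicts "b"
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

OrderedDict gives O(1) get/put here; in an interview, being able to explain that choice matters as much as the code itself.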


Read more
Inferigence Quotient

at Inferigence Quotient

1 recruiter
Neeta Trivedi
Posted by Neeta Trivedi
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹12L / yr
skill iconPython
FastAPI
skill iconMongoDB
NOSQL Databases
SQL

We are looking for a Python Backend Developer to design, build, and maintain scalable backend services and APIs. The role involves working with modern Python frameworks, databases (SQL and NoSQL), and building well-tested, production-grade systems.


You will collaborate closely with frontend developers, AI/ML engineers, and system architects to deliver reliable and high-performance backend solutions.


Key Responsibilities

  • Design, develop, and maintain backend services using Python
  • Build and maintain RESTful APIs using FastAPI
  • Design efficient data models and queries using MongoDB and SQL databases (PostgreSQL/MySQL)
  • Ensure high performance, security, and scalability of backend systems
  • Write unit tests, integration tests, and API tests to ensure code reliability
  • Debug, troubleshoot, and resolve production issues
  • Follow clean code practices, documentation, and version control workflows
  • Participate in code reviews and contribute to technical discussions
  • Work closely with cross-functional teams to translate requirements into technical solutions


Required Skills & Qualifications

Technical Skills

  • Strong proficiency in Python
  • Hands-on experience with FastAPI
  • Experience with MongoDB (schema design, indexing, aggregation)
  • Solid understanding of SQL databases and relational data modelling
  • Experience writing and maintaining automated tests
  • Unit testing (e.g., pytest)
  • API testing
  • Understanding of REST API design principles
  • Familiarity with Git and collaborative development workflows

Good to Have

  • Experience with async programming in Python (async/await)
  • Knowledge of ORMs/ODMs (SQLAlchemy, Tortoise, Motor, etc.)
  • Basic understanding of authentication & authorisation (JWT, OAuth)
  • Exposure to Docker / containerised environments
  • Experience working in Agile/Scrum teams
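The async/await item under "Good to Have" is worth a concrete sketch: fan several I/O-bound lookups out concurrently instead of awaiting them one by one. The fetch_user coroutine below is a hypothetical stand-in (asyncio.sleep plays the part of a real database or HTTP call), not an actual API.

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.01)  # stands in for a real DB or HTTP call
    return {"id": user_id, "name": f"user-{user_id}"}

async def fetch_all(user_ids):
    # gather() runs the coroutines concurrently and preserves input order.
    return await asyncio.gather(*(fetch_user(u) for u in user_ids))

users = asyncio.run(fetch_all([1, 2, 3]))
print([u["name"] for u in users])  # ['user-1', 'user-2', 'user-3']
```

This is the pattern FastAPI route handlers lean on: three 10 ms lookups complete in roughly 10 ms total rather than 30 ms, because the awaits overlap.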

What We Value

  • Strong problem-solving and debugging skills
  • Attention to detail and commitment to quality
  • Ability to write testable, maintainable, and well-documented code
  • Ownership mindset and willingness to learn
  • Teamwork

What We Offer

  • Opportunity to work on real-world, production systems
  • Technically challenging problems and ownership of components
  • Collaborative engineering culture
Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
6 - 9 yrs
₹36L - ₹48L / yr
skill iconPython
TypeScript
skill iconNodeJS (Node.js)
ReAct (Reason + Act)
skill iconReact Native
+13 more

Review Criteria:

  • Strong Software Engineer fullstack profile using NodeJS / Python and React
  • 6+ YOE in Software Development using Python OR NodeJS (For backend) & React (For frontend)
  • Must have strong experience in working on Typescript
  • Must have experience in message-based systems like Kafka, RabbitMq, Redis
  • Databases - PostgreSQL & NoSQL databases like MongoDB
  • Product Companies Only
  • Tier 1 Engineering Institutes preferred (IIT, NIT, BITS, IIIT, DTU or equivalent)

 

Preferred:

  • Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
  • Experience in mentoring, coaching the team.


Role & Responsibilities:

We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.

 

The Ideal Candidate Will Be Able To-

  • Take ownership of delivering performant, scalable and high-quality cloud-based software, both frontend and backend side.
  • Mentor team members to develop in line with product requirements.
  • Collaborate with Senior Architect for design and technology choices for product development roadmap.
  • Do code reviews.


Ideal Candidate:

  • Thorough knowledge of developing cloud-based software including backend APIs and react based frontend.
  • Thorough knowledge of scalable design patterns and message-based systems such as Kafka, RabbitMq, Redis, MongoDB, ORM, SQL etc.
  • Experience with AWS services such as S3, IAM, Lambda etc.
  • Expert level coding skills in Python FastAPI/Django, NodeJs, TypeScript, ReactJs.
  • Eye for user responsive designs on the frontend.


Read more
Tamashalive

at Tamashalive

1 candid answer
Aparna. Majumder
Posted by Aparna. Majumder
Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹18L / yr
skill iconPostman
SQL
CI/CD
Appium
Proxies
+3 more

Role Overview:

We are looking for a detail-oriented Quality Assurance (QA) Tester who is passionate about delivering high-quality consumer-facing applications. This role involves manual testing with exposure to automation, API testing, databases, and mobile/web platforms, while working closely with engineering and product teams across the SDLC.

Products:

• Openly – A conversation-first social app focused on meaningful interactions.

• Playroom – Voicechat – A real-time voice chat platform for live community engagement.

• FriendChat – A chatroom-based social app for discovering and connecting with new people.

Key Responsibilities:

• Perform manual testing for Android, web, and native applications.

• Create and execute detailed test scenarios, test cases, and test plans.

• Conduct REST API testing using Postman.

• Validate data using SQL and MongoDB.

• Identify, report, and track defects with clear reproduction steps.

• Support basic automation testing using Selenium (Java) and Appium.

• Perform regression, smoke, sanity, and exploratory testing.

• Conduct risk analysis and highlight quality risks early in the SDLC.

• Collaborate closely with developers and product teams for defect resolution.

• Participate in CI/CD pipelines and support automated test executions.

• Use ADB tools for Android testing across devices and environments.

Required Skills & Technical Expertise:

• Strong knowledge of Manual Testing fundamentals.

• Hands-on experience with Postman and REST APIs.

• Working knowledge of SQL and MongoDB.

• Ability to design effective test scenarios.

• Basic understanding of Automation Testing concepts.

• Familiarity with SDLC and QA methodologies.

• Exposure to Selenium with Java and Appium.

• Understanding of Android, web, and native application testing.

• Experience using proxy tools for debugging and network inspection.
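The API-testing skills above boil down to asserting on status codes and response bodies. The sketch below expresses those checks in plain Python so the logic is tool-neutral; the login response shape (token, user_id) is hypothetical, chosen only for illustration, and in Postman the same checks would live in a test script.

```python
import json

def check_login_response(status_code: int, body: str) -> list:
    """Return a list of failed checks (an empty list means the response passes)."""
    failures = []
    if status_code != 200:
        failures.append(f"expected 200, got {status_code}")
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return failures + ["body is not valid JSON"]
    # Required fields for this hypothetical endpoint.
    for field in ("token", "user_id"):
        if field not in payload:
            failures.append(f"missing field: {field}")
    return failures

ok = check_login_response(200, '{"token": "abc", "user_id": 7}')
bad = check_login_response(500, '{"user_id": 7}')
print(ok)   # []
print(bad)  # ['expected 200, got 500', 'missing field: token']
```

Returning a list of failures rather than raising on the first one mirrors good defect reporting: a single test run surfaces every broken check with a clear reproduction message.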

Good to Have:

• Exposure to CI/CD tools and pipelines.

• Hands-on experience with Appium, K6, Kafka, and proxy tools.

• Basic understanding of performance and load testing.

• Awareness of risk-based testing strategies.

Key Traits:

• High attention to detail and quality.

• Strong analytical and problem-solving skills.

• Clear communication and collaboration abilities.

• Eagerness to learn and grow in automation and advanced testing tools.

Read more
Quantiphi

at Quantiphi

3 candid answers
1 video
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore), Mumbai, Trivandrum
4 - 7 yrs
Upto ₹30L / yr (Varies)
Google Cloud Platform (GCP)
SQL
ETL
Datawarehousing
Data-flow analysis

We are looking for a skilled Data Engineer / Data Warehouse Engineer to design, develop, and maintain scalable data pipelines and enterprise data warehouse solutions. The role involves close collaboration with business stakeholders and BI teams to deliver high-quality data for analytics and reporting.


Key Responsibilities

  • Collaborate with business users and stakeholders to understand business processes and data requirements
  • Design and implement dimensional data models, including fact and dimension tables
  • Identify, design, and implement data transformation and cleansing logic
  • Build and maintain scalable, reliable, and high-performance ETL/ELT pipelines
  • Extract, transform, and load data from multiple source systems into the Enterprise Data Warehouse
  • Develop conceptual, logical, and physical data models, including metadata, data lineage, and technical definitions
  • Design, develop, and maintain ETL workflows and mappings using appropriate data load techniques
  • Provide high-level design, research, and effort estimates for data integration initiatives
  • Provide production support for ETL processes to ensure data availability and SLA adherence
  • Analyze and resolve data pipeline and performance issues
  • Partner with BI teams to design and develop reports and dashboards while ensuring data integrity and quality
  • Translate business requirements into well-defined technical data specifications
  • Work with data from ERP, CRM, HRIS, and other transactional systems for analytics and reporting
  • Define and document BI usage through use cases, prototypes, testing, and deployment
  • Support and enhance data governance and data quality processes
  • Identify trends, patterns, anomalies, and data quality issues, and recommend improvements
  • Train and support business users, IT analysts, and developers
  • Lead and collaborate with teams spread across multiple locations
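The dimensional-modeling responsibilities above center on fact and dimension tables. Here is a minimal, hypothetical star-schema fragment in SQLite (the product/sales tables are invented for illustration); a warehouse like BigQuery would use the same join shape at scale.

```python
import sqlite3

# One dimension table and one fact table keyed by a surrogate product_key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, product_key INTEGER, qty INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
INSERT INTO fact_sales VALUES (10, 1, 2, 20.0), (11, 2, 1, 15.0), (12, 1, 1, 10.0);
""")

# Reporting query: aggregate the fact table, labelled via the dimension.
report = conn.execute("""
    SELECT d.product_name, SUM(f.qty) AS units, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY revenue DESC
""").fetchall()
print(report)  # [('Widget', 3, 30.0), ('Gadget', 1, 15.0)]
```

Keeping measures (qty, amount) in the fact table and descriptive attributes in dimensions is what lets BI tools slice the same facts by any attribute without restructuring the data.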

Required Skills & Qualifications

  • Bachelor’s degree in Computer Science or a related field, or equivalent work experience
  • 3+ years of experience in Data Warehousing, Data Engineering, or Data Integration
  • Strong expertise in data warehousing concepts, tools, and best practices
  • Excellent SQL skills
  • Strong knowledge of relational databases such as SQL Server, PostgreSQL, and MySQL
  • Hands-on experience with Google Cloud Platform (GCP) services, including:
  1. BigQuery
  2. Cloud SQL
  3. Cloud Composer (Airflow)
  4. Dataflow
  5. Dataproc
  6. Cloud Functions
  7. Google Cloud Storage (GCS)
  • Experience with Informatica PowerExchange for Mainframe, Salesforce, and modern data sources
  • Strong experience integrating data using APIs, XML, JSON, and similar formats
  • In-depth understanding of OLAP, ETL frameworks, Data Warehousing, and Data Lakes
  • Solid understanding of SDLC, Agile, and Scrum methodologies
  • Strong problem-solving, multitasking, and organizational skills
  • Experience handling large-scale datasets and database design
  • Strong verbal and written communication skills
  • Experience leading teams across multiple locations

Good to Have

  • Experience with SSRS and SSIS
  • Exposure to AWS and/or Azure cloud platforms
  • Experience working with enterprise BI and analytics tools

Why Join Us

  • Opportunity to work on large-scale, enterprise data platforms
  • Exposure to modern cloud-native data engineering technologies
  • Collaborative environment with strong stakeholder interaction
  • Career growth and leadership opportunities
Read more
Bengaluru (Bangalore)
2 - 8 yrs
₹4L - ₹15L / yr
skill iconPython
skill iconDjango
skill iconNextJs (Next.js)
skill iconReact.js
SQL
+4 more

Full‑Stack Engineer (Python/Django & Next.js)

Location: Bangalore

Experience: 2–8 years of hands‑on full‑stack development


We’re looking for a passionate Full‑Stack Engineer to join our team and help build secure, scalable systems that power exceptional customer experiences.


Key Skills -

• Architect and develop secure, scalable applications

• Collaborate closely with product & design teams

• Manage CI/CD pipelines and deployments

• Mentor engineers and enforce coding best practices


What we’re looking for:

• Strong expertise in Python/Django & Next.js/React

• Hands‑on with PostgreSQL, Docker, AWS/GCP

• Experience leading engineering teams

• Excellent problem‑solving & communication skills


If you’re excited about building impactful products and driving engineering excellence, apply now!

Read more
AryuPay Technologies
Bhavana Chaudhari
Posted by Bhavana Chaudhari
Bengaluru (Bangalore)
4 - 8 yrs
₹4L - ₹9L / yr
skill iconDjango
skill iconPython
RESTful APIs
skill iconFlask
skill iconPostgreSQL
+7 more

We are seeking a highly skilled and experienced Python Developer with a strong background in fintech to join our dynamic team. The ideal candidate will have 7+ years of professional experience in Python development, with a proven track record of delivering high-quality software solutions in the fintech industry.

Responsibilities:

Design, build, and maintain RESTful APIs using Django and Django Rest Framework.

Integrate AI/ML models into existing applications to enhance functionality and provide data-driven insights.

Collaborate with cross-functional teams, including product managers, designers, and other developers, to define and implement new features and functionalities.

Manage deployment processes, ensuring smooth and efficient delivery of applications.

Implement and maintain payment gateway solutions to facilitate secure transactions.

Conduct code reviews, provide constructive feedback, and mentor junior members of the development team.

Stay up-to-date with emerging technologies and industry trends, and evaluate their potential impact on our products and services.

Maintain clear and comprehensive documentation for all development processes and integrations.

Requirements:

Proficiency in Python and Django/Django Rest Framework.

Experience with REST API development and integration.

Knowledge of AI/ML concepts and practical experience integrating AI/ML models.

Hands-on experience with deployment tools and processes.

Familiarity with payment gateway integration and management.

Strong understanding of database systems (SQL, PostgreSQL, MySQL).

Experience with version control systems (Git).

Strong problem-solving skills and attention to detail.

Excellent communication and teamwork skills.
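One recurring piece of the payment-gateway integration work listed above is verifying that an incoming webhook really came from the gateway. The sketch below shows the usual HMAC-signature check with Python's standard library; the secret, payload shape, and field names are hypothetical and not any specific provider's API.

```python
import hashlib
import hmac

SECRET = b"demo-webhook-secret"  # hypothetical shared secret for illustration

def sign(payload: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature the gateway would attach."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    # compare_digest is constant-time, so it doesn't leak timing information.
    return hmac.compare_digest(sign(payload), signature)

payload = b'{"order_id": "123", "status": "captured"}'
good = verify(payload, sign(payload))
tampered = verify(b'{"order_id": "123", "status": "refunded"}', sign(payload))
print(good, tampered)  # True False
```

A handler that rejects unverified payloads before touching the database is the first line of defence against forged "payment captured" callbacks.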

Job Types: Full-time, Permanent

Work Location: In person

Read more
MIC Global

at MIC Global

3 candid answers
1 product
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
10yrs+
Upto ₹50L / yr (Varies)
SQL
skill iconPython
PowerBI
Stakeholder management
skill iconData Analytics
+4 more

About Us

MIC Global is a full-stack micro-insurance provider, purpose-built to design and deliver embedded parametric micro-insurance solutions to platform companies. Our mission is to make insurance more accessible for new, emerging, and underserved risks using our MiIncome loss-of-income products, MiConnect, MiIdentity, Coverpoint technology, and more — backed by innovative underwriting capabilities as a Lloyd’s Coverholder and through our in-house reinsurer, MicRe.

We operate across 12+ countries, with our Global Operations Center in Bangalore supporting clients worldwide, including a leading global ride-hailing platform and a top international property rental marketplace. Our distributed teams across the UK, USA, and Asia collaborate to ensure that no one is beyond the reach of financial security.


About the Team 

As a Lead Data Specialist at MIC Global, you will play a key role in transforming data into actionable insights that inform strategic and operational decisions. You will work closely with Product, Engineering, and Business teams to analyze trends, build dashboards, and ensure that data pipelines and reporting structures are accurate, automated, and scalable.

This is a hands-on, analytical, and technically focused role, ideal for someone experienced in data analytics and engineering practices. You will use SQL, Python, and modern BI tools to interpret large datasets, support pricing models, and help shape the data-driven culture across MIC Global.


Key Roles and Responsibilities 

Data Analytics & Insights

  • Analyze complex datasets to identify trends, patterns, and insights that support business and product decisions.
  • Partner with Product, Operations, and Finance teams to generate actionable intelligence on customer behavior, product performance, and risk modeling.
  • Contribute to the development of pricing models, ensuring accuracy and commercial relevance.
  • Deliver clear, concise data stories and visualizations that drive executive and operational understanding.
  • Develop analytical toolkits for underwriting, pricing and claims 

Data Engineering & Pipeline Management

  • Design, implement, and maintain reliable data pipelines and ETL workflows.
  • Write clean, efficient scripts in Python for data cleaning, transformation, and automation.
  • Ensure data quality, integrity, and accessibility across multiple systems and environments.
  • Work with Azure data services to store, process, and manage large datasets efficiently.
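The "ensure data quality" responsibility above usually takes the form of a validation gate that rows must pass before entering a pipeline. The sketch below is illustrative only: the record schema (policy_id, premium) and rules are invented for the example, though the accept/reject-and-report shape is the general pattern.

```python
REQUIRED = ("policy_id", "premium")  # hypothetical schema for illustration

def validate(record: dict) -> list:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = [f"missing {f}" for f in REQUIRED if record.get(f) in (None, "")]
    premium = record.get("premium")
    if isinstance(premium, (int, float)) and premium < 0:
        errors.append("premium must be non-negative")
    return errors

batch = [
    {"policy_id": "P1", "premium": 120.0},
    {"policy_id": "P2", "premium": -5.0},
    {"premium": 80.0},
]

# Split the batch into rows that may proceed and rows to quarantine with reasons.
accepted = [r for r in batch if not validate(r)]
rejected = {r.get("policy_id", "<unknown>"): validate(r) for r in batch if validate(r)}
print(len(accepted), rejected)
```

Keeping the rejection reasons alongside the quarantined rows is what makes the downstream "alerting mechanisms" actionable, because each failed record explains itself.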

Business Intelligence & Reporting

  • Develop, maintain, and optimize dashboards and reports using Power BI (or similar tools).
  • Automate data refreshes and streamline reporting processes for cross-functional teams.
  • Track and communicate key business metrics, providing proactive recommendations.

Collaboration & Innovation

  • Collaborate with engineers, product managers, and business leads to align analytical outputs with company goals.
  • Support the adoption of modern data tools and agentic AI frameworks to improve insight generation and automation.
  • Continuously identify opportunities to enhance data-driven decision-making across the organization.

Ideal Candidate Profile

  • 10+ years of relevant experience in data analysis or business intelligence, ideally within product-based SaaS, fintech, or insurance environments.
  • Proven expertise in SQL for data querying, manipulation, and optimization.
  • Hands-on experience with Python for data analytics, automation, and scripting.
  • Strong proficiency in Power BI, Tableau, or equivalent BI tools.
  • Experience working in Azure or other cloud-based data ecosystems.
  • Solid understanding of data modeling, ETL processes, and data governance.
  • Ability to translate business questions into technical analysis and communicate findings effectively.

Preferred Attributes

  • Experience in insurance or fintech environments, especially operations and claims analytics.
  • Exposure to agentic AI and modern data stack tools (e.g., dbt, Snowflake, Databricks).
  • Strong attention to detail, analytical curiosity, and business acumen.
  • Collaborative mindset with a passion for driving measurable impact through data.

Benefits

  • 33 days of paid holiday
  • Competitive compensation well above market average
  • Work in a high-growth, high-impact environment with passionate, talented peers
  • Clear path for personal growth and leadership development
Read more
MIC Global

at MIC Global

3 candid answers
1 product
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
5yrs+
Best in industry
skill iconPython
SQL
ETL
DBA
Windows Azure
+1 more

About Us

MIC Global is a full-stack micro-insurance provider, purpose-built to design and deliver embedded parametric micro-insurance solutions to platform companies. Our mission is to make insurance more accessible for new, emerging, and underserved risks using our MiIncome loss-of-income products, MiConnect, MiIdentity, Coverpoint technology, and more — backed by innovative underwriting capabilities as a Lloyd’s Coverholder and through our in-house reinsurer, MicRe.

We operate across 12+ countries, with our Global Operations Center in Bangalore supporting clients worldwide, including a leading global ride-hailing platform and a top international property rental marketplace. Our distributed teams across the UK, USA, and Asia collaborate to ensure that no one is beyond the reach of financial security.


About the Team 

We're seeking a mid-level Data Engineer with strong DBA experience to join our insurtech data analytics team. This role focuses on supporting various teams including infrastructure, reporting, and analytics. You'll be responsible for SQL performance optimization, building data pipelines, implementing data quality checks, and helping teams with database-related challenges. You'll work closely with the infrastructure team on production support, assist the reporting team with complex queries, and support the analytics team in building visualizations and dashboards.


Key Roles and Responsibilities 

Database Administration & Optimization

  • Support infrastructure team with production database issues and troubleshooting
  • Debug and resolve SQL performance issues, identify bottlenecks, and optimize queries
  • Optimize stored procedures, functions, and views for better performance
  • Perform query tuning, index optimization, and execution plan analysis
  • Design and develop complex stored procedures, functions, and views
  • Support the reporting team with complex SQL queries and database design

Data Engineering & Pipelines

  • Design and build ETL/ELT pipelines using Azure Data Factory and Python
  • Implement data quality checks and validation rules before data enters pipelines
  • Develop data integration solutions to connect various data sources and systems
  • Create automated data validation, quality monitoring, and alerting mechanisms
  • Develop Python scripts for data processing, transformation, and automation
  • Build and maintain data models to support reporting and analytics requirements

Support & Collaboration

  • Help data analytics team build visualizations and dashboards by providing data models and queries
  • Support reporting team with data extraction, transformation, and complex reporting queries
  • Collaborate with development teams to support application database requirements
  • Provide technical guidance and best practices for database design and query optimization

Azure & Cloud

  • Work with Azure services including Azure SQL Database, Azure Data Factory, Azure Storage, Azure Functions, and Azure ML
  • Implement cloud-based data solutions following Azure best practices
  • Support cloud database migrations and optimizations
  • Work with Agentic AI concepts and tools to build intelligent data solutions

Ideal Candidate Profile

Essential

  • 5-8 years of experience in data engineering and database administration
  • Strong expertise in MS SQL Server (2016+) administration and development
  • Proficient in writing complex SQL queries, stored procedures, functions, and views
  • Hands-on experience with Microsoft Azure services (Azure SQL Database, Azure Data Factory, Azure Storage)
  • Strong Python scripting skills for data processing and automation
  • Experience with ETL/ELT design and implementation
  • Knowledge of database performance tuning, query optimization, and indexing strategies
  • Experience with SQL performance debugging tools (XEvents, Profiler, or similar)
  • Understanding of data modeling and dimensional design concepts
  • Knowledge of Agile methodology and experience working in Agile teams
  • Strong problem-solving and analytical skills
  • Understanding of Agentic AI concepts and tools
  • Excellent communication skills and ability to work with cross-functional teams

Desirable

  • Knowledge of insurance or financial services domain
  • Experience with Azure ML and machine learning pipelines
  • Experience with Azure DevOps and CI/CD pipelines
  • Familiarity with data visualization tools (Power BI, Tableau)
  • Experience with NoSQL databases (Cosmos DB, MongoDB)
  • Knowledge of Spark, Databricks, or other big data technologies
  • Azure certifications (Azure Data Engineer Associate, Azure Database Administrator Associate)
  • Experience with version control systems (Git, Azure Repos)

Tech Stack

  • MS SQL Server 2016+, Azure SQL Database, Azure Data Factory, Azure ML, Azure Storage, Azure Functions, Python, T-SQL, Stored Procedures, ETL/ELT, SQL Performance Tools (XEvents, Profiler), Agentic AI Tools, Azure DevOps, Power BI, Agile, Git

Benefits

  • 33 days of paid holiday
  • Competitive compensation well above market average
  • Work in a high-growth, high-impact environment with passionate, talented peers
  • Clear path for personal growth and leadership development
Wissen Technology

4 recruiters
Posted by Archana M
Bengaluru (Bangalore)
8 - 12 yrs
Best in industry
Distributed Systems
Leadership
LLD
HLD
DDD
+5 more

Location: Hybrid (Bangalore)

Travel: Quarterly travel to Seattle (US)

Education: B.Tech from premium institutes only

Note: Only immediate joiners (0 to 15 days' notice) will be considered; no other applications accepted.

Role Summary

We are seeking top-tier Lead Engineers who can design, build, and deliver large-scale distributed systems with high performance, reliability, and operational excellence. The ideal candidate will be a hands-on engineer with expert system design ability, deep understanding of distributed architectures, and strong communication and leadership skills.

The Lead Engineer must be able to convert complex and ambiguous requirements into a fully engineered architecture and implementation plan covering components, data flows, infrastructure, observability, and operations.

Key Responsibilities

1. End-to-End System Architecture

  • Architect scalable, reliable, and secure systems from initial concept through production rollout.
  • Define system boundaries, components, service responsibilities, and integration points.
  • Produce high-level (HLD) and low-level design (LLD) documents.
  • Ensure designs meet performance, reliability, security, and cost objectives.
  • Make informed design trade-offs with solid technical reasoning.

2. Component & Communication Design

  • Break complex systems into independently deployable services.
  • Define APIs, communication contracts, data models, and event schemas.
  • Apply modern architecture patterns such as microservices, event-driven design, DDD, CQRS, and hexagonal architecture.
  • Ensure component clarity, maintainability, and extensibility.

3. Communication Protocol & Middleware

  • Design both sync and async communication layers: REST, RPC, gRPC, message queues, event streams (Kafka/Kinesis/Pulsar).
  • Define retry/timeout strategies, circuit breakers, rate limiting, and versioning strategies.
  • Handle backpressure, partitioning, delivery semantics (at-least/at-most/exactly once).
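The retry/timeout bullet above can be sketched as a minimal exponential-backoff wrapper; the attempt count, delays, and the `flaky` callee are illustrative, not a prescribed implementation.

```python
# Retry-with-backoff sketch: retry a transient failure with exponentially
# increasing delays, re-raising after the final attempt.
import time

def call_with_retry(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying on ConnectionError with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

def flaky():
    """Simulated dependency that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = call_with_retry(flaky)
```

A production circuit breaker adds a failure counter and an "open" state on top of this, so a persistently failing dependency is not hammered with retries.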

4. Data Architecture & Storage Strategy

  • Architect data models and storage strategies for SQL and NoSQL databases, distributed caches, blob stores, and search indexes.
  • Define sharding/partitioning, replication, consistency, indexing, backup/restore, and schema evolution strategies.
  • Design real-time and batch data processing pipelines.

5. Operational Readiness

  • Define observability (metrics, logs, traces) requirements.
  • Collaborate with DevOps to ensure deployment, monitoring, alerts, and incident management readiness.
  • Provide production support as a senior technical owner.

6. Leadership & Influence

  • Lead technical discussions, design reviews, and cross-team collaboration.
  • Mentor engineers and help elevate team practices.
  • Influence technology direction and architectural standards.

Required Qualifications

  • 10+ years of professional software engineering experience with strong backend and distributed systems background.
  • Proven track record of leading large-scale architecture and delivery of production systems.
  • Expert in system design with the ability to simplify ambiguity and craft robust solutions.
  • Strong programming experience in one or more languages (Java, Go, Python, C++).
  • Deep understanding of distributed systems, message streaming, queues, RPC/REST, and event-driven architecture.
  • Experience with cloud platforms (AWS/Azure/GCP) and container technologies (Kubernetes/Docker).
  • Strong communication, documentation, and leadership skills.

Preferred Skills

  • Experience with large-scale messaging/streaming (Kafka/Pulsar), caching, and NoSQL.
  • Experience designing for high availability, fault tolerance, and performance at scale.
  • Mentoring and leading global engineering teams.
  • Familiarity with observability tooling (Grafana, Prometheus, Jaeger). 


Tradelab Technologies

1 candid answer
Posted by Aakanksha Yadav
Bengaluru (Bangalore)
2 - 5 yrs
₹6L - ₹15L / yr
Linux/Unix
SQL

Role Summary:

We are seeking experienced Application Support Engineers to join our client-facing support team. The ideal candidate will be the first point of contact for client issues, ensuring timely resolution, clear communication, and high customer satisfaction in a fast-paced trading environment.


Key Responsibilities:

• Act as the primary contact for clients reporting issues related to trading applications and platforms.

• Log, track, and monitor issues using internal tools and ensure resolution within defined TAT (Turnaround Time).

• Liaise with development, QA, infrastructure, and other internal teams to drive issue resolution.

• Provide clear and timely updates to clients and stakeholders regarding issue status and resolution.

• Maintain comprehensive logs of incidents, escalations, and fixes for future reference and audits.

• Offer appropriate and effective resolutions for client queries on functionality, performance, and usage.

• Communicate proactively with clients about upcoming product features, enhancements, or changes.

• Build and maintain strong relationships with clients through regular, value-added interactions.

• Collaborate in conducting UAT, release validations, and production deployment verifications.

• Assist in root cause analysis and post-incident reviews to prevent recurrences.


Required Skills & Qualifications:

• Bachelor's degree in computer science, IT, or related field.

• 2+ years in Application/Technical Support, preferably in the broking/trading domain.

• Sound understanding of capital markets – Equity, F&O, Currency, Commodities.

• Strong technical troubleshooting skills – Linux/Unix, SQL, log analysis.

• Familiarity with trading systems, RMS, OMS, APIs (REST/FIX), and order lifecycle.

• Excellent communication and interpersonal skills for effective client interaction.

• Ability to work under pressure during trading hours and manage multiple priorities.

• Customer-centric mindset with a focus on relationship building and problem-solving.
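The Linux/SQL/log-analysis skill listed above can be sketched as a small scan over application logs; the log format and error codes here are invented for illustration.

```python
# Log-analysis sketch: count order rejections per error code from app logs.
import re

log_lines = [
    "09:15:01 INFO  order 101 accepted",
    "09:15:02 ERROR order 102 rejected code=RMS_LIMIT",
    "09:15:03 ERROR order 103 rejected code=RMS_LIMIT",
    "09:15:04 ERROR order 104 rejected code=BAD_PRICE",
]

def rejection_counts(lines):
    """Tally rejected orders by their error code."""
    counts = {}
    for line in lines:
        m = re.search(r"rejected code=(\w+)", line)
        if m:
            counts[m.group(1)] = counts.get(m.group(1), 0) + 1
    return counts

counts = rejection_counts(log_lines)
```

In practice the same pattern runs as a `grep`/`awk` one-liner during trading hours; the Python version is just easier to extend into a report.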


Nice to Have:

• Exposure to broking platforms like NOW, NEST, ODIN, or custom-built trading tools.

• Experience interacting with exchanges (NSE, BSE, MCX) or clearing corporations.

• Knowledge of scripting (Shell/Python) and basic networking is a plus.

• Familiarity with cloud environments (AWS/Azure) and monitoring tools.


Why Join Us?

• Be part of a team supporting mission-critical systems in real-time.

• Work in a high-energy, tech-driven environment.

• Opportunities to grow into domain/tech leadership roles.

• Competitive salary and benefits, health coverage, and employee wellness programs.

Ride-hailing Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
4 - 7 yrs
₹18L - ₹21L / yr
Data Analytics
Python
SQL
Data Visualization
Stakeholder management
+7 more

JOB DETAILS:

- Job Title: Senior Business Analyst

- Industry: Ride-hailing

- Experience: 4-7 years

- Working Days: 5 days/week

- Work Mode: ONSITE

- Job Location: Bangalore

- CTC Range: Best in Industry


Required Skills: Data Visualization, Data Analysis, Strong in Python and SQL, Cross-Functional Communication & Stakeholder Management


Criteria:

1. Candidate must have 4–7 years of experience in analytics / business analytics roles.

2. Candidate must be currently based in Bangalore only (no relocation allowed).

3. Candidate must have hands-on experience with Python and SQL.

4. Candidate must have experience working with databases/APIs (Mongo, Presto, REST or similar).

5. Candidate must have experience building dashboards/visualizations (Tableau, Metabase or similar).

6. Candidate must be available for face-to-face interviews in Bangalore.

7. Candidate must have experience working closely with business, product, and operations teams.


Description

Job Responsibilities:

● Acquire data from primary/secondary data sources like MongoDB, Presto, and REST APIs.

● Candidate must have strong hands-on experience in Python and SQL.

● Build visualizations to communicate data to key decision-makers; familiarity with building interactive dashboards in Tableau/Metabase is preferred

● Establish the relationship between an output metric and its drivers, identify the critical drivers, and control them to achieve the desired value of the output metric

● Partner with operations/business teams to consult, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs

● Collaborate with business owners and product teams, perform data analysis of experiments, and recommend the next best action for the business; this involves being embedded in business decision teams to drive faster decision making

● Collaborate with several functional teams within the organization, using raw data and metrics to back up assumptions, develop hypotheses/business cases, and complete root cause analyses, thereby delivering output to business users
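The output-metric/driver relationship described above can be illustrated with a toy decomposition; the metric, its drivers, and all numbers are made up for this ride-hailing-flavored example.

```python
# Driver-tree sketch: decompose an output metric (completed rides) into its
# drivers (requests x acceptance rate x completion rate), then test the
# sensitivity of the output to a single driver.

requests = 1000
acceptance_rate = 0.8
completion_rate = 0.9

completed_rides = requests * acceptance_rate * completion_rate

# Scale one driver by 10% while holding the others fixed.
lifted = requests * (acceptance_rate * 1.1) * completion_rate
```

Controlling the driver with the largest sensitivity is what "identify and control the critical drivers" amounts to in practice.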

 

Job Requirements:

● Undergraduate and/or graduate degree in Math, Economics, Statistics, Engineering, Computer Science, or other quantitative field.

● Around 4-6 years of experience embedded in analytics and adjacent business teams, working as an analyst aiding decision making

● Proficiency in Excel and ability to structure and present data in creative ways to drive insights

● A basic understanding of (or experience in) evaluating financial parameters like return on investment (ROI), cost allocation, and optimization is good to have


What’s there for you?

● Opportunity to understand the overall business & collaborate across all functional departments

● Prospect to disrupt the existing mobility industry business models (ideate, pilot, monitor & scale)

● Deal with the ambiguity of decision making while balancing long-term/strategic business needs and short-term/tactical moves

● Full business ownership working style which translates to freedom to pick problem statements/workflow and self-driven culture

Ganit Business Solutions

3 recruiters
Agency job
via hirezyai by HR Hirezyai
Bengaluru (Bangalore), Chennai, Mumbai
5.5 - 12 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
PySpark
SQL

Roles & Responsibilities

  • Data Engineering Excellence: Design and implement data pipelines using formats like JSON, Parquet, CSV, and ORC, utilizing batch and streaming ingestion.
  • Cloud Data Migration Leadership: Lead cloud migration projects, developing scalable Spark pipelines.
  • Medallion Architecture: Implement Bronze, Silver, and Gold tables for scalable data systems.
  • Spark Code Optimization: Optimize Spark code to ensure efficient cloud migration.
  • Data Modeling: Develop and maintain data models with strong governance practices.
  • Data Cataloging & Quality: Implement cataloging strategies with Unity Catalog to maintain high-quality data.
  • Delta Live Table Leadership: Lead the design and implementation of Delta Live Tables (DLT) pipelines for secure, tamper-resistant data management.
  • Customer Collaboration: Collaborate with clients to optimize cloud migrations and ensure best practices in design and governance.
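The Medallion (Bronze/Silver/Gold) bullet can be sketched in plain Python as a Bronze-to-Silver cleaning step; a real pipeline would do this with Spark/Delta Live Tables, and the record shapes here are invented.

```python
# Medallion sketch: raw records land untouched in "bronze"; "silver" holds
# cleaned, de-duplicated rows ready for downstream modeling.

bronze = [
    {"id": 1, "email": " A@X.COM "},
    {"id": 1, "email": "a@x.com"},   # duplicate id, dropped
    {"id": 2, "email": None},        # unusable row, dropped
]

def to_silver(rows):
    """Normalize and de-duplicate bronze rows into silver rows."""
    seen, silver = set(), []
    for row in rows:
        if row["email"] is None or row["id"] in seen:
            continue
        seen.add(row["id"])
        silver.append({"id": row["id"], "email": row["email"].strip().lower()})
    return silver

silver = to_silver(bronze)
```

A Gold layer would then aggregate silver rows into business-facing tables.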

Educational Qualifications

  • Experience: Minimum 5 years of hands-on experience in data engineering, with a proven track record in complex pipeline development and cloud-based data migration projects.
  • Education: Bachelor’s or higher degree in Computer Science, Data Engineering, or a related field.
Skills

  • Must-have: Proficiency in Spark, SQL, Python, and other relevant data processing technologies. Strong knowledge of Databricks and its components, including Delta Live Table (DLT) pipeline implementations. Expertise in on-premises-to-cloud Spark code optimization and Medallion Architecture.

Good to Have

  • Familiarity with AWS services (experience with additional cloud platforms like GCP or Azure is a plus).

Soft Skills

  • Excellent communication and collaboration skills, with the ability to work effectively with clients and internal teams.
Certifications

  • AWS/GCP/Azure Data Engineer Certification.


Appsforbharat
Posted by Pooja V
Bengaluru (Bangalore)
6 - 13 yrs
Best in industry
Go Programming (Golang)
Python
Amazon Web Services (AWS)
SQL

About the role


We are seeking a seasoned Backend Tech Lead with deep expertise in Golang and Python to lead our backend team. The ideal candidate has 6+ years of experience in backend technologies and 2–3 years of proven engineering mentoring experience, having successfully scaled systems and shipped B2C applications in collaboration with product teams.

Responsibilities

Technical & Product Delivery

● Oversee design and development of backend systems operating at 10K+ RPM scale.

● Guide the team in building transactional systems (payments, orders, etc.) and behavioral systems (analytics, personalization, engagement tracking).

● Partner with product managers to scope, prioritize, and release B2C product features and applications.

● Ensure architectural best practices, high-quality code standards, and robust testing practices.

● Own delivery of projects end-to-end with a focus on scalability, reliability, and business impact.

Operational Excellence

● Champion observability, monitoring, and reliability across backend services.

● Continuously improve system performance, scalability, and resilience.

● Streamline development workflows and engineering processes for speed and quality.

Requirements

Experience:

● 7+ years of professional experience in backend technologies.

● 2–3 years as a tech lead driving delivery.

Technical Skills:

● Strong hands-on expertise in Golang and Python.

● Proven track record with high-scale systems (≥10K RPM).

● Solid understanding of distributed systems, APIs, SQL/NoSQL databases, and cloud platforms.

Leadership Skills:

● Demonstrated success in managing teams through 2–3 appraisal cycles.

● Strong experience working with product managers to deliver consumer-facing applications.

● Excellent communication and stakeholder management abilities.

Nice-to-Have

● Familiarity with containerization and orchestration (Docker, Kubernetes).

● Experience with observability tools (Prometheus, Grafana, OpenTelemetry).

● Previous leadership experience in B2C product companies operating at scale.

What We Offer

● Opportunity to lead and shape a backend engineering team building at scale.

● A culture of ownership, innovation, and continuous learning.

● Competitive compensation, benefits, and career growth opportunities.

Bidgely

4 candid answers
2 recruiters
Posted by Bisman Gill
Bengaluru (Bangalore)
6+ yrs
Up to ₹65L / yr (varies)
Java
Spring Boot
SQL
NOSQL Databases
skill iconAmazon Web Services (AWS)

Lead Software Engineer

Bidgely is seeking an exceptional and visionary Lead Software Engineer to join its core team in Bangalore. As a Lead Software Engineer, you will work closely with EMs and org heads on shaping the roadmap and planning, set the technical direction for the team, influence architectural decisions, and mentor other engineers, all while delivering highly reliable, scalable products powered by large data, advanced machine learning models, and responsive user interfaces. Renowned for your deep technical expertise, you are capable of deconstructing any system, solving complex problems creatively, and elevating those around you. Join our innovative and dynamic team that thrives on creativity, technical excellence, and a belief that nothing is impossible with collaboration and hard work.


Responsibilities

  • Lead the design and delivery of complex, scalable web services, APIs, and backend data modules.
  • Define and drive adoption of best practices in system architecture, component reusability, and software design patterns across teams.
  • Provide technical leadership in product, architectural, and strategic engineering discussions.
  • Mentor and guide engineers at all levels, fostering a culture of learning and growth.
  • Collaborate with cross-functional teams (engineering, product management, data science, and UX) to translate business requirements into scalable, maintainable solutions.
  • Champion and drive continuous improvement initiatives for code quality, performance, security, and reliability.
  • Evaluate and implement emerging technologies, tools, and methodologies to ensure competitive advantage.
  • Present technical concepts and results clearly to both technical and non-technical stakeholders; influence organizational direction and recommend key technical investments.


Requirements

  • 6+ years of experience in designing and developing highly scalable backend and middle tier systems.
  • BS/MS/PhD in Computer Science or a related field from a leading institution.
  • Demonstrated mastery of data structures, algorithms, and system design; experience architecting large-scale distributed systems and leading significant engineering projects.
  • Deep fluency in Java, Spring, Hibernate, J2EE, RESTful services; expertise in at least one additional backend language/framework.
  • Strong hands-on experience with both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB, Cassandra, Redis) databases, including schema design, optimization, and performance tuning for large data sets.
  • Experience with Distributed Systems, Cloud Architectures, CI/CD, and DevOps principles.
  • Strong leadership, mentoring, and communication skills; proven ability to drive technical vision and alignment across teams.
  • Track record of delivering solutions in fast-paced and dynamic start-up environments.
  • Commitment to quality, attention to detail, and a passion for coaching others.


Wissen Technology

4 recruiters
Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
pandas
NumPy
SQL
HTML/CSS
+4 more

Specific Knowledge/Skills


  1. 4-6 years of experience
  2. Proficiency in Python programming.
  3. Basic knowledge of front-end development.
  4. Basic knowledge of data manipulation and analysis libraries
  5. Code versioning and collaboration (Git)
  6. Knowledge of libraries for extracting data from websites
  7. Knowledge of SQL and NoSQL databases
  8. Familiarity with RESTful APIs
  9. Familiarity with Cloud (Azure /AWS) technologies
Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
6 - 9 yrs
₹30L - ₹48L / yr
Python
React.js
NodeJS (Node.js)
TypeScript
ReAct (Reason + Act)
+13 more

Review Criteria:

  • Strong full-stack Software Engineer profile using NodeJS / Python and React
  • 6+ YOE in software development using Python OR NodeJS (for backend) & React (for frontend)
  • Must have strong experience working with TypeScript
  • Must have experience with message-based systems like Kafka, RabbitMQ, Redis
  • Databases - PostgreSQL & NoSQL databases like MongoDB
  • Product Companies Only
  • Tier 1 Engineering Institutes (IIT, NIT, BITS, IIIT, DTU or equivalent)

 

Preferred:

  • Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
  • Experience in mentoring and coaching the team.


Role & Responsibilities:

We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.

 

The Ideal Candidate Will Be Able To-

  • Take ownership of delivering performant, scalable and high-quality cloud-based software, both frontend and backend side.
  • Mentor team members to develop in line with product requirements.
  • Collaborate with Senior Architect for design and technology choices for product development roadmap.
  • Do code reviews.


Ideal Candidate:

  • Thorough knowledge of developing cloud-based software, including backend APIs and a React-based frontend.
  • Thorough knowledge of scalable design patterns and message-based systems such as Kafka, RabbitMQ, Redis, MongoDB, ORM, SQL, etc.
  • Experience with AWS services such as S3, IAM, Lambda, etc.
  • Expert-level coding skills in Python (FastAPI/Django), NodeJs, TypeScript, and ReactJs.
  • An eye for responsive user designs on the frontend.


Perks, Benefits and Work Culture:

  • We prioritize people above all else. While we're recognized for our innovative technology solutions, it's our people who drive our success. That’s why we offer a comprehensive and competitive benefits package designed to support your well-being and growth:
  • Medical Insurance with coverage up to INR 8,00,000 for the employee and their family
CoverSelf Technologies

5 candid answers
Posted by Nikita Sinha
Bengaluru (Bangalore)
5 - 9 yrs
Up to ₹24L / yr (varies)
Selenium
Java
SQL
NOSQL Databases
Selenium WebDriver
+1 more

Qualifications:

  • Must have a Bachelor’s degree in computer science or equivalent.
  • Must have at least 5 years’ experience as an SDET.
  • At least 1 year of leadership experience or managing a team.

Responsibilities:

  • Design, develop and execute automation scripts using open-source tools.
  • Troubleshooting any errors and streamlining the testing procedures.
  • Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
  • Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
  • Good time-management skills and commitment to meet deadlines.
  • Stay up-to-date with new testing tools and test strategies.
  • Driving technical projects and providing leadership in an innovative and fast-paced environment.

Requirements:

  • Experience in automation (API and UI) as well as manual testing of web applications.
  • Experience in frameworks like Playwright / Selenium WebDriver / Robot Framework / Rest-Assured.
  • Must be proficient in Performance Testing tools like K6 / Gatling / JMeter.
  • Must be proficient in Core Java / TypeScript and Java 17.
  • Experience in JUnit-5.
  • Good to have TypeScript experience.
  • Good to have RPA Experience using Java or any other tools like Robot Framework / Automation Anywhere.
  • Experience in SQL (like MySQL, PG) & No-SQL Database (like MongoDB).
  • Good understanding of software & systems architecture.
  • Well acquainted with Agile Methodology, Software Development Life Cycle (SDLC), Software Test Life Cycle (STLC) and Automation Test Life Cycle.
  • Strong experience in REST-based component testing, back-end, DB, and microservices testing.
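The REST-component-testing requirement can be sketched as asserting on a handler's status code and payload; `create_claim_handler` is a hypothetical stand-in for a real endpoint, and actual suites would use Rest-Assured or Playwright's request API.

```python
# API-test sketch: exercise a handler's happy path and validation path.

def create_claim_handler(body):
    """Hypothetical endpoint: validates input and echoes a created claim."""
    if "claim_id" not in body:
        return 400, {"error": "claim_id required"}
    return 201, {"claim_id": body["claim_id"], "status": "created"}

status, payload = create_claim_handler({"claim_id": "C-1"})
bad_status, bad_payload = create_claim_handler({})
```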


Work Location: Jayanagar - Bangalore.

Deqode

1 recruiter
Posted by purvisha Bhavsar
Indore, Pune, Bhopal, Mumbai, Nagpur, Kolkata, Bengaluru (Bangalore), Chennai
4 - 6 yrs
₹4.5L - ₹18L / yr
Java
Spring Boot
Microservices
SQL

🚀 Hiring: Java Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Requirements

✅ Strong proficiency in Java (Java 8/11/17)

✅ Experience with Spring / Spring Boot

✅ Knowledge of REST APIs, Microservices architecture

✅ Familiarity with SQL/NoSQL databases

✅ Understanding of Git, CI/CD pipelines

✅ Problem-solving skills and attention to detail


Wissen Technology

4 recruiters
Posted by Shivangi Bhattacharyya
Bengaluru (Bangalore)
6 - 10 yrs
Best in industry
Python
Generative AI
Machine Learning (ML)
SQL
Business Intelligence (BI)
+1 more

Job Description: 


Exp Range: 6 to 10 years


Qualifications:


  • Minimum Bachelor’s degree in Engineering, Computer Applications, or AI/Data Science
  • Experience working in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years.
  • Prior experience with Python, NumPy, scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred


Required Skills:

  • Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
  • Must Have – Experience working with ML Libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, NumPy etc.
  • Must Have – Experience working with models such as Random Forest, K-means clustering, BERT, etc.
  • Should Have – Exposure to querying warehouses and APIs
  • Should Have – Experience with writing moderate to complex SQL queries
  • Should Have – Experience analyzing and presenting data with BI tools or Excel
  • Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
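As a toy illustration of the K-means item in the list above, here is a single assignment step in plain Python (1-D points; a real project would use scikit-learn's `KMeans`):

```python
# K-means sketch: assign each 1-D point to its nearest centroid.

def assign(points, centroids):
    """Return, for each point, the index of its nearest centroid."""
    return [min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            for p in points]

points = [1.0, 1.2, 9.8, 10.1]
centroids = [1.0, 10.0]
labels = assign(points, centroids)
```

Full K-means alternates this assignment step with recomputing each centroid as the mean of its assigned points.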

 

Roles and Responsibilities:

  • Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
  • Analyse and present insights about the data and processes to Business Stakeholders
  • Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
  • Develop and deploy customized models on Production data sets to generate analytical insights and predictions
  • Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
  • Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
  • Share knowledge and best practices with broader teams to make everyone aware and more productive.


Aryush Infotech India Pvt Ltd
Posted by Nitin Gupta
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹4L / yr
Fintech
Test Automation (QA)
Manual testing
Postman
JIRA
+5 more

Job Title: QA Tester – FinTech (Manual + Automation Testing)

Location: Bangalore, India

Job Type: Full-Time

Experience Required: 3 Years

Industry: FinTech / Financial Services

Function: Quality Assurance / Software Testing

 

About the Role:

We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.

 

Key Responsibilities:

  • Analyze business and functional requirements for financial products and translate them into test scenarios.
  • Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
  • Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
  • Conduct API testing using Postman, Rest Assured, or similar tools.
  • Perform functional, regression, integration, and system testing across web and mobile platforms.
  • Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
  • Log and track defects using JIRA or a similar defect management tool.
  • Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
  • Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
  • Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.

 

Required Skills and Experience:

  • 3+ years of hands-on experience in manual and automation testing.
  • Solid understanding of QA methodologies, STLC, and SDLC.
  • Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
  • Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
  • Knowledge of API testing, including RESTful services.
  • Familiarity with SQL to validate data in databases.
  • Understanding of CI/CD processes and basic scripting for automation integration.
  • Good problem-solving skills and attention to detail.
  • Excellent communication and documentation skills.

 

Preferred Qualifications:

  • Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
  • Experience with mobile app testing (iOS/Android).
  • Working knowledge of test management tools like TestRail, Zephyr, or Xray.
  • Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
  • Basic knowledge of version control systems (e.g., Git).


AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data engineering
Data architecture
SQL
Data modeling
GCS
+47 more

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data Lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.