SQL Jobs in Bangalore (Bengaluru)

50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)

Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Wissen Technology

at Wissen Technology

4 recruiters
Shivangi Bhattacharyya
Posted by Shivangi Bhattacharyya
Bengaluru (Bangalore)
6 - 10 yrs
Best in industry
Python
Generative AI
Machine Learning (ML)
SQL
Business Intelligence (BI)
+1 more

Job Description: 


Experience Range: 6 to 10 years


Qualifications:


  • Minimum Bachelor's degree in Engineering, Computer Applications, or AI/Data Science
  • Experience in product companies/startups developing, validating, and productionizing AI models in projects within the last 3 years.
  • Prior experience with Python, NumPy, scikit-learn, pandas, ETL/SQL, and BI tools preferred


Required Skills: 

  • Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
  • Must Have – Experience working with ML libraries such as scikit-learn, TensorFlow, PyTorch, pandas, NumPy, etc.
  • Must Have – Experience working with models such as Random Forest, K-means clustering, BERT… (a minimal sketch follows this list)
  • Should Have – Exposure to querying warehouses and APIs
  • Should Have – Experience with writing moderate to complex SQL queries
  • Should Have – Experience analyzing and presenting data with BI tools or Excel
  • Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
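
For illustration only, a minimal sketch of the pandas + scikit-learn workflow the skills above describe (training and validating a Random Forest); the file and column names are assumptions, not part of this posting:

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("training_data.csv")                      # hypothetical dataset
X, y = df.drop(columns=["label"]), df["label"]             # hypothetical feature/label split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_test, model.predict(X_test)))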

 

Roles and Responsibilities:

  • Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
  • Analyse and present insights about the data and processes to Business Stakeholders
  • Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
  • Develop and deploy customized models on Production data sets to generate analytical insights and predictions
  • Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
  • Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
  • Share knowledge and best practices with broader teams to make everyone aware and more productive.


Industry Automation

Agency job
via Michael Page by Pramod P
Bengaluru (Bangalore)
5 - 9 yrs
₹20L - ₹30L / yr
C#
Microsoft Windows Azure
API
SQL
NOSQL Databases
+3 more

Your job:

• Develop and maintain software components, including APIs and microservices

• Optimize backend systems on Microsoft Azure using App Services, Functions, and AzureSQL

• Contribute to frontend development as needed in a full-stack capacity

• Participate in code reviews, unit testing, and bug fixing to ensure high code quality

• Collaborate with the development team, product owner, and DevOps team in agile projects

• Maintain clear and comprehensive technical documentation for all features and APIs


Your qualification:

• Master’s or bachelor’s degree in computer science

• 5 to 8 years of experience in backend web application development

• Expertise in backend technologies such as C#/.NET Core and in databases, including SQL and NoSQL (AzureSQL, Cosmos DB)

• Experience with Microsoft Azure services (App Services, Functions, SQL) and familiarity with frontend technologies (JavaScript/TypeScript and/or Angular) would be an added advantage

• Proficiency in cloud-based backend development, full-stack development, and software optimization

• Experience with agile methodologies, unit testing, automated testing, and CI/CD pipelines would be beneficial

• Excellent written and spoken English communication skills

Aryush Infotech India Pvt Ltd
Nitin Gupta
Posted by Nitin Gupta
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹4L / yr
Fintech
Test Automation (QA)
Manual testing
Postman
JIRA
+5 more

Job Title: QA Tester – FinTech (Manual + Automation Testing)

Location: Bangalore, India

Job Type: Full-Time

Experience Required: 3 Years

Industry: FinTech / Financial Services

Function: Quality Assurance / Software Testing

 

About the Role:

We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.

 

Key Responsibilities:

  • Analyze business and functional requirements for financial products and translate them into test scenarios.
  • Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
  • Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
  • Conduct API testing using Postman, Rest Assured, or similar tools (a minimal request-level sketch follows this list).
  • Perform functional, regression, integration, and system testing across web and mobile platforms.
  • Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
  • Log and track defects using JIRA or a similar defect management tool.
  • Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
  • Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
  • Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
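
As a purely illustrative example of the API testing mentioned above, a minimal pytest-style check using Python's requests library; the endpoint and response fields are placeholders, not a real system:

import requests

BASE_URL = "https://api.example.com"                       # placeholder endpoint

def test_transaction_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/transactions/123", timeout=10)
    assert resp.status_code == 200                          # basic contract check
    body = resp.json()
    for field in ("id", "amount", "status"):                # assumed response fields
        assert field in body, f"missing field: {field}"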

 

Required Skills and Experience:

  • 3+ years of hands-on experience in manual and automation testing.
  • Solid understanding of QA methodologies, STLC, and SDLC.
  • Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
  • Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
  • Knowledge of API testing, including RESTful services.
  • Familiarity with SQL to validate data in databases.
  • Understanding of CI/CD processes and basic scripting for automation integration.
  • Good problem-solving skills and attention to detail.
  • Excellent communication and documentation skills.

 

Preferred Qualifications:

  • Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
  • Experience with mobile app testing (iOS/Android).
  • Working knowledge of test management tools like TestRail, Zephyr, or Xray.
  • Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
  • Basic knowledge of version control systems (e.g., Git).


AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 10 yrs
₹20L - ₹45L / yr
Dremio
Lakehouse
Data architecture
Data engineering
SQL
+48 more

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data Lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a minimal Parquet-reading sketch follows this list.
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
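
For illustration, a minimal sketch of the columnar access pattern such a lakehouse relies on: column pruning and filter pushdown on a Parquet dataset via PyArrow. The bucket path, columns, and filter are assumptions:

import pyarrow.dataset as ds

dataset = ds.dataset("s3://example-bucket/sales/", format="parquet")   # placeholder location
table = dataset.to_table(
    columns=["order_id", "region", "amount"],      # read only the needed columns
    filter=ds.field("region") == "APAC",           # push the predicate down to the scan
)
print(table.num_rows)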


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 5 yrs
₹4L - ₹8L / yr
Python
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes (a minimal health-check sketch follows this list).
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
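
A minimal sketch of the kind of health-check script described above, combining an API probe with a database cross-check; the endpoint, database file, and table schema are assumptions:

import sqlite3
import requests

def api_healthy(url="https://api.example.com/health"):      # placeholder endpoint
    return requests.get(url, timeout=5).status_code == 200

def failed_job_count(db_path="app.db"):                      # placeholder SQLite database
    with sqlite3.connect(db_path) as conn:
        (count,) = conn.execute(
            "SELECT COUNT(*) FROM jobs WHERE status = 'FAILED'"   # assumed schema
        ).fetchone()
    return count

if __name__ == "__main__":
    print("API healthy:", api_healthy())
    print("Failed jobs:", failed_job_count())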

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other scripting languages like Bash or Shell
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)


LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 4 yrs
₹4L - ₹10L / yr
Python
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other scripting languages like Bash or Shell
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)


Bengaluru (Bangalore)
1 - 4 yrs
₹5L - ₹15L / yr
Django
Flask
HTML/CSS
SQL

Job Responsibilities :


- Work closely with product managers and other cross functional teams to help define, scope and deliver world-class products and high quality features addressing key user needs.


- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.


- Ability to take full ownership of the software development lifecycle from requirement to release.


- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.


- Embrace design and code reviews to deliver quality code.


- Play a key role in taking Trendlyne to the next level as a world-class engineering team


-Develop and iterate on best practices for the development team, ensuring adherence through code reviews.


- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.


- Develop and maintain scalable, robust backend systems using Python and Django framework.


- Proficient understanding of the performance of web and mobile applications.


- Mentor junior developers and foster skill development within the team.


Job Requirements :


- 1+ years of experience with Python and Django (a minimal illustrative sketch follows these requirements).


- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.


- (Optional): Experience with web front-end technologies such as JavaScript, HTML, and CSS
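
Purely as an illustration of the Python/Django requirement above, a minimal model-plus-view sketch; the model and field names are assumptions, not Trendlyne's actual schema:

from django.db import models
from django.http import JsonResponse

class StockScore(models.Model):
    symbol = models.CharField(max_length=20, db_index=True)   # indexed lookup column
    score = models.FloatField()
    updated_at = models.DateTimeField(auto_now=True)

def top_scores(request):
    rows = (StockScore.objects
            .order_by("-score")
            .values("symbol", "score")[:50])                   # keep the response small
    return JsonResponse({"results": list(rows)})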


Who are we :


Trendlyne is a Series-A product startup in the financial markets space, with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.


Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.


What do we do :


We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We serve 900 million+ API requests every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations which are an industry benchmark.

Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
PHP
Python
Data Visualization
Data Structures
+5 more

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders (a minimal preparation sketch follows this list)
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods
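
As an illustrative sketch of the collect-clean-report loop above, using pandas; the file and column names are assumptions:

import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["order_date"])    # hypothetical extract
df = df.dropna(subset=["region", "amount"])                   # basic cleaning
monthly = (df.groupby([df["order_date"].dt.to_period("M"), "region"])["amount"]
             .agg(["count", "sum", "mean"])
             .reset_index())
monthly.to_csv("monthly_summary.csv", index=False)            # feed for a Power BI/Tableau dashboard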


Uni Cards

at Uni Cards

4 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
1yr+
Up to ₹22L / yr (varies)
SQL
Stakeholder management
Agile/Scrum
JIRA
Asana
+5 more

We’re looking for a Program Manager-1 to join our Growth team: someone who thrives in fast-paced environments and can turn user insights into measurable impact. You’ll work across product and business functions to drive growth, optimize funnels, and enhance the user journey.


What You’ll Do

  • Own parts of the user journey and drive improvements across acquisition, activation, and retention funnels.
  • Partner with Product, Marketing, Engineering, and Design teams to identify growth opportunities and execute data-backed experiments.
  • Use data and user insights to pinpoint drop-offs and design solutions that improve conversion.
  •   Build, track, and measure growth metrics and KPIs.
  • Bring structure and clarity to ambiguous problems and drive alignment across teams.
  •   Stay on top of product trends and best practices to inspire new growth ideas.


What We’re Looking For

  • Graduate from a Tier 1 institute (IITs, IIMs, ISB, BITS, etc.)
  • 2 - 2.5 years of experience, preferably in a B2C startup (not early-stage).
  • Exposure to digital products or services is a plus.
  • Experience working closely with product and business teams.
  • Strong analytical skills and structured thinking
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
.NET
C#
SQL

Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.

Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr
.NET
Angular (2+)
Windows Azure
SQL
C#
+3 more

Skills required:

  • Strong expertise in .NET Core / ASP.NET MVC
  • Candidate must have 4+ years of experience in .NET.
  • Candidate must have experience with Angular.
  • Hands-on experience with Entity Framework & LINQ
  • Experience with SQL Server (performance tuning, stored procedures, indexing)
  • Understanding of multi-tenancy architecture
  • Experience with Microservices / API development (REST, GraphQL)
  • Hands-on experience in Azure Services (App Services, Azure SQL, Blob Storage, Key Vault, Functions, etc.)
  • Experience in CI/CD pipelines with Azure DevOps
  • Knowledge of security best practices in cloud-based applications
  • Familiarity with Agile/Scrum methodologies
  • Comfortable using Copilot or other AI tools to write automated test cases and speed up coding

Roles and Responsibilities:

- Good communication skills are a must.

- Develop features across multiple subsystems within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, and deployment.

- Understand how our applications operate, are structured, and how customers use them

- Provide engineering support (when necessary) to our technical operations staff when they are building, deploying, configuring, and supporting systems for customers.

Nuware Systems
Bengaluru (Bangalore)
5 - 10 yrs
Up to ₹25L / yr (varies)
UFT
Software Testing (QA)
SQL
Shell Scripting
Manual testing

About Nuware

NuWare is a global technology and IT services company built on the belief that organizations require transformational strategies to scale, grow and build into the future owing to a dynamically evolving ecosystem. We strive towards our clients’ success in today’s hyper-competitive market by servicing their needs with next-gen technologies - AI/ML, NLP, chatbots, digital and automation tools.


We empower businesses to enhance their competencies, processes and technologies to fully leverage opportunities and accelerate impact. Through our focus on market differentiation and innovation - we offer services that are agile, streamlined, efficient and customer-centric.


Headquartered in Iselin, NJ, NuWare has been creating business value and generating growth opportunities for clients through its network of partners, global resources, highly skilled talent and SME’s for 25 years. NuWare is technology agnostic and offers services for Systems Integration, Cloud, Infrastructure Management, Mobility, Test automation, Data Sciences and Social & Big Data Analytics.


Skills Required

  • Automation testing with UFT, strong SQL skills, and good communication skills
  • 5 years of experience in automation testing
  • Experience with UFT for at least 3 years
  • Good knowledge of VB Scripting
  • Knowledge of Manual testing
  • Knowledge of automation frameworks
Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
SaaS
Software implementation
Customer Success
Implementation
Tech Support
+8 more

Review Criteria

  • Strong Implementation Manager / Customer Success Implementation / Technical Solutions / Post-Sales SaaS Delivery
  • 3+ years of hands-on experience in software/tech Implementation roles within technical B2B SaaS companies, preferably working with global or US-based clients
  • Must have direct experience leading end-to-end SaaS product implementations — including onboarding, workflow configuration, API integrations, data setup, and customer training
  • Must have strong technical understanding — including ability to read and write basic SQL queries, debug API workflows, and interpret JSON payloads for troubleshooting or configuration validation (a minimal sketch follows this list).
  • Must have worked in post-sales environments, owning customer success and delivery after deal closure, ensuring product adoption, accurate setup, and smooth go-live.
  • Must have experience collaborating cross-functionally with product, engineering, and sales teams to ensure timely resolution of implementation blockers and seamless client onboarding.
  • (Company): B2B SaaS startup or growth-stage company
  • Mandatory (Note): Good growth opportunity; this role will have a team-leading option after a few months
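
A hedged sketch of the troubleshooting skill called out above: inspecting a JSON payload and validating it against the database with a basic SQL query. The payload shape, table, and columns are assumptions:

import json
import sqlite3

payload = json.loads('{"customer_id": 42, "plan": "enterprise", "seats": 120}')

with sqlite3.connect("billing.db") as conn:                   # placeholder database
    row = conn.execute(
        "SELECT plan, seats FROM subscriptions WHERE customer_id = ?",   # assumed schema
        (payload["customer_id"],),
    ).fetchone()

if row != (payload["plan"], payload["seats"]):
    print("Mismatch between API payload and DB configuration:", payload, row)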


Preferred

  • Preferred (Experience): Previous experience in FinTech SaaS like BillingTech, finance automation, or subscription management platforms will be a strong plus


Job Specific Criteria

  • CV Attachment is mandatory
  • Are you open to work in US timings (4/5:00 PM - 3:00 AM) - to target the US market?
  • Please provide CTC Breakup (Fixed + Variable)?
  • It’s a hybrid role with 1-3 days work from office (Indiranagar), with in-office hours 3:00 pm to 10:00 pm IST; are you ok with hybrid mode?

 

Role & Responsibilities

As the new hire in this role, you'll be the voice of the customer in the company, and lead the charge in developing our customer-centric approach, working closely with our tech, design, and product teams.

 

What you will be doing:

You will be responsible for converting, onboarding, managing, and proactively ensuring success for our customers/prospective clients.

  • Implementation
  • Understand client billing models and configure company contracts, pricing, metering, and invoicing accurately.
  • Lead pilots and implementation for new customers, ensuring complete onboarding within 3–8 weeks.
  • Translate complex business requirements into structured company workflows and setup.
  • Pre-sales & Technical Discovery
  • Support sales with live demos, sandbox setups, and RFP responses.
  • Participate in technical discovery calls to map company capabilities to client needs.
  • Create and maintain demo environments showcasing relevant use cases.
  • Internal Coordination & Escalation
  • Act as the voice of the customer internally — share structured feedback with product and engineering.
  • Create clear, well-scoped handoff documents when working with technical teams.
  • Escalate time-sensitive issues appropriately and follow through on resolution.
  • Documentation & Enablement
  • Create client-specific documentation (e.g., onboarding guides, configuration references).
  • Contribute to internal wikis, training material, and product documentation.
  • Write simple, to-the-point communication — clear enough for a CXO and detailed enough for a developer.

 

Ideal Candidate

  • 3-7 years of relevant experience
  • Willing to work in US time zone (~4:30 am IST) on weekdays (Mon-Fri)
  • Ability to understand and shape the product at a granular level
  • Ability to empathize with the customers, and understand their pain points
  • Understanding of SaaS architecture and APIs conceptually — ability to debug API workflows and usage issues
  • Previous experience with Salesforce CRM
  • Entrepreneurial drive, and willingness to wear multiple hats as per company’s requirements
  • Strong analytical skills and a structured problem-solving approach
  • (Strongly preferred) Computer science background and basic coding experience
  • Ability to understand functional aspects related to the product e.g., accounting/revenue recognition, receivables, billing etc
  • Self-motivated and proactive in managing tasks and responsibilities, requiring minimal follow-ups.
  • Self-driven individual with high ownership and strong work ethic
  • Not taking yourself too seriously.


LogIQ Labs Pvt.Ltd.
Bengaluru (Bangalore), Pune, Hyderabad, Noida
3 - 5 yrs
₹4L - ₹10L / yr
Playwright
SQL

Functional Testing & Validation

  • Web Application Testing: Design, document, and execute comprehensive functional test plans and test cases for complex, highly interactive web applications, ensuring they meet specified requirements and provide an excellent user experience.
  • Backend API Testing: Possess deep expertise in validating backend RESTful and/or SOAP APIs. This includes testing request/response payloads, status codes, data integrity, security, and robust error handling mechanisms.
  • Data Validation with SQL: Write and execute complex SQL queries (joins, aggregations, conditional logic) to perform backend data checks, verify application states, and ensure data integrity across integration points.
  • UI Automation (Playwright & TypeScript):
  • Design, develop, and maintain robust, scalable, and reusable UI automation scripts using Playwright and TypeScript (a minimal sketch follows this list).
  • Integrate automation suites into Continuous Integration/Continuous Deployment (CI/CD) pipelines.
  • Implement advanced automation patterns and frameworks (e.g., Page Object Model) to enhance maintainability.
  • Prompt-Based Automation: Demonstrate familiarity or hands-on experience with emerging AI-driven or prompt-based automation approaches and tools to accelerate test case generation and execution.
  • API Automation: Develop and maintain automated test suites for APIs to ensure reliability and performance.
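
The posting asks for Playwright with TypeScript; purely to keep the examples on this page in one language, here is the same login-flow idea sketched with Playwright's Python bindings. The URL and selectors are assumptions:

from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://app.example.com/login")        # placeholder URL
    page.fill("#username", "qa_user")                 # placeholder selectors and credentials
    page.fill("#password", "not-a-real-password")
    page.click("button[type=submit]")
    assert "Dashboard" in page.title()                # simple post-login assertion
    browser.close()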

Performance & Load Testing

  • JMeter Proficiency: Utilize Apache JMeter to design, script, and execute robust API load testing and stress testing scenarios.
  • Analyse performance metrics, identify bottlenecks (e.g., response time, throughput), and provide actionable reports to development teams.


🛠️ Required Skills and Qualifications

  • Experience: 4+ years of professional experience in Quality Assurance and Software Testing, with a strong focus on automation.
  • Automation Stack: Expert-level proficiency in developing and maintaining automation scripts using Playwright and TypeScript.
  • Testing Tools: Proven experience with API testing tools (e.g., Postman, Swagger) and strong functional testing methodologies.
  • Database Skills: Highly proficient in writing and executing complex SQL queries for data validation and backend verification.
  • Performance: Hands-on experience with Apache JMeter for API performance and load testing.
  • Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams (Developers, Product Managers).
  • Problem-Solving: Strong analytical and debugging skills to efficiently isolate and report defects.


AryuPay Technologies
Bhavana Chaudhari
Posted by Bhavana Chaudhari
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹5L / yr
Search Engine Optimization (SEO)
SQL
On-page Optimization
Off-page SEO
Google Analytics
+3 more

Job Description – SEO Specialist

Company: Capace Software Pvt. Ltd.

Location: Bhopal / Bangalore (On-site)

Experience: 2+ Years

Budget: Up to ₹4 LPA

Position: Full-Time


About the Role

Capace Software Pvt. Ltd. is looking for a skilled SEO Specialist with strong expertise in On-Page SEO, Off-Page SEO, and Technical SEO. The ideal candidate will be responsible for improving our search engine ranking, driving organic traffic, and ensuring technical search requirements are met across websites.


Key Responsibilities

🔹 On-Page SEO

  • Optimize meta titles, descriptions, header tags, and URLs
  • Conduct in-depth keyword research and implement strategic keyword placement
  • Optimize website content for relevancy and readability
  • Implement internal linking strategies
  • Optimize images, schema, and site structure for SEO
  • Ensure webpages follow SEO best practices

🔹 Off-Page SEO

  • Create and execute backlink strategies
  • Manage directory submissions, social bookmarking, classified listings
  • Conduct competitor backlink analysis
  • Build high-quality guest post links and outreach
  • Improve brand visibility through digital promotions


🔹 Technical SEO

  • Conduct website audits (crawl errors, index issues, technical fixes)
  • Optimize website speed and performance
  • Implement schema markup and structured data
  • Manage XML sitemaps and robots.txt
  • Resolve indexing, crawling, and canonical issues
  • Work with developers to implement technical updates


Requirements

  • Minimum 2+ years of experience in SEO
  • Strong knowledge of On-Page, Off-Page & Technical SEO
  • Experience with tools like:
  • Google Analytics
  • Google Search Console
  • Ahrefs / SEMrush / Ubersuggest
  • Screaming Frog (good to have)
  • Understanding of HTML, CSS basics (preferred)
  • Strong analytical and reporting skills
  • Good communication and documentation skills


What We Offer

  • Competitive salary up to ₹4 LPA
  • Opportunity to work on multiple SaaS products and websites
  • Supportive team & learning-focused environment
  • Career growth in digital marketing & SEO domain
Tarento Group

at Tarento Group

3 candid answers
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
4yrs+
Best in industry
Java
Spring Boot
Microservices
Windows Azure
RESTful APIs
+5 more

Job Summary:

We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.

Key Responsibilities:

  • Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
  • Implement and maintain RESTful APIs, ensuring high performance and scalability.
  • Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
  • Develop and manage Docker containers, enabling efficient development and deployment pipelines.
  • Integrate messaging services like Apache Kafka into microservice architectures.
  • Design and maintain data models using PostgreSQL or other SQL databases.
  • Implement unit testing using JUnit and mocking frameworks to ensure code quality.
  • Develop and execute API automation tests using Cucumber or similar tools.
  • Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
  • Work with Kubernetes for orchestrating containerized services.
  • Utilize Couchbase or similar NoSQL technologies when necessary.
  • Participate in code reviews, design discussions, and contribute to best practices and standards.

Required Skills & Qualifications:

  • Strong experience in Java (11 or above) and Spring Boot framework.
  • Solid understanding of microservices architecture and deployment on Azure.
  • Hands-on experience with Docker, and exposure to Kubernetes.
  • Proficiency in Kafka, with real-world project experience.
  • Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
  • Experience in writing unit tests using JUnit and mocking tools.
  • Experience with Cucumber or similar frameworks for API automation testing.
  • Exposure to CI/CD tools, DevOps processes, and Git-based workflows.

Nice to Have:

  • Azure certifications (e.g., Azure Developer Associate)
  • Familiarity with Couchbase or other NoSQL databases.
  • Familiarity with other cloud providers (AWS, GCP)
  • Knowledge of observability tools (Prometheus, Grafana, ELK)

Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication.
  • Ability to work in an agile environment and contribute to continuous improvement.

Why Join Us:

  • Work on cutting-edge microservice architectures
  • Strong learning and development culture
  • Opportunity to innovate and influence technical decisions
  • Collaborative and inclusive work environment
Bengaluru (Bangalore)
6 - 10 yrs
₹15L - ₹28L / yr
Business Analysis
Data integration
SQL
PMS
CRS
+2 more

Job Description: Business Analyst – Data Integrations

Location: Bangalore / Hybrid / Remote

Company: LodgIQ

Industry: Hospitality / SaaS / Machine Learning

About LodgIQ

Headquartered in New York, LodgIQ delivers a revolutionary B2B SaaS platform to the travel industry. By leveraging machine learning and artificial intelligence, we enable precise forecasting and optimized pricing for hotel revenue management. Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded, high-growth startup with a global presence.

About the Role

We’re looking for a skilled Business Analyst – Data Integrations who can bridge the gap between business operations and technology teams, ensuring smooth, efficient, and scalable integrations. If you’re passionate about hospitality tech and enjoy solving complex data challenges, we’d love to hear from you!

What You’ll Do

Key Responsibilities

  • Collaborate with vendors to gather requirements for API development and ensure technical feasibility.
  • Collect API documentation from vendors; document and explain business logic to use external data sources effectively.
  • Access vendor applications to create and validate sample data; ensure the accuracy and relevance of test datasets.
  • Translate complex business logic into documentation for developers, ensuring clarity for successful integration.
  • Monitor all integration activities and support tickets in Jira, proactively resolving critical issues.
  • Lead QA testing for integrations, overseeing pilot onboarding and ensuring solution viability before broader rollout.
  • Document onboarding processes and best practices to streamline future integrations and improve efficiency.
  • Build, train, and deploy machine learning models for forecasting, pricing, and optimization, supporting strategic goals.
  • Drive end-to-end execution of data integration projects, including scoping, planning, delivery, and stakeholder communication.
  • Gather and translate business requirements into actionable technical specifications, liaising with business and technical teams.
  • Oversee maintenance and enhancement of existing integrations, performing RCA and resolving integration-related issues.
  • Document workflows, processes, and best practices for current and future integration projects.
  • Continuously monitor system performance and scalability, recommending improvements to increase efficiency.
  • Coordinate closely with Operations for onboarding and support, ensuring seamless handover and issue resolution.

Desired Skills & Qualifications

  • Strong experience in API integration, data analysis, and documentation.
  • Familiarity with Jira for ticket management and project workflow.
  • Hands-on experience with machine learning model development and deployment.
  • Excellent communication skills for requirement gathering and stakeholder engagement.
  • Experience with QA test processes and pilot rollouts.
  • Proficiency in project management, data workflow documentation, and system monitoring.
  • Ability to manage multiple integrations simultaneously and work cross-functionally.

Required Qualifications

  • Experience: Minimum 4 years in hotel technology or business analytics, preferably handling data integration or system interoperability projects.
  • Technical Skills:
  • Basic proficiency in SQL or database querying.
  • Familiarity with data integration concepts such as APIs or ETL workflows (preferred but not mandatory).
  • Eagerness to learn and adapt to new tools, platforms, and technologies.
  • Hotel Technology Expertise: Understanding of systems such as PMS, CRS, Channel Managers, or RMS.
  • Project Management: Strong organizational and multitasking abilities.
  • Problem Solving: Analytical thinker capable of troubleshooting and driving resolution.
  • Communication: Excellent written and verbal skills to bridge technical and non-technical discussions.
  • Attention to Detail: Methodical approach to documentation, testing, and deployment.

Preferred Qualification

  • Exposure to debugging tools and troubleshooting methodologies.
  • Familiarity with cloud environments (AWS).
  • Understanding of data security and privacy considerations in the hospitality industry.

Why LodgIQ?

  • Join a fast-growing, mission-driven company transforming the future of hospitality.
  • Work on intellectually challenging problems at the intersection of machine learning, decision science, and human behavior.
  • Be part of a high-impact, collaborative team with the autonomy to drive initiatives from ideation to production.
  • Competitive salary and performance bonuses.
  • For more information, visit https://www.lodgiq.com

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹28L / yr
Databricks
Python
SQL
PySpark
Amazon Web Services (AWS)
+9 more

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.


Skill Examples:

  1. Proficiency in SQL Python or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.

 

Knowledge Examples:

  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  2. Proficient in SQL for analytics and windowing functions.
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.


 

Additional Comments:

# of Resources: 22 | Role(s): Technical Role | Location(s): India | Planned Start Date: 1/1/2026 | Planned End Date: 6/30/2026

Project Overview:

Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.

The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.

Design, build, and maintain scalable data pipelines using Databricks and PySpark.

Develop and optimize complex SQL queries for data extraction, transformation, and analysis.

Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).

Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.

Ensure data quality, performance, and reliability across data workflows.

Participate in code reviews, data architecture discussions, and performance optimization initiatives.

Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.


Key Skills:

Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines (a minimal sketch follows this skills list).

Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).

Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).

Experience with data modeling, schema design, and performance optimization.

Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).

Excellent problem-solving, communication, and collaboration skills.
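
For illustration only, a minimal PySpark pattern matching the skills listed above: read raw Parquet, keep the latest record per key with a window function, and write a curated table. Paths and column names are assumptions, not a real pipeline:

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")      # placeholder path

latest_first = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
curated = (orders
           .withColumn("rn", F.row_number().over(latest_first))
           .filter(F.col("rn") == 1)                                 # keep the latest version per order
           .drop("rn"))

curated.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")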

 

Skills: Databricks, PySpark & Python, SQL, AWS Services

 

Must-Haves

Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)

Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.

Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).

Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).

Experience with data modeling, schema design, and performance optimization.

Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).


******

Notice period - Immediate to 15 days

Location: Bangalore

Mantle Solutions- A Lulu Group Company
Nikita Sinha
Posted by Nikita Sinha
Bangalore (Whitefield)
2 - 4 yrs
Up to ₹20L / yr (varies)
Python
SQL
Machine Learning (ML)
Data Analytics

We are seeking a hands-on eCommerce Analytics & Insights Lead to help establish and scale our newly launched eCommerce business. The ideal candidate is highly data-savvy, understands eCommerce deeply, and can lead KPI definition, performance tracking, insights generation, and data-driven decision-making.

You will work closely with cross-functional teams—Buying, Marketing, Operations, and Technology—to build dashboards, uncover growth opportunities, and guide the evolution of our online channel.


Key Responsibilities

Define & Monitor eCommerce KPIs

  • Set up and track KPIs across the customer journey: traffic, conversion, retention, AOV/basket size, repeat rate, etc.
  • Build KPI frameworks aligned with business goals.

Data Tracking & Infrastructure

  • Partner with marketing, merchandising, operations, and tech teams to define data tracking requirements.
  • Collaborate with eCommerce and data engineering teams to ensure data quality, completeness, and availability.

Dashboards & Reporting

  • Build dashboards and automated reports to track:
  • Overall site performance
  • Category & product performance
  • Marketing ROI and acquisition effectiveness

Insights & Performance Diagnosis

Identify trends, opportunities, and root causes of underperformance in areas such as:

  • Product availability & stock health
  • Pricing & promotions
  • Checkout funnel drop-offs
  • Customer retention & cohort behavior
  • Channel acquisition performance

Conduct:

  • Cohort analysis (a minimal sketch follows this list)
  • Funnel analytics
  • Customer segmentation
  • Basket analysis
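
A minimal pandas sketch of the cohort analysis mentioned above; the input file and columns (customer_id, order_date) are assumptions:

import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])       # hypothetical extract
orders["order_month"] = orders["order_date"].dt.to_period("M")
orders["cohort"] = orders.groupby("customer_id")["order_month"].transform("min")
orders["month_offset"] = (orders["order_month"] - orders["cohort"]).apply(lambda d: d.n)

active = (orders.groupby(["cohort", "month_offset"])["customer_id"]
                .nunique()
                .unstack(fill_value=0))
print(active.div(active[0], axis=0).round(2))                        # share of each cohort still active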

Data-Driven Growth Initiatives

  • Propose and evaluate experiments, optimization ideas, and quick wins.
  • Help business teams interpret KPIs and take informed decisions.

Required Skills & Experience

  • 2–5 years of experience in eCommerce analytics (grocery retail experience preferred).
  • Strong understanding of eCommerce metrics and analytics frameworks (Traffic → Conversion → Repeat → LTV).
  • Proficiency with tools such as:
  • Google Analytics / GA4
  • Excel
  • SQL
  • Power BI or Tableau
  • Experience working with:
  • Digital marketing data
  • CRM and customer data
  • Product/category performance data
  • Ability to convert business questions into analytical tasks and produce clear, actionable insights.
  • Familiarity with:
  • Customer journey mapping
  • Funnel analysis
  • Basket and behavioral analysis
  • Comfortable working in fast-paced, ambiguous, and build-from-scratch environments.
  • Strong communication and stakeholder management skills.
  • Strong technical capability in at least one programming language: SQL or PySpark.

Good to Have

  • Experience with eCommerce platforms (Shopify, Magento, Salesforce Commerce, etc.).
  • Exposure to A/B testing, recommendation engines, or personalization analytics.
  • Knowledge of Python/R for deeper analytics (optional).
  • Experience with tracking setup (GTM, event tagging, pixel/event instrumentation).
Loyalytics

at Loyalytics

2 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 7 yrs
Up to ₹22L / yr (varies)
SQL
PowerBI
Data Analytics
Customer Relationship Management (CRM)

In this role, you will drive and support customer analytics for HP’s online store business across the APJ region. You will lead campaign performance analytics, customer database intelligence, and enable data-driven targeting for automation and trigger programs. Your insights will directly shape customer engagement, marketing strategy, and business decision-making.


You will be part of the International Customer Management team, which focuses on customer strategy, base value, monetization, and brand consideration. As part of HP’s Digital Direct organization, you will support the company’s strategic transformation toward direct-to-customer excellence.


Join HP—a US$50B global technology leader known for innovation and being #1 in several business domains.


Key Responsibilities

Customer Insights & Analytics

  • Design and deploy customer success and engagement metrics across APJ.
  • Analyze customer behavior and engagement to drive data-backed marketing decisions.
  • Apply statistical techniques to translate raw data into meaningful insights.

Campaign Performance & Optimization

  • Elevate marketing campaigns across APJ by enabling advanced targeting criteria, performance monitoring, and test-and-learn frameworks.
  • Conduct campaign measurement, identifying trends, patterns, and optimization opportunities.

Data Management & Reporting

  • Develop a deep understanding of business data across markets.
  • Build and maintain SQL-based data assets: tables, stored procedures, scripts, queries, and SQL views.
  • Provide reporting and dashboards for marketing, sales, and CRM teams using Tableau or Power BI.
  • Measure and monitor strategic initiatives against KPIs and provide uplift forecasts for prioritization.

Required Experience

  • 4+ years of relevant experience (flexible for strong profiles).
  • Proficiency in SQL, including:
  • Database design principles
  • Query optimization
  • Data integrity checks
  • Building SQL views, stored procedures, and analytics-ready datasets (a minimal view sketch follows this list)
  • Experience translating analytics into business outcomes.
  • Hands-on experience analyzing campaign performance.
  • Expertise with data visualization tools such as Tableau or Power BI.
  • Experience with campaign management/marketing automation platforms (preferably Salesforce Marketing Cloud).
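
As a hedged illustration of the "analytics-ready SQL views" item above, a small campaign-performance view; sqlite3 keeps the sketch self-contained, and the table and column names are assumptions:

import sqlite3

conn = sqlite3.connect("marketing.db")                               # placeholder database
conn.execute(
    "CREATE TABLE IF NOT EXISTS campaign_events "
    "(campaign_id TEXT, opened INTEGER, clicked INTEGER)"            # assumed event table
)
conn.executescript("""
CREATE VIEW IF NOT EXISTS campaign_performance AS
SELECT campaign_id,
       COUNT(*)                                AS sends,
       SUM(opened)                             AS opens,
       SUM(clicked)                            AS clicks,
       ROUND(1.0 * SUM(clicked) / COUNT(*), 4) AS click_rate
FROM campaign_events
GROUP BY campaign_id;
""")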

About You

  • Strong advocate of customer data–driven marketing.
  • Comfortable working hands-on with data and solving complex problems.
  • Confident communicator who can work with multiple cross-functional stakeholders.
  • Passionate about experimentation (test & learn) and continuous improvement.
  • Self-driven, accountable, and motivated by ownership.
  • Thrive in a diverse, international, dynamic environment.


Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Kochi (Cochin), Trivandrum, Hyderabad, Thiruvananthapuram
8 - 10 yrs
₹10L - ₹25L / yr
Business Analysis
Data Visualization
PowerBI
SQL
Tableau
+18 more

Job Description – Senior Technical Business Analyst

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings - an 8-hour window between 7:30 PM IST and 4:30 AM IST

 

About the Role

We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for professionals with a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.

As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.

 

Key Responsibilities

Business & Analytical Responsibilities

  • Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
  • Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
  • Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
  • Break down business needs into concise, actionable, and development-ready user stories in Jira.

Data & Technical Responsibilities

  • Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
  • Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
  • Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
  • Validate and ensure data quality, consistency, and accuracy across datasets and systems.
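As a hedged illustration of the EDA and data-quality responsibilities above, the sketch below profiles a small invented orders dataset with pandas; the column names and checks are assumptions, not a prescribed workflow.

```python
import pandas as pd

# Invented sample data standing in for a source extract.
orders = pd.DataFrame({
    "order_id": [101, 102, 103, 104, 104],           # note the duplicate id
    "region":   ["APAC", "EMEA", "APAC", None, "AMER"],
    "amount":   [250.0, 120.5, None, 90.0, 300.0],
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-01-06", "2024-01-06", "2024-01-07", "2024-01-07"]
    ),
})

# Basic profiling: shape, dtypes, and summary statistics.
print(orders.describe(include="all"))

# Data-quality checks that would feed a validation report.
quality_report = {
    "row_count":        len(orders),
    "duplicate_ids":    int(orders["order_id"].duplicated().sum()),
    "null_region_rows": int(orders["region"].isna().sum()),
    "null_amount_rows": int(orders["amount"].isna().sum()),
}
print(quality_report)

# A simple trend cut: revenue per region, ignoring missing amounts.
print(orders.groupby("region", dropna=True)["amount"].sum().sort_values(ascending=False))
```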

Collaboration & Execution

  • Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
  • Assist in development, testing, and rollout of data-driven solutions.
  • Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.

 

Required Skillsets

Core Technical Skills

  • 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
  • Data Analytics: SQL, descriptive analytics, business problem framing.
  • Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
  • Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
  • Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.

 

Soft Skills

  • Strong analytical thinking and structured problem-solving capability.
  • Ability to convert business problems into clear technical requirements.
  • Excellent communication, documentation, and presentation skills.
  • High curiosity, adaptability, and eagerness to learn new tools and techniques.

 

Educational Qualifications

  • BE/B.Tech or equivalent in:
  • Computer Science / IT
  • Data Science

 

What We Look For

  • Demonstrated passion for data and analytics through projects and certifications.
  • Strong commitment to continuous learning and innovation.
  • Ability to work both independently and in collaborative team environments.
  • Passion for solving business problems using data-driven approaches.
  • Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.

 

Why Join Us?

  • Exposure to modern data platforms, analytics tools, and AI technologies.
  • A culture that promotes innovation, ownership, and continuous learning.
  • Supportive environment to build a strong career in data and analytics.

 

Skills: Data Analytics, Business Analysis, SQL


Must-Haves

Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R

 

******

Notice period - 0 to 15 days (Max 30 Days)

Educational Qualifications: BE/B.Tech or equivalent in: (Computer Science / IT) /Data Science

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings - an 8-hour window between 7:30 PM IST and 4:30 AM IST

Read more
Albert Invent

at Albert Invent

4 candid answers
3 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹30L / yr (Varies)
skill iconPython
AWS Lambda
Amazon Redshift
Snowflake schema
SQL

To design, build, and optimize scalable data infrastructure and pipelines that enable efficient data collection, transformation, and analysis across the organization. The Senior Data Engineer will play a key role in driving data architecture decisions, ensuring data quality and availability, and empowering analytics, product, and engineering teams with reliable, well-structured data to support business growth and strategic decision-making.


Responsibilities:

• Develop and maintain SQL and NoSQL databases, ensuring high performance, scalability, and reliability.
• Collaborate with the API team and Data Science team to build robust data pipelines and automations.
• Work closely with stakeholders to understand database requirements and provide technical solutions.
• Optimize database queries and performance tuning to enhance overall system efficiency.
• Implement and maintain data security measures, including access controls and encryption.
• Monitor database systems and troubleshoot issues proactively to ensure uninterrupted service.
• Develop and enforce data quality standards and processes to maintain data integrity.
• Create and maintain documentation for database architecture, processes, and procedures.
• Stay updated with the latest database technologies and best practices to drive continuous improvement.
• Expertise in SQL queries and stored procedures, with the ability to optimize and fine-tune complex queries for performance and efficiency.
• Experience with monitoring and visualization tools such as Grafana to monitor database performance and health.
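The query-tuning responsibility above can be pictured with a minimal, self-contained sketch: using Python's sqlite3, it compares the query plan for a filtered count before and after adding an index. The events table is invented; the same approach applies to EXPLAIN output in PostgreSQL or MySQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, tenant_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (tenant_id, payload) VALUES (?, ?)",
    [(i % 50, f"payload-{i}") for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM events WHERE tenant_id = 7"

# Without an index the planner performs a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# Add an index on the filter column and compare the plan (it should now use the index).
conn.execute("CREATE INDEX idx_events_tenant ON events (tenant_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

print(conn.execute(query).fetchone())
```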


Requirements:

• 4+ years of experience in data engineering, with a focus on large-scale data systems.
• Proven experience designing data models and access patterns across SQL and NoSQL ecosystems.
• Hands-on experience with technologies like PostgreSQL, DynamoDB, S3, GraphQL, or vector databases.
• Proficient in SQL stored procedures with extensive expertise in MySQL schema design, query optimization, and resolvers, along with hands-on experience in building and maintaining data warehouses.
• Strong programming skills in Python or JavaScript, with the ability to write efficient, maintainable code.
• Familiarity with distributed systems, data partitioning, and consistency models.
• Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and debugging production bottlenecks.
• Deep understanding of cloud infrastructure (preferably AWS), including networking, IAM, and cost optimization.
• Prior experience building multi-tenant systems with strict performance and isolation guarantees.
• Excellent communication and collaboration skills to influence cross-functional technical decisions.

Read more
Financial Services Company

Financial Services Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Delhi
3 - 6 yrs
₹10L - ₹25L / yr
Project Management
SQL
JIRA
SQL Query Analyzer
confluence
+23 more

Required Skills: Excellent Communication Skills, Project Management, SQL queries, Expertise with Tools such as Jira, Confluence etc.


Criteria:

  • Candidate must have Project management experience.
  • Candidate must have strong experience in accounting principles, financial workflows, and R2R (Record to Report) processes.
  • Candidate should have an academic background in Commerce or MBA Finance.
  • Candidates must be from a Fintech/ Financial service only.
  • Good experience with SQL and must have MIS experience.
  • Must have experience in Treasury Module.
  • 3+ years of implementation experience is required.
  • Candidate should have Hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
  • Need candidate from Bangalore and Delhi/NCR ONLY.
  • Need Immediate joiner or candidate with up to 30 Days’ Notice period.

 

Description

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy company solutions for R2R, treasury, and reporting workflows.
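Purely as an illustration of the complex SQL for data analysis and validation described above, here is a minimal sketch of a CTE-based reconciliation between an invented ledger table and a bank-statement table, run through Python's sqlite3; all names and rules are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ledger_entries (txn_id TEXT, amount REAL);
CREATE TABLE bank_statement (txn_id TEXT, amount REAL);

INSERT INTO ledger_entries VALUES ('T1', 100.0), ('T2', 250.0), ('T3', 75.5);
INSERT INTO bank_statement VALUES ('T1', 100.0), ('T2', 240.0);
""")

# CTEs aggregate both sides, then a LEFT JOIN surfaces missing or mismatched amounts.
validation_sql = """
WITH ledger AS (
    SELECT txn_id, SUM(amount) AS ledger_amount FROM ledger_entries GROUP BY txn_id
),
bank AS (
    SELECT txn_id, SUM(amount) AS bank_amount FROM bank_statement GROUP BY txn_id
)
SELECT l.txn_id,
       l.ledger_amount,
       b.bank_amount,
       CASE
           WHEN b.txn_id IS NULL THEN 'missing_in_bank'
           WHEN l.ledger_amount <> b.bank_amount THEN 'amount_mismatch'
           ELSE 'ok'
       END AS status
FROM ledger l
LEFT JOIN bank b ON b.txn_id = l.txn_id
ORDER BY l.txn_id;
"""

for row in conn.execute(validation_sql):
    print(row)
```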


Required Qualifications

  • Bachelor’s degree with a Commerce background / MBA Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success


Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications



Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

 

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

 

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture


 


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Swet Patel
Posted by Swet Patel
Bengaluru (Bangalore)
5 - 13 yrs
Best in industry
databricks
skill iconPython
SQL
PySpark
Spark

Role Overview

We are seeking an experienced Data Engineer with a strong background in Databricks, Python, Spark/PySpark and SQL to design, develop, and optimize large-scale data processing applications. The ideal candidate will build scalable, high-performance data engineering solutions and ensure seamless data flow across cloud and on-premise platforms.

Key Responsibilities:

  • Design, develop, and maintain scalable data processing applications using Databricks, Python, and PySpark/Spark.
  • Write and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Collaborate with data engineers, data scientists, and other stakeholders to understand business requirements and deliver high-quality solutions.
  • Ensure data integrity, performance, and reliability across all data processing pipelines.
  • Perform data analysis and implement data validation to ensure high data quality.
  • Implement and manage CI/CD pipelines for automated testing, integration, and deployment.
  • Contribute to continuous improvement of data engineering processes and tools.
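A minimal sketch of the kind of Databricks/PySpark work described above, assuming a local Spark installation (on Databricks the spark session already exists); the dataset and column names are illustrative.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided as `spark`; locally we build one.
spark = SparkSession.builder.appName("orders-demo").getOrCreate()

orders = spark.createDataFrame(
    [("O1", "IN", 120.0), ("O2", "IN", 80.0), ("O3", "SG", 200.0)],
    ["order_id", "country", "amount"],
)

# DataFrame API transformation ...
by_country = orders.groupBy("country").agg(
    F.count("*").alias("orders"),
    F.round(F.sum("amount"), 2).alias("revenue"),
)

# ... and the equivalent Spark SQL, since many pipelines mix both styles.
orders.createOrReplaceTempView("orders")
sql_result = spark.sql(
    "SELECT country, COUNT(*) AS orders, ROUND(SUM(amount), 2) AS revenue "
    "FROM orders GROUP BY country"
)

by_country.show()
sql_result.show()
spark.stop()
```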

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proven hands-on experience with Databricks, along with strong expertise in Python, SQL, and Spark/PySpark.
  • Strong proficiency in SQL, including working with relational databases and writing optimized queries.
  • Solid programming experience in Python, including data processing and automation.


Read more
Financial Services

Financial Services

Agency job
via Jobdost by Saida Pathan
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 6 yrs
₹20L - ₹25L / yr
Project Management
SQL
JIRA
confluence

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy Bluecopa solutions for R2R, treasury, and reporting workflows.
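To make the ETL oversight bullet above concrete, here is a tiny, hypothetical extract-transform-load sketch in Python using pandas and the standard-library sqlite3; the file contents, columns, and target table are assumptions.

```python
import io
import sqlite3
import pandas as pd

# Extract: in reality this would be a client file feed or an API pull.
raw_csv = io.StringIO(
    "txn_id,booked_on,amount,currency\n"
    "T1,2024-04-01,1000.50,INR\n"
    "T2,2024-04-01,-250.00,INR\n"
    "T3,2024-04-02,,INR\n"
)
df = pd.read_csv(raw_csv, parse_dates=["booked_on"])

# Transform: basic cleansing and a derived debit/credit flag.
df = df.dropna(subset=["amount"]).copy()
df["direction"] = df["amount"].apply(lambda a: "credit" if a >= 0 else "debit")

# Load: write the curated data into a reporting table.
conn = sqlite3.connect(":memory:")
df.to_sql("curated_transactions", conn, index=False, if_exists="replace")

print(conn.execute(
    "SELECT direction, COUNT(*), SUM(amount) FROM curated_transactions GROUP BY direction"
).fetchall())
```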


Required Qualifications

  • Bachelor’s degree with a Commerce background / MBA Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success

Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications


Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture


Read more
Capace Software Private Limited
Bhopal, Bengaluru (Bangalore)
7 - 13 yrs
₹9L - ₹12L / yr
Android
skill iconAndroid Development
frontend
Backend testing
fintech
+16 more

Job Description - Technical Project Manager

Job Title: Technical Project Manager

Location: Bhopal / Bangalore (On-site)

Experience Required: 7+ Years

Industry: Fintech / SaaS / Software Development

Role Overview

We are looking for a Technical Project Manager (TPM) who can bridge the gap between management and developers. The TPM will manage Android, Frontend, and Backend teams, ensure smooth development processes, track progress, evaluate output quality, resolve technical issues, and deliver timely reports.

Key Responsibilities

Project & Team Management

  • Manage daily tasks for Android, Frontend, and Backend developers
  • Conduct daily stand-ups, weekly planning, and reviews
  • Track progress, identify blockers, and ensure timely delivery
  • Maintain sprint boards, task estimations, and timelines

Technical Requirement Translation

  • Convert business requirements into technical tasks
  • Communicate requirements clearly to developers
  • Create user stories, flow diagrams, and PRDs
  • Ensure requirements are understood and implemented correctly

Quality & Build Review

  • Validate build quality, UI/UX flow, functionality
  • Check API integrations, errors, performance issues
  • Ensure coding practices and architecture guidelines are followed
  • Perform preliminary QA before handover to testing or clients

Issue Resolution

  • Identify development issues early
  • Coordinate with developers to fix bugs
  • Escalate major issues to founders with clear insights

Reporting & Documentation

  • Daily/weekly reports to management
  • Sprint documentation, release notes
  • Maintain project documentation & version control processes

Cross-Team Communication

  • Act as the single point of contact for management
  • Align multiple tech teams with business goals
  • Coordinate with HR and operations for resource planning

Required Skills

  • Strong understanding of Android, Web (Frontend/React), Backend development flows
  • Knowledge of APIs, Git, CI/CD, basic testing
  • Experience with Agile/Scrum methodologies
  • Ability to review builds and suggest improvements
  • Strong documentation skills (Jira, Notion, Trello, Asana)
  • Excellent communication & leadership
  • Ability to handle pressure and multiple projects

Good to Have

  • Prior experience in Fintech projects
  • Basic knowledge of UI/UX
  • Experience in preparing FSD/BRD/PRD
  • QA experience or understanding of test cases

Salary Range: 9 to 12 LPA

Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
skill iconData Analytics
SQL
Relational Database (RDBMS)
skill iconJava
skill iconPython
+1 more

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have Strong SQL skills (queries, optimization, procedures, triggers)

· Must have Advanced Excel skills

· Should have 3+ years of relevant experience

· Should have Reporting + dashboard creation experience

· Should have Database development & maintenance experience

· Must have Strong communication for client interactions

· Should have Ability to work independently

· Willingness to work from client locations.

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency
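As a small, hypothetical example of the SQL reporting described above, the sketch below uses Python's sqlite3 (SQLite 3.25+ for window function support) to rank branches by monthly volume; the schema is invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE disbursements (branch TEXT, month TEXT, amount REAL);
INSERT INTO disbursements VALUES
  ('BLR-01', '2024-03', 1200.0), ('BLR-01', '2024-03', 800.0),
  ('MUM-02', '2024-03', 1500.0), ('PUN-03', '2024-03', 400.0);
""")

# RANK() is a window function; it runs over the grouped totals per month.
report_sql = """
SELECT branch,
       month,
       SUM(amount) AS total_amount,
       RANK() OVER (PARTITION BY month ORDER BY SUM(amount) DESC) AS rank_in_month
FROM disbursements
GROUP BY branch, month
ORDER BY month, rank_in_month;
"""

for row in conn.execute(report_sql):
    print(row)
```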


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations

 

Read more
Intellipro
Arthy R
Posted by Arthy R
Bengaluru (Bangalore), Chennai
3 - 7 yrs
₹10L - ₹18L / yr
Delphi
SQL

📢 Hiring: Delphi Developer – 6 Months Contract

Locations: Chennai & Bangalore | Immediate Joiners | Service-Based Project

We are hiring experienced Delphi Developers for a 6-month contractual role with a reputed service-based IT organization. Candidates with strong Delphi expertise who can contribute independently in a fast-paced environment are encouraged to apply.


🔧 Key Highlights

3–7 years of experience in software development

Strong hands-on experience in Delphi

Proficiency in SQL, ADO, and understanding of OOP, data structures, and design patterns

Exposure to JavaScript frameworks (Knockout/Angular) and modern UI concepts

Good communication, analytical, and problem-solving skills

Ability to work independently and multitask effectively

Preferred: Experience in Payments, Retail, EMV, C-Store, or Logistics domains


📍 Locations: Chennai & Bangalore

⏳ Contract Duration: 6 Months

🚀 Start Date: Immediate


Read more
Ladera Technology
Bengaluru (Bangalore)
7 - 11 yrs
₹10L - ₹22L / yr
skill iconJava
skill iconSpring Boot
Spring Security
APM
AWS Lambda
+9 more

Job Title: Software Developer (7-10 Years Experience)


Job Summary: We are seeking an experienced Software Developer with 7-10 years of hands-on development expertise in designing, building, and maintaining enterprise-level applications and scalable APIs.


Key Responsibilities:

• Design, develop, and maintain microservices based applications using the Spring framework.

• Build secure, scalable REST and SOAP web services.

• Implement API security protocols including OAuth, JWT, SSL/TLS, X.509 certificates, SAML, and mTLS.

• Develop and deploy applications by leveraging AWS services such as EC2, Lambda, API Gateway, SQS, S3, SNS.

• Work with Azure cloud services and OpenShift for deployment and orchestration.

• Integrate JMS/messaging systems and work with middleware technologies such as MQ.

• Utilize SQL and NoSQL databases, including MySQL, PostgreSQL, and DynamoDB.

• Work with Netflix Conductor or Zuul for orchestration and routing.

• Collaborate with cross functional teams to deliver robust solutions in an Agile setup.


Required Skills:

• Strong JAVA OOPS fundamentals.

• Strong proficiency in Spring Framework (Spring Boot, Spring Cloud, Spring Security).

• Solid experience in microservices architecture.

• Hands-on experience with the AWS cloud and OpenShift ecosystem.

• Familiarity with Azure services.

• Strong understanding of API security mechanisms.

• Expertise in building RESTful APIs.

• Experience working with SQL/NoSQL databases.

• Should have worked on integration with AppDynamics or similar APM tools

• Strong analytical and problem-solving skills.

Good to have skills:

• SOAP web services and graphQL

• Experience with JMS, messaging middleware, and MQ.


Qualifications:

• Bachelor’s or Master's degree in computer science or related field.

• 7-10 years of experience in backend development or full Stack development roles. 

Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Kochi (Cochin), Mumbai, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹17L / yr
Project Management
skill iconData Analytics
Program Management
SQL
Client Management
+7 more

Required Skills: Project Management, Data Analysis, SQL queries, Client Engagement

 

Criteria:

  • Must have 3+ years of project/program management experience in Financial Services/Banking/NBFC/Fintech companies only.
  • Hands-on proficiency in data analysis and SQL querying, with ability to work on large datasets
  • Ability to lead end-to-end implementation projects and manage cross-functional teams effectively.
  • Experience in process analysis, optimization, and mapping for operational efficiency.
  • Strong client-facing communication and stakeholder management capabilities.
  • Good expertise in financial operations processes and workflows with proven implementation experience.

 

Description

Position Overview:

We are seeking a dynamic and experienced Technical Program Manager to join our team. The successful candidate will be responsible for managing the implementation of company’s solutions at existing and new clients. This role requires a deep understanding of financial operation processes, exceptional problem-solving skills, and the ability to analyze large volumes of data. The Technical Program manager will drive process excellence and ensure outstanding customer satisfaction throughout the implementation lifecycle and beyond.

 

Key Responsibilities:

● Client Engagement: Serve as the primary point of contact for assigned clients, understanding their unique operation processes and requirements. Build and maintain strong relationships to facilitate successful implementations.

● Project Management: Lead the end-to-end implementation of company’s solutions, ensuring projects are delivered on time, within scope, and within budget. Coordinate with cross-functional teams to align resources and objectives.

● Process Analysis and Improvement: Evaluate clients' existing operation workflows, identify inefficiencies, and recommend optimized processes leveraging company’s platform. Utilize process mapping and data analysis to drive continuous improvement.

● Data Analysis: Analyze substantial datasets to ensure accurate configuration and integration of company’s solutions. Employ statistical tools and SQL-based queries to interpret data and provide actionable insights.

● Problem Solving: Break down complex problems into manageable components, developing effective solutions in collaboration with clients and internal teams.

● Process Excellence: Advocate for and implement best practices in process management, utilizing methodologies such as Lean Six Sigma to enhance operational efficiency.

● Customer Excellence: Ensure a superior customer experience by proactively addressing client needs, providing training and support, and promptly resolving any issues that arise.

 

Qualifications:

● Minimum of 3 years of experience in project management, preferably in financial services, software implementation, consulting or analytics.

● Strong analytical skills with experience in data analysis, SQL querying, and handling large datasets.

● Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.

● Demonstrated ability to lead cross-functional teams and manage multiple projects concurrently.

● Proven expertise in financial operation processes and related software solutions is a plus

● Proficiency in developing business intelligence solutions or with low-code tools is a plus

 

Why Join company?

● Opportunity to work with a cutting-edge financial technology company.

● Collaborative and innovative work environment.

● Competitive compensation and benefits package.

● Professional development and growth opportunities.

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
skill iconJava
skill iconSpring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience:4+ Years

Required Skills -

  1. 3+ years Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹40L / yr
Oracle
Oracle Data Integrator
Oracle ERP
Implementation
Process automation
+30 more

Review Criteria

  • Strong Oracle Integration Cloud (OIC) Implementation profile
  • 5+ years in enterprise integration / middleware roles, with minimum 3+ years of hands-on Oracle Integration Cloud (OIC) implementation experience
  • Strong experience designing and delivering integrations using OIC Integrations, Adapters (File, FTP, DB, SOAP/REST, Oracle ERP), Orchestrations, Mappings, Process Automation, Visual Builder (VBCS), and OIC Insight/Monitoring
  • Proven experience building integrations across Oracle Fusion/ERP/HCM, Salesforce, on-prem systems (AS/400, JDE), APIs, file feeds (FBDI/HDL), databases, and third-party SaaS.
  • Strong expertise in REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema, and web-service–based integrations
  • Good working knowledge of OCI components (API Gateway, Vault, Autonomous DB) and hybrid integration patterns
  • Strong SQL & PL/SQL skills for debugging, data manipulation, and integration troubleshooting
  • Hands-on experience owning end-to-end integration delivery including architecture reviews, deployments, versioning, CI/CD of OIC artifacts, automated testing, environment migrations (Dev→Test→Prod), integration governance, reusable patterns, error-handling frameworks, and observability using OIC/OCI monitoring & logging tools
  • Experience providing technical leadership, reviewing integration designs/code, and mentoring integration developers; must be comfortable driving RCA, performance tuning, and production issue resolution
  • Strong stakeholder management, communication (written + verbal), problem-solving, and ability to collaborate with business/product/architect teams

 

Preferred

  • Preferred (Certification) – Oracle OIC or Oracle Cloud certification
  • Preferred (Domain Exposure) – Experience with Oracle Fusion functional modules (Finance, SCM, HCM), business events/REST APIs, SOA/OSB background, or multi-tenant/API-governed integration environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Oracle Integration Cloud (OIC)?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

Company is seeking an experienced OIC Lead to own the design, development, and deployment of enterprise integrations. The ideal candidate will have at least 6+ years of prior experience in various integration technologies, with good experience implementing OIC integration capabilities. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.

 

Responsibilities:

  • Lead the design and delivery of integration solutions using Oracle Integration Cloud (Integration, Process Automation, Visual Builder, Insight) and related Oracle PaaS components.
  • Build and maintain integrations between Oracle Fusion/ERP/HCM, Salesforce, on-prem applications (e.g., AS/400, JDE), APIs, file feeds (FBDI/HDL), databases and third-party SaaS.
  • Own end-to-end integration delivery - from architecture/design reviews through deployment, monitoring, and post-production support.
  • Create reusable integration patterns, error-handling frameworks, security patterns (OAuth2, client credentials), and governance for APIs and integrations.
  • Own CI/CD, versioning and migration of OIC artifacts across environments (Dev → Test → Prod); implement automated tests and promotion pipelines.
  • Define integration architecture standards and reference patterns for hybrid (cloud/on-prem) deployments.
  • Ensure security, scalability, and fault tolerance are built into all integration designs.
  • Drive performance tuning, monitoring and incident response for integrations; implement observability using OIC/OCI monitoring and logging tools.
  • Provide technical leadership and mentorship to a team of integration developers; review designs and code; run hands-on troubleshooting and production support rotations.
  • Work with business stakeholders, product owners and solution architects to translate requirements into integration designs, data mappings and runbooks

 

Ideal Candidate

  • 5+ years in integration/enterprise middleware roles with at least 3+ years hands-on OIC (Oracle Integration Cloud) implementations.
  • Strong experience with OIC components: Integrations, Adapters (File, FTP, Database, SOAP, REST, Oracle ERP), Orchestrations/Maps, OIC Insight/Monitoring, Visual Builder (VBCS) or similar
  • Expert in web services and message formats: REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema
  • Good knowledge of Oracle Cloud stack / OCI (API Gateway, Vault, Autonomous DB) and on-prem integration patterns
  • SQL & PL/SQL skills for data manipulation and troubleshooting; exposure to FBDI/HDL (for bulk loads) is desirable
  • Strong problem-solving, stakeholder management, written/verbal communication and team mentoring experience

 

Nice-to-have / Preferred:

  • Oracle OIC certification(s) or Oracle Cloud certifications
  • Exposure to OCI services (API Gateway, Vault, Monitoring) and Autonomous Database
  • Experience with Oracle Fusion functional areas (Finance, Supply Chain, HCM) and business events/REST APIs preferred.
  • Background with SOA Suite/Oracle Service Bus (useful if migrating legacy SOA to OIC)
  • Experience designing multi-tenant integrations, rate limiting/throttling and API monetization strategies.


Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 10 yrs
₹20L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS
+21 more

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
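To ground the lakehouse concepts above, here is a minimal, engine-agnostic sketch (not Dremio-specific) that writes a partitioned Parquet dataset and reads it back with a partition filter using pandas/pyarrow; the local path and columns are illustrative stand-ins for an S3/ADLS curated layer.

```python
import tempfile
import pandas as pd

# Invented sales data standing in for a curated-layer table.
sales = pd.DataFrame({
    "region":  ["APJ", "APJ", "EMEA", "AMER"],
    "product": ["laptop", "monitor", "laptop", "dock"],
    "amount":  [1200.0, 300.0, 1100.0, 150.0],
})

with tempfile.TemporaryDirectory() as lake_path:
    # Curated layer: columnar files partitioned by region (hive-style directories).
    sales.to_parquet(lake_path, engine="pyarrow", partition_cols=["region"])

    # A query engine (Dremio, Trino, Spark, ...) can prune partitions on this filter;
    # here pyarrow applies the same filter when reading the dataset back.
    apj = pd.read_parquet(lake_path, engine="pyarrow", filters=[("region", "=", "APJ")])
    print(apj)
```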


Ideal Candidate

  • Bachelor’s or master’s in computer science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
Read more
Banking Industry

Banking Industry

Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
skill iconData Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?


As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency
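For illustration only, a small sketch of the report-automation idea above: pandas pivots an invented extract into a dashboard-ready summary; the column names are assumptions.

```python
import pandas as pd

# Invented extract standing in for a database pull.
extract = pd.DataFrame({
    "branch":  ["BLR", "BLR", "MUM", "PUN"],
    "product": ["loan", "card", "loan", "loan"],
    "amount":  [1200.0, 300.0, 900.0, 400.0],
})

# Pivot into a branch x product matrix that a dashboard or Excel report can consume.
summary = extract.pivot_table(
    index="branch", columns="product", values="amount",
    aggfunc="sum", fill_value=0, margins=True, margins_name="Total",
)
print(summary)

# Export for downstream reporting (CSV keeps the example dependency-free).
summary.to_csv("branch_product_summary.csv")
```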


What do we expect from you?


For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 plus years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations


Read more
ONEPOS RETAIL SOLUTIONS PVT LTD
Bengaluru (Bangalore)
8 - 12 yrs
₹15L - ₹18L / yr
POS
Payment gateways
Selenium
JIRA
API
+6 more

Role Overview


The Automation Lead for the Point of Sale (POS) business is responsible for driving end-to-end automation strategy, framework development, and quality governance across POS applications, devices, and integrations. This role ensures high-quality releases by designing scalable automation solutions tailored to payment systems, in-store hardware, peripherals, and complex retail workflows.


You will lead a team of automation engineers, collaborate closely with product, development, and operations teams, and play a key role in accelerating delivery through optimized test coverage and robust automation pipelines.


Key Responsibilities


1. Automation Strategy & Leadership

•          Define and own the automation roadmap for POS systems (frontend UI, backend services, device interactions).

•          Lead, mentor, and upskill a team of automation engineers.

•          Establish automation KPIs (coverage, stability, execution time) and ensure continuous improvement.

•          Identify opportunities to improve automation maturity across the POS ecosystem.


2. Framework Architecture & Development

•          Design and build scalable, reusable automation frameworks for web, mobile (iOS & Android), and device-level POS testing.

•          Integrate automation with CI/CD pipelines (Jenkins, GitHub Actions, Azure DevOps, etc.).

•          Implement best practices in coding standards, version control, and documentation.

•          Ensure automation solutions support multi-platform POS devices (payment terminals, printers, scanners, cash drawers, tablets).


3. Functional & Non-Functional Test Automation

•          Automate regression, smoke, and integration test suites for POS workflows (transactions, refunds, offline mode, sync, etc.).

•          Collaborate with performance and security teams to enable load, stress, and penetration testing automation.

•          Drive automation for API, UI, database, and hardware integration layers.
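A minimal, hypothetical sketch of the API-layer regression automation described above, written as a pytest-style test against a placeholder POS endpoint; the URL, payload, and response contract are assumptions.

```python
import requests

BASE_URL = "https://pos-staging.example.com/api/v1"   # placeholder test environment

def test_refund_is_accepted_and_reflected_in_transaction():
    """Smoke test: a refund request succeeds and the transaction shows the refunded amount."""
    refund_payload = {"transaction_id": "TXN-1001", "amount": 250.00, "reason": "damaged item"}

    refund_resp = requests.post(f"{BASE_URL}/refunds", json=refund_payload, timeout=10)
    assert refund_resp.status_code == 201
    refund_id = refund_resp.json()["refund_id"]

    txn_resp = requests.get(f"{BASE_URL}/transactions/TXN-1001", timeout=10)
    assert txn_resp.status_code == 200
    body = txn_resp.json()
    assert body["refunds"][0]["refund_id"] == refund_id
    assert body["refunded_amount"] == 250.00
```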


4. Quality Governance & Cross-Functional Collaboration

•          Work closely with product owners, business analysts, and developers to understand POS requirements.

•          Define test strategy, test plans, and automation coverage for each release.

•          Advocate for early testing, shift-left practices, and robust quality gates.

•          Manage defect triage and root cause analysis for automation-related issues.


5. POS Hardware & Integration Expertise

•          Ensure validation of POS peripherals (MSR, NCR, Verifone, barcode scanners, EMV payment terminals, printers).

•          Support automation for cloud-hosted and on-prem POS systems.

•          Collaborate with vendors on device certifications and compliance (PCI, EMV, L3, etc.).


Required Skills & Experience


Technical Skills

•          Strong experience in automation tools/frameworks:

•          Selenium, Appium, Playwright, Cypress, TestNG, Junit or similar

•          REST API automation (Postman/Newman, RestAssured, Karate, Swagger, etc.)

•          Python/Java/JavaScript/C# for automation scripting

•          Experience in retail/POS/fintech/payment systems.

•          Experience with CI/CD tools and version control (Git).

•          Knowledge of POS hardware and device interaction automation.

•          Good understanding of microservices architecture and system integrations.

•          Experience working with SQL for data validation and backend testing.

•          Experience with bug tracking tools like JIRA , Azure Devops.


Leadership & Soft Skills

•          8–12 years overall experience, with at least 1- 2 years in a lead or senior automation role.

•          Ability to lead distributed teams.

•          Strong problem-solving, debugging, and analytical skills.

•          Excellent communication and stakeholder management.

•          Ability to work in a fast-paced, release-driven retail environment.

Preferred Qualifications

•          Experience in cloud-based POS platforms (AWS/Azure/GCP).

•          Exposure to payment certification testing (EMV L2/L3, PCI).

•          Knowledge of performance testing tools (JMeter, k6).

•          Experience with containerization (Docker, Kubernetes).

•          ISTQB, CSTE or other QA/Automation certifications.


What You Will Drive

•          Faster releases through automation-first delivery.

•          Improved POS reliability across devices and store environments.

•          Highly stable regression suites enabling continuous deployment.

•          A culture of quality across the POS engineering organization.


Why Join Us?

  • Work on industry-leading POS and payment systems.
  • Collaborative, inclusive, and innovative team culture.
  • Competitive compensation and benefits package.
  • Opportunities for growth and learning in a dynamic environment.


Read more
Furrl
Sricharan KS
Posted by Sricharan KS
Bengaluru (Bangalore)
0 - 2 yrs
₹25000 - ₹35000 / mo
SQL

About Furrl

Furrl is a high scale discovery experience for new-age D2C brands. Furrl is breaking the clutter of over 100,000 such brands through a novel #Vibe-based discovery experience and attacking a USD 100 billion market. This asset-light platform is a global first-of-its-kind, and rapid growth and customer love have already demonstrated early product-market-fit.

We’re looking for a Product Intern who enjoys working with data and is curious about how digital products are built.

Your main job will be to help the Product team understand user behaviour, pull data, and share insights that improve the app.

This is perfect for students who enjoy problem-solving, numbers, and learning how product teams work in real life.


Location: HSR Layout, Bangalore 


Responsibilities

  • Run SQL queries to pull data
  • Work with the Product team to analyse trends and numbers
  • Create simple reports and dashboards
  • Help the team with product experiments (like A/B tests)
  • Support in documentation and basic product research
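To give a feel for the experiment-support work above, here is a tiny, self-contained sketch comparing conversion rates of two app variants with a two-proportion z-test; the numbers are made up.

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z statistic and two-sided p-value for a conversion-rate difference."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail of the normal distribution
    return z, p_value

# Made-up experiment results: variant B's new discovery feed vs. control A.
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2350)
print(f"control rate={120/2400:.3%}, variant rate={156/2350:.3%}, z={z:.2f}, p={p:.4f}")
```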

Requirements 

  • Basic SQL knowledge (very important)
  • Comfortable working with numbers
  • Curious, responsible, and eager to learn
  • Good communication skills
  • Any analytics tools (GA, Mixpanel, etc.) (Not mandatory, but good if you already know)
  • Previous project or mini-internship experience 

What to get excited about 

  • Be a part of a strong early team building a massive business.
  • Work directly with the CEO and other leadership team members, who are well-respected industry experts.
  • Get the rare chance to see a 0 to 1 journey and be a key member of that journey.
  • Accelerate your career with a rapid growth path within the organisation.
  • Strong possibility of PPO (Pre-Placement Offer) based on performance.



Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Bengaluru (Bangalore), Hyderabad, Mumbai, Gurugram
2 - 8 yrs
₹10L - ₹35L / yr
GCP
skill iconPython
SQL
Google Cloud Platform (GCP)

Responsibilities:

Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow)

Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views

Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration

Implement SQL-based transformations using Dataform (or dbt)

Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture

Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability

Partner with solution architects and product teams to translate data requirements into technical designs

Mentor junior data engineers and support knowledge-sharing across the team

Contribute to documentation, code reviews, sprint planning, and agile ceremonies
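As a hedged sketch of the Beam/Dataflow responsibility above, here is a minimal batch pipeline using the apache-beam Python SDK with the local DirectRunner; the element schema is invented, and on GCP the same code would run on Dataflow with the appropriate pipeline options.

```python
import apache_beam as beam

def run():
    # DirectRunner by default; on Dataflow you would pass --runner=DataflowRunner options.
    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "CreateEvents" >> beam.Create([
                {"country": "IN", "amount": 120.0},
                {"country": "IN", "amount": 80.0},
                {"country": "SG", "amount": 200.0},
            ])
            | "ToKeyValue" >> beam.Map(lambda e: (e["country"], e["amount"]))
            | "SumPerCountry" >> beam.CombinePerKey(sum)
            | "Print" >> beam.Map(print)
        )

if __name__ == "__main__":
    run()
```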

Requirements

2+ years of hands-on experience in data engineering, with at least 2 years on GCP

Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)

Strong programming skills in Python and/or Java

Experience with SQL optimization, data modeling, and pipeline orchestration

Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks

Exposure to Dataform, dbt, or similar tools for ELT workflows

Solid understanding of data architecture, schema design, and performance tuning

Excellent problem-solving and collaboration skills

Bonus Skills:

GCP Professional Data Engineer certification

Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures

Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)

Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune, Gurugram, Bhopal, Jaipur, Bengaluru (Bangalore)
2 - 4 yrs
₹5L - ₹12L / yr
Windows Azure
SQL
Data Structures
databricks

 Hiring: Azure Data Engineer

⭐ Experience: 2+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

Passport: Mandatory & Valid

(Only immediate joiners & candidates serving notice period)


Mandatory Skills:

Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.


Responsibilities:

  • Build and maintain data pipelines using ADF, Databricks, and Synapse.
  • Develop ETL/ELT workflows and optimize SQL queries.
  • Implement Delta Lake for scalable lakehouse architecture.
  • Create Synapse data models and Spark/Databricks notebooks.
  • Ensure data quality, performance, and security.
  • Collaborate with cross-functional teams on data requirements.
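A minimal sketch of the Delta Lake bullet above, assuming the delta-spark package is installed (on Databricks the session is already Delta-enabled); the path and schema are illustrative.

```python
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession, functions as F

# Local Delta-enabled session; on Databricks `spark` is provided pre-configured.
builder = (
    SparkSession.builder.appName("delta-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

events = spark.createDataFrame(
    [("2024-04-01", "IN", 120.0), ("2024-04-01", "SG", 200.0), ("2024-04-02", "IN", 80.0)],
    ["event_date", "country", "amount"],
)

delta_path = "/tmp/demo/events_delta"   # illustrative path; typically an ADLS abfss:// URI

# Write a partitioned Delta table (overwrite keeps the sketch idempotent).
(events.write.format("delta").mode("overwrite").partitionBy("event_date").save(delta_path))

# Read it back and aggregate, as a downstream Synapse/ADF step might.
daily = (spark.read.format("delta").load(delta_path)
              .groupBy("event_date")
              .agg(F.round(F.sum("amount"), 2).alias("revenue")))
daily.show()
spark.stop()
```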


Nice to Have:

Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).


Read more
lulu international

lulu international

Agency job
via Episeio Business Solutions by Praveen Saulam
Bengaluru (Bangalore)
2.5 - 3 yrs
₹7L - ₹9L / yr
SQL
PySpark
databricks
Hypothesis testing
ANOVA gauge R&R

Role Overview

As a Lead Data Scientist / Data Analyst, you’ll combine analytical thinking, business acumen, and technical expertise to design and deliver impactful data-driven solutions. You’ll lead analytical problem-solving for retail clients — from data exploration and visualisation to predictive modelling and actionable business insights.

 

Key Responsibilities

  • Partner with business stakeholders to understand problems and translate them into analytical solutions.
  • Lead end-to-end analytics projects — from hypothesis framing and data wrangling to insight delivery and model implementation.
  • Drive exploratory data analysis (EDA), identify patterns/trends, and derive meaningful business stories from data.
  • Design and implement statistical and machine learning models (e.g., segmentation, propensity, CLTV, price/promo optimisation).
  • Build and automate dashboards, KPI frameworks, and reports for ongoing business monitoring.
  • Collaborate with data engineering and product teams to deploy solutions in production environments.
  • Present complex analyses in a clear, business-oriented way, influencing decision-making across retail categories.
  • Promote an agile, experiment-driven approach to analytics delivery.

 

Common Use Cases You’ll Work On

  • Customer segmentation (RFM, mission-based, behavioural)
  • Price and promo effectiveness
  • Assortment and space optimisation
  • CLTV and churn prediction
  • Store performance analytics and benchmarking
  • Campaign measurement and targeting
  • Category in-depth reviews and presentation to L1 leadership team
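As an illustrative sketch of the RFM segmentation use case above, the snippet below scores an invented transactions table with pandas quantiles; the cut points and column names are assumptions rather than the client's actual definitions.

```python
import pandas as pd

# Invented transaction history.
txns = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "order_date": pd.to_datetime(
        ["2024-03-01", "2024-03-20", "2024-01-15", "2024-03-25", "2024-03-27", "2024-02-10"]
    ),
    "amount": [50.0, 80.0, 200.0, 20.0, 35.0, 15.0],
})

snapshot = pd.Timestamp("2024-04-01")

rfm = txns.groupby("customer_id").agg(
    recency_days=("order_date", lambda d: (snapshot - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)

# Score each dimension 1-3 (3 = best); low recency is good, high frequency/monetary is good.
rfm["r_score"] = pd.qcut(rfm["recency_days"], 3, labels=[3, 2, 1]).astype(int)
rfm["f_score"] = pd.qcut(rfm["frequency"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
rfm["m_score"] = pd.qcut(rfm["monetary"], 3, labels=[1, 2, 3]).astype(int)
rfm["segment"] = rfm["r_score"].astype(str) + rfm["f_score"].astype(str) + rfm["m_score"].astype(str)

print(rfm)
```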

 

Required Skills and Experience

  • 3+ years of experience in data science, analytics, or consulting (preferably in the retail domain)
  • Proven ability to connect business questions to analytical solutions and communicate insights effectively
  • Strong SQL skills for data manipulation and querying large datasets
  • Advanced Python for statistical analysis, machine learning, and data processing
  • Intermediate PySpark / Databricks skills for working with big data
  • Comfortable with data visualisation tools (Power BI, Tableau, or similar)
  • Knowledge of statistical techniques (Hypothesis testing, ANOVA, regression, A/B testing, etc.)
  • Familiarity with agile project management tools (JIRA, Trello, etc.)

 

Good to Have

  • Experience designing data pipelines or analytical workflows in cloud environments (Azure preferred)
  • Strong understanding of retail KPIs (sales, margin, penetration, conversion, ATV, UPT, etc.)
  • Prior exposure to Promotion or Pricing analytics 
  • Dashboard development or reporting automation expertise


Read more
Hyderabad, Bengaluru (Bangalore)
5 - 12 yrs
₹25L - ₹35L / yr
skill iconC#
SQL
skill iconAmazon Web Services (AWS)
skill icon.NET
skill iconJava
+3 more

Senior Software Engineer

Location: Hyderabad, India


Who We Are:

Since our inception back in 2006, Navitas has grown to be an industry leader in the digital transformation space, and we’ve served as trusted advisors supporting our client base within the commercial, federal, and state and local markets.


What We Do:

At our very core, we’re a group of problem solvers providing our award-winning technology solutions to drive digital acceleration for our customers! With proven solutions, award-winning technologies, and a team of expert problem solvers, Navitas has consistently empowered customers to use technology as a competitive advantage and deliver cutting-edge transformative solutions.


What You’ll Do:

Build, Innovate, and Own:

  • Design, develop, and maintain high-performance microservices in a modern .NET/C# environment.
  • Architect and optimize data pipelines and storage solutions that power our AI-driven products.
  • Collaborate closely with AI and data teams to bring machine learning models into production systems.
  • Build integrations with external services and APIs to enable scalable, interoperable solutions.
  • Ensure robust security, scalability, and observability across distributed systems.
  • Stay ahead of the curve — evaluating emerging technologies and contributing to architectural decisions for our next-gen platform.

Responsibilities will include but are not limited to:

  • Provide technical guidance and code reviews that raise the bar for quality and performance.
  • Help create a growth-minded engineering culture that encourages experimentation, learning, and accountability.

What You’ll Need:

  • Bachelor’s degree in Computer Science or equivalent practical experience.
  • 8+ years of professional experience, including 5+ years designing and maintaining scalable backend systems using C#/.NET and microservices architecture.
  • Strong experience with SQL and NoSQL data stores.
  • Solid hands-on knowledge of cloud platforms (AWS, GCP, or Azure).
  • Proven ability to design for performance, reliability, and security in data-intensive systems.
  • Excellent communication skills and ability to work effectively in a global, cross-functional environment.

Set Yourself Apart With:

  • Startup experience - specifically in building product from 0-1
  • Exposure to AI/ML-powered systems, data engineering, or large-scale data processing.
  • Experience in healthcare or fintech domains.
  • Familiarity with modern DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes).

Equal Employer/Veterans/Disabled

Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.

Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.

Wissen Technology

at Wissen Technology

4 recruiters
Nishita Bangera
Posted by Nishita Bangera
Bengaluru (Bangalore)
4 - 7 yrs
₹5L - ₹25L / yr
Python
Django
Flask
SQL
Amazon Web Services (AWS)
+1 more

🔧 Key Skills

  • Strong expertise in Python (3.x)
  • Experience with Django / Flask / FastAPI
  • Good understanding of Microservices & RESTful API development
  • Proficiency in MySQL/PostgreSQL – queries, stored procedures, optimization
  • Solid grip on Data Structures & Algorithms (DSA)
  • Comfortable working with Linux & Windows environments
  • Hands-on experience with Git, CI/CD (Jenkins/GitHub Actions)
  • Familiarity with Docker / Kubernetes is a plus
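For context, a minimal sketch of the kind of RESTful microservice work these skills point to, assuming FastAPI; the Order resource and its fields are illustrative only, and an in-memory dict stands in for MySQL/PostgreSQL.

```python
# Minimal FastAPI sketch: one REST resource with create and fetch endpoints.
# The Order model and routes are illustrative; run with `uvicorn main:app`.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    id: int
    item: str
    quantity: int

_ORDERS: dict[int, Order] = {}  # in-memory store standing in for a real database

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    _ORDERS[order.id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in _ORDERS:
        raise HTTPException(status_code=404, detail="Order not found")
    return _ORDERS[order_id]
```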


Sigmoid

at Sigmoid

1 video
4 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
3 - 5 yrs
Upto ₹25L / yr (Varies)
PySpark
SQL
Python
Windows Azure
Amazon Web Services (AWS)
+2 more

You will be responsible for building a highly scalable, extensible, and robust application. This position reports to the Engineering Manager.


Responsibilities:

  • Align Sigmoid with key Client initiatives
  • Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
  • Ability to understand business requirements and tie them to technology solutions
  • Open to working from the client location as per project/customer demands.
  • Facilitate technical aspects:
  • Develop and evolve highly scalable and fault-tolerant distributed components using Java technologies.
  • Excellent experience in Application development and support, integration development and quality assurance.
  • Provide technical leadership and manage it on a day-to-day basis
  • Stay up-to-date on the latest technology to ensure the greatest ROI for the customer & Sigmoid
  • Hands-on coder with a good understanding of enterprise-level code.
  • Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems
  • Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a Parallel Processing environment
  • Culture:
  • Must be a strategic thinker with the ability to think unconventionally / out-of-the-box.
  • Analytical and solution-driven orientation.
  • Raw intellect, talent, and energy are critical.
  • Entrepreneurial and agile: understands the demands of a private, high-growth company.
  • Ability to be both a leader and a hands-on "doer".

 

Qualifications:

  • 3–5 years of relevant work experience and a degree in Computer Science or a related technical discipline is required
  • Experience developing enterprise-scale applications and capability in building frameworks, design patterns, etc.; able to understand and tackle technical challenges and propose comprehensive solutions.
  • Experience with functional and object-oriented programming; Java (preferred) or Python is a must.
  • Hands-on knowledge of MapReduce, Hadoop, PySpark, HBase, and Elasticsearch.
  • Development and support experience in Big Data domain
  • Experience with database modelling and development, data mining and warehousing.
  • Unit, Integration and User Acceptance Testing.
  • Effective communication skills (both written and verbal)
  • Ability to collaborate with a diverse set of engineers, data scientists and product managers
  • Comfort in a fast-paced start-up environment.

 

Preferred Qualification:

  • Experience in Agile methodology.
  • Proficient with SQL and its variation among popular databases.
  • Experience working with large, complex data sets from a variety of sources.
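For illustration, a minimal PySpark sketch of the kind of batch transformation and summarization work described above; the input path, column names, and output location are hypothetical.

```python
# Minimal PySpark sketch: read raw events, summarize daily counts, write results.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-summary").getOrCreate()

events = spark.read.parquet("s3a://example-bucket/raw/events/")  # placeholder path

daily_summary = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

daily_summary.write.mode("overwrite").parquet("s3a://example-bucket/curated/daily_summary/")
spark.stop()
```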
Euphoric Thought Technologies
Sakshi Mittal
Posted by Sakshi Mittal
Bengaluru (Bangalore)
3 - 5 yrs
₹6L - ₹10L / yr
Automation
Selenium
Java
Manual testing
API
+5 more

Job Description for Automation QA

Key Responsibilities

  • Test web and mobile applications/services, ensuring they meet high-quality standards.
  • Conduct thorough testing of e-commerce platforms in the automobile domain (e.g., carwale.com, cars24.com).
  • Perform backend REST API testing, ensuring correct data in databases and debugging issues through logs, network responses, and database validations.
  • Collaborate with cross-functional teams (developers, product managers, DevOps) to define and execute comprehensive test plans and strategies.
  • Analyze and debug integration workflows, particularly with third-party services such as payment gateways and authentication providers.
  • Ensure exceptional frontend UI/UX quality with meticulous attention to detail.
  • Write, execute, and maintain detailed test cases based on user stories and business requirements.
  • Conduct regression, integration, and user acceptance testing (UAT) to validate product functionality.
  • Monitor and analyze test results, report defects, and collaborate with developers for resolution.
  • Use tools such as Postman, browser developer tools, and bug-tracking systems like JIRA effectively.
  • Coordinate testing activities across multiple releases and environments.
  • Facilitate test preparation, execution, and reporting while ensuring alignment with Agile frameworks.
  • Maintain and update test documentation following requirement changes.
  • Participate in daily stand-ups and sprint planning discussions, contributing to feature validation and delivery goals.
  • Monitor and triage issues in collaboration with cross-functional teams to resolve them efficiently.

Required Skills & Qualifications

  • 3+ years of experience in automation testing with hands-on exposure to web and backend testing, preferably in the e-commerce/automobile industry.
  • Strong proficiency in testing tools like Postman, browser developer tools, and bug-tracking systems.
  • Solid understanding of SQL, PostgreSQL, Python, or MongoDB for data verification.
  • Familiarity with asynchronous service communication (e.g., AWS SQS, Apache Kafka) and debugging issues therein.
  • Excellent knowledge of the software testing lifecycle (STLC) and Agile testing methodologies.
  • Experience with version control systems like Git.
  • Proven ability to debug issues in API integrations, logs, and databases.
  • Strong communication and documentation skills for reporting bugs and preparing detailed test reports.
  • Understanding of regression testing frameworks and expertise in functional and integration testing.

Additional Preferred Qualifications

  • Experience with mobile testing frameworks and tools.
  • Basic understanding of performance testing and debugging for optimized user experiences.
  • Exposure to automation tools (not mandatory but advantageous).
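For illustration, a minimal sketch of an automated backend REST API check of the kind described above, using pytest with the requests library; the base URL, resource, and payload are hypothetical placeholders.

```python
# Minimal API test sketch with pytest + requests.
# BASE_URL and the /vehicles resource are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"

def test_create_and_fetch_vehicle():
    payload = {"make": "Honda", "model": "City", "year": 2021}

    created = requests.post(f"{BASE_URL}/vehicles", json=payload, timeout=10)
    assert created.status_code == 201
    vehicle_id = created.json()["id"]

    fetched = requests.get(f"{BASE_URL}/vehicles/{vehicle_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["model"] == "City"
```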

Euphoric Thought Technologies
Bengaluru (Bangalore)
10 - 15 yrs
₹18L - ₹32L / yr
ASP.NET
MVC Framework
.NET Core
Angular (2+)
Entity Framework
+6 more

Knowledge / Skills / Abilities


  • Bachelor’s Degree or equivalent in Computer Science or a related numerate discipline
  • Strong application development experience within a Microsoft .NET based environment
  • Demonstrates good written and verbal communication skills in leading a small group of Developers and liaising with Business and IT stakeholders on agreed delivery commitments
  • Demonstrates capability of proactive end-to-end ownership of product delivery including delivery to estimates, robust hosting, smooth release management, and consistency in quality to ensure overall satisfied product enhancement experience for the Product Owner and the wider, global end-user community
  • Demonstrates good proficiency with the established (ageing) tech stack while also maintaining familiarity with newer technologies and market trends, which will be key to mutual success for the candidate and the organization in the context of ongoing Business and IT transformation.


Essential skills


  • Proficiency in C#, ASP.NET Webforms, MVC, WebAPI, jQuery, Angular, Entity Framework, SQL Server, XML, XSLT, JSON, .NET Core
  • Familiarity with WPF, WCF, SSIS, SSRS, Azure DevOps, Cloud Technologies (Azure)
  • Good analytical and communication skills
  • Proficiency with Agile and Waterfall methodologies


Desirable skills


Familiarity with hosting and server infrastructure

Required Experience: 10+ years

Vola Finance

at Vola Finance

1 video
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
4yrs+
Upto ₹20L / yr (Varies)
Python
FastAPI
RESTful APIs
GraphQL
Amazon Web Services (AWS)
+7 more

Python Backend Developer

We are seeking a skilled Python Backend Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be on developing server-side logic to ensure high performance and responsiveness to requests from the front end. You will also be responsible for integrating front-end elements built by your coworkers into the application, as well as managing AWS resources.


Roles & Responsibilities

  • Develop and maintain scalable, secure, and robust backend services using Python
  • Design and implement RESTful APIs and/or GraphQL endpoints
  • Integrate user-facing elements developed by front-end developers with server-side logic
  • Write reusable, testable, and efficient code
  • Optimize components for maximum performance and scalability
  • Collaborate with front-end developers, DevOps engineers, and other team members
  • Troubleshoot and debug applications
  • Implement data storage solutions (e.g., PostgreSQL, MySQL, MongoDB)
  • Ensure security and data protection

Mandatory Technical Skill Set

  • Implementing optimal data storage (e.g., PostgreSQL, MySQL, MongoDB, S3)
  • Python backend development experience
  • Design, implement, and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, or GitHub Actions
  • Implemented and managed containerization platforms such as Docker and orchestration tools like Kubernetes
  • Previous hands-on experience in:
  • EC2, S3, ECS, EMR, VPC, Subnets, SQS, CloudWatch, CloudTrail, Lambda, SageMaker, RDS, SES, SNS, IAM, S3, Backup, AWS WAF
  • SQL
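For illustration, a minimal sketch of S3-backed storage access with boto3, one of the AWS services listed above; the bucket and key names are placeholders, and credentials are assumed to come from the environment or an IAM role.

```python
# Minimal boto3 sketch: write and read back a JSON document in S3.
# Bucket and key names are placeholders.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-app-data"
KEY = "reports/2024-01-01/summary.json"

def put_report(report: dict) -> None:
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=json.dumps(report).encode("utf-8"))

def get_report() -> dict:
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    return json.loads(obj["Body"].read())

if __name__ == "__main__":
    put_report({"status": "ok", "records": 42})
    print(get_report())
```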
Corridor Platforms

at Corridor Platforms

3 recruiters
Aniket Agrawal
Posted by Aniket Agrawal
Bengaluru (Bangalore)
4 - 8 yrs
₹30L - ₹50L / yr
Python
PySpark
Apache Spark
NumPy
pandas
+8 more

About Corridor Platforms

Corridor Platforms is a leader in next-generation risk decisioning and responsible AI governance, empowering banks and lenders to build transparent, compliant, and data-driven solutions. Our platforms combine advanced analytics, real-time data integration, and GenAI to support complex financial decision workflows for regulated industries.

Role Overview

As a Backend Engineer at Corridor Platforms, you will:

  • Architect, develop, and maintain backend components for our Risk Decisioning Platform.
  • Build and orchestrate scalable backend services that automate, optimize, and monitor high-value credit and risk decisions in real time.
  • Integrate with ORM layers – such as SQLAlchemy – and multi RDBMS solutions (Postgres, MySQL, Oracle, MSSQL, etc) to ensure data integrity, scalability, and compliance.
  • Collaborate closely with Product Team, Data Scientists, QA Teams to create extensible APIs, workflow automation, and AI governance features.
  • Architect workflows for privacy, auditability, versioned traceability, and role-based access control, ensuring adherence to regulatory frameworks.
  • Take ownership from requirements to deployment, seeing your code deliver real impact in the lives of customers and end users.

Technical Skills

  • Languages: Python 3.9+, SQL, JavaScript/TypeScript, Angular
  • Frameworks: Flask, SQLAlchemy, Celery, Marshmallow, Apache Spark
  • Databases: PostgreSQL, Oracle, SQL Server, Redis
  • Tools: pytest, Docker, Git, Nx
  • Cloud: Experience with AWS, Azure, or GCP preferred
  • Monitoring: Familiarity with OpenTelemetry and logging frameworks
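For illustration, a minimal SQLAlchemy sketch of the ORM-backed persistence work referenced above; the Decision model and the in-memory SQLite URL are illustrative stand-ins (assuming the SQLAlchemy 2.x API), not the platform's actual schema.

```python
# Minimal SQLAlchemy 2.x sketch: a declarative model plus a session round-trip.
# The Decision table and SQLite URL are illustrative stand-ins.
from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class Decision(Base):
    __tablename__ = "decisions"
    id: Mapped[int] = mapped_column(primary_key=True)
    applicant_id: Mapped[str] = mapped_column(String(64))
    outcome: Mapped[str] = mapped_column(String(16))

engine = create_engine("sqlite:///:memory:")  # swap for Postgres/Oracle/etc. in practice
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Decision(applicant_id="A-1001", outcome="approved"))
    session.commit()
    row = session.scalar(select(Decision).where(Decision.applicant_id == "A-1001"))
    print(row.outcome)
```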


Why Join Us?

  • Cutting-Edge Tech: Work hands-on with the latest AI, cloud-native workflows, and big data tools—all within a single compliant platform.
  • End-to-End Impact: Contribute to mission-critical backend systems, from core data models to live production decision services.
  • Innovation at Scale: Engineer solutions that process vast data volumes, helping financial institutions innovate safely and effectively.
  • Mission-Driven: Join a passionate team advancing fair, transparent, and compliant risk decisioning at the forefront of fintech and AI governance.

What We’re Looking For

  • Proficiency in Python, SQLAlchemy (or similar ORM), and SQL databases.
  • Experience developing and maintaining scalable backend services, including API, data orchestration, ML workflows,  and workflow automation.
  • Solid understanding of data modeling, distributed systems, and backend architecture for regulated environments.
  • Curiosity and drive to work at the intersection of AI/ML, fintech, and regulatory technology.
  • Experience mentoring and guiding junior developers.


Ready to build backends that shape the future of decision intelligence and responsible AI?

Apply now and become part of the innovation at Corridor Platforms!



Albert Invent

at Albert Invent

4 candid answers
3 recruiters
Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
7 - 9 yrs
Upto ₹40L / yr (Varies)
NodeJS (Node.js)
SQL
MySQL
Amazon Web Services (AWS)
Windows Azure
+2 more

To design, develop, and maintain highly scalable, secure, and efficient backend systems that power core business applications. The Senior Engineer – Backend will be responsible for architecting APIs, optimizing data flow, and ensuring system reliability and performance. This role will collaborate closely with frontend, DevOps, and product teams to deliver robust solutions that enable seamless user experiences and support organizational growth through clean, maintainable, and well-tested code.


Responsibilities:

  • Design, develop, and maintain robust and scalable backend services using Node.js.
  • Collaborate with front-end developers and product managers to define and implement API specifications.
  • Optimize application performance and scalability by identifying bottlenecks and proposing solutions.
  • Write clean, maintainable, and efficient code, and conduct code reviews to ensure quality standards.
  • Develop unit tests and maintain code coverage to ensure high quality.
  • Document architectural solutions and system designs to ensure clarity and maintainability.
  • Troubleshoot and resolve issues in development, testing, and production environments.
  • Stay up to date with emerging technologies and industry trends to continuously improve our tech stack.
  • Mentor and guide junior engineers, fostering a culture of learning and growth.


Key Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 7+ years of experience in backend development with a focus on Node.js and JavaScript.
  • Strong understanding of RESTful APIs and microservices architecture.
  • Proficiency in database technologies (SQL and NoSQL, such as DynamoDB, MongoDB, PostgreSQL, etc.).
  • Familiarity with containerization and orchestration technologies (Docker, Kubernetes).
  • Knowledge of cloud platforms (AWS) and deployment best practices.
  • Excellent problem-solving skills and the ability to work in a fast-paced environment.
  • Strong communication and teamwork skills.


Good to have:

  • Experience with front-end frameworks (e.g., Angular, React, Vue.js).
  • Understanding of HTML, CSS, and JavaScript.
  • Familiarity with responsive design and user experience principles.


Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Bengaluru (Bangalore)
3 - 5 yrs
₹5L - ₹20L / yr
Automation
Manual testing
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
SQL
+4 more

🚀 Hiring: QA Engineer (Manual + Automation)

⭐ Experience: 3+ Years

📍 Location: Bangalore

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


💫 About the Role:

We’re looking for a skilled QA Engineer. You’ll ensure product quality through manual and automated testing across web, mobile, and APIs — working with tools and technologies like Postman, Playwright, Appium, Rest Assured, GCP/AWS, and React/Next.js.


Key Responsibilities:

✅ Develop & maintain automated tests using Cucumber, Playwright, Pytest, etc.

✅ Perform API testing using Postman.

✅ Work on cloud platforms (GCP/AWS) and CI/CD (Jenkins).

✅ Test web & mobile apps (Appium, BrowserStack, LambdaTest).

✅ Collaborate with developers to ensure seamless releases.


Must-Have Skills:

✅ API Testing (Postman)

✅ Cloud (GCP / AWS)

✅ Frontend understanding (React / Next.js)

✅ Strong SQL & Git skills

✅ Familiarity with OpenAI APIs
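For illustration, a minimal Playwright (Python, sync API) smoke test of the kind this role involves; the URL and link text are placeholders.

```python
# Minimal Playwright sketch (sync API): open a page and follow a link.
# The URL and link text are placeholders.
from playwright.sync_api import sync_playwright

def test_homepage_smoke():
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com")      # placeholder URL
        assert "Example" in page.title()
        page.click("text=More information")   # placeholder link text
        browser.close()
```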


Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
.NET
C#
ASP.NET
SQL
Amazon Web Services (AWS)

Company Name – Wissen Technology

Location :  Pune / Bangalore / Mumbai (Based on candidate preference)

Work mode: Hybrid 

Experience: 5+ years


Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.


Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Education

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.


SaaSExplorers
Bengaluru (Bangalore)
8 - 12 yrs
₹12L - ₹18L / yr
SQL
SSJS
SFMC
Marketing Cloud
Marketing Automation
+3 more

We’re seeking a highly experienced and certified Salesforce Marketing Cloud (SFMC) Subject Matter Expert (SME) to lead strategic initiatives and bridge the gap between business stakeholders and technical teams. This role requires strong communication skills and the ability to translate business needs into scalable solutions. Experience with the financial services industry in the Asian market is an added advantage.


Key Responsibilities:


  • Serve as the primary liaison between marketing stakeholders and technical teams to ensure seamless campaign execution.
  • Architect, guide, and support scalable solutions using SFMC modules like Journey Builder, Email Studio, Automation Studio, and Contact Builder.
  • Translate marketing goals into technical specifications and actionable workflows.
  • Lead integration efforts with CRM systems, data platforms, and third-party platforms.
  • Design and optimize customer journeys, segmentation strategies, and real-time personalization.
  • Ensure data governance, privacy compliance, and platform security best practices.
  • Conduct workshops, demos, and training sessions to drive platform adoption and maturity.
  • Stay ahead of Salesforce releases and innovations, especially within Marketing Cloud and Einstein AI.


Requirements:

  • 8–12 years of hands-on experience in Salesforce, with a strong focus on Marketing Cloud.
  • Salesforce certifications such as Marketing Cloud Consultant, Email Specialist, or Architect.
  • Strong understanding of marketing automation, customer journeys, and campaign analytics.
  • Proficiency in AMPscript, SQL, SSJS, and data modeling within SFMC.
  • Experience with API integrations, SDKs, and event tracking across web and mobile platforms.
  • Familiarity with tools like Jira, Confluence, Tableau CRM, and CDP platforms is a plus.
  • Familiarity with other similar marketing platforms like MoEngage is an added advantage.

