SQL Jobs in Bangalore (Bengaluru)


Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Ganit Business Solutions

at Ganit Business Solutions

3 recruiters
Agency job
via hirezyai by HR Hirezyai
Bengaluru (Bangalore), Chennai, Mumbai
5.5 - 12 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
PySpark
SQL

Roles & Responsibilities

  • Data Engineering Excellence: Design and implement data pipelines using formats like JSON, Parquet, CSV, and ORC, utilizing batch and streaming ingestion.
  • Cloud Data Migration Leadership: Lead cloud migration projects, developing scalable Spark pipelines.
  • Medallion Architecture: Implement Bronze, Silver, and Gold tables for scalable data systems.
  • Spark Code Optimization: Optimize Spark code to ensure efficient cloud migration.
  • Data Modeling: Develop and maintain data models with strong governance practices.
  • Data Cataloging & Quality: Implement cataloging strategies with Unity Catalog to maintain high-quality data.
  • Delta Live Tables Leadership: Lead the design and implementation of Delta Live Tables (DLT) pipelines for reliable, automated data pipeline management.
  • Customer Collaboration: Collaborate with clients to optimize cloud migrations and ensure best practices in design and governance.
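A Medallion (Bronze → Silver → Gold) flow like the one described can be sketched with plain Python standing in for Spark/Delta (illustrative only; in practice these layers would be Spark DataFrames or Delta tables, and the data below is invented):

```python
import json

# Bronze: raw records ingested as-is (e.g., from JSON batch files).
bronze = [
    '{"order_id": 1, "amount": "120.50", "country": "IN"}',
    '{"order_id": 2, "amount": "80.00", "country": "IN"}',
    '{"order_id": 2, "amount": "80.00", "country": "IN"}',  # duplicate
    '{"order_id": 3, "amount": "bad", "country": "US"}',    # malformed amount
]

def to_silver(raw_rows):
    """Silver: parse, validate, and deduplicate the raw records."""
    seen, silver = set(), []
    for row in raw_rows:
        rec = json.loads(row)
        try:
            rec["amount"] = float(rec["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed records
        if rec["order_id"] in seen:
            continue
        seen.add(rec["order_id"])
        silver.append(rec)
    return silver

def to_gold(silver_rows):
    """Gold: business-level aggregate (revenue per country)."""
    gold = {}
    for rec in silver_rows:
        gold[rec["country"]] = gold.get(rec["country"], 0.0) + rec["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 200.5}
```

The same shape carries over to Spark: Bronze keeps everything for replay, Silver enforces schema and uniqueness, Gold serves analytics.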

Qualifications

  • Experience: Minimum 5 years of hands-on experience in data engineering, with a proven track record in complex pipeline development and cloud-based data migration projects.
  • Education: Bachelor’s or higher degree in Computer Science, Data Engineering, or a related field.
  • Skills (must-have): Proficiency in Spark, SQL, Python, and other relevant data processing technologies. Strong knowledge of Databricks and its components, including Delta Live Tables (DLT) pipeline implementations. Expertise in on-premises-to-cloud Spark code optimization and Medallion Architecture.

Good to Have

  • Familiarity with AWS services (experience with additional cloud platforms like GCP or Azure is a plus).

Soft Skills

  • Excellent communication and collaboration skills, with the ability to work effectively with clients and internal teams.

Certifications

  • AWS/GCP/Azure Data Engineer Certification.


LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 4 yrs
₹6L - ₹12L / yr
Python
Postman
API
SQL

Company Description


eShipz is a rapidly expanding logistics automation platform designed to optimize shipping operations and enhance post-purchase customer experiences. The platform offers solutions such as multi-carrier integrations, real-time tracking, NDR management, returns, freight audits, and more. Trusted by over 350 businesses, eShipz provides easy-to-use analytics, automated shipping processes, and reliable customer support. As a trusted partner for eCommerce businesses and enterprises, eShipz delivers smarter, more efficient shipping solutions. Visit www.eshipz.com for more information.


Role Description



The Python Support Engineer role at eShipz requires supporting clients by providing technical solutions and resolving issues related to the platform. Responsibilities include troubleshooting reported problems, delivering technical support in a professional manner, and assisting with software functionality and operating systems. The engineer will also collaborate with internal teams to ensure a seamless customer experience. This is a full-time on-site role located in Sanjay Nagar, Greater Bengaluru Area.


Qualifications

  • Strong proficiency in Troubleshooting and Technical Support skills to identify and address software or technical challenges effectively.
  • Capability to provide professional Customer Support and Customer Service, ensuring high customer satisfaction and resolving inquiries promptly.
  • Proficiency and knowledge of Operating Systems to diagnose and resolve platform-specific issues efficiently.
  • Excellent problem-solving, communication, and interpersonal skills.
  • Bachelor's degree in computer science, IT, or a related field.
  • Experience working with Python and an understanding of backend systems is a plus.


  • Technical Skills:
  • Python Proficiency: Strong understanding of core Python (Data structures, decorators, generators, and exception handling).
  • Frameworks: Familiarity with web frameworks like Django, Flask, or FastAPI.
  • Databases: Proficiency in SQL (PostgreSQL/MySQL) and understanding of ORMs like SQLAlchemy or Django ORM.
  • Infrastructure: Basic knowledge of Linux/Unix commands, Docker, and CI/CD pipelines (Jenkins/GitHub Actions).
  • Version Control: Comfortable using Git for branching, merging, and pull requests.
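The core-Python areas named above (data structures, decorators, generators, exception handling) are exactly what a support engineer reaches for when triaging issues; a minimal, self-contained illustration (the flaky call is simulated):

```python
import functools
import time

def retry(attempts=3, delay=0.0):
    """Decorator: retry a flaky call, re-raising after the last attempt."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except ConnectionError:
                    if i == attempts - 1:
                        raise
                    time.sleep(delay)
        return inner
    return wrap

def read_errors(lines):
    """Generator: stream log lines lazily instead of loading them all."""
    for line in lines:
        if line.startswith("ERROR"):
            yield line.strip()

calls = {"n": 0}

@retry(attempts=3)
def fetch_status():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")  # fails twice, then succeeds
    return "ok"

print(fetch_status())                                # ok (after 2 retries)
print(list(read_errors(["INFO a\n", "ERROR b\n"])))  # ['ERROR b']
```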


  • Soft Skills:
  • Analytical Thinking: A logical approach to solving complex, "needle-in-a-haystack" problems.
  • Communication: Ability to explain technical concepts to both developers and end-users.
  • Patience & Empathy: Managing high-pressure situations when critical systems are down.


  • Work Location: Sanjay Nagar, Bangalore (WFO)


  • Work Timing:

  • Mon - Fri (WFO) (9:45 A.M. - 6:15 P.M.)
  • 1st & 3rd SAT (WFO)(9:00 A.M. - 2:00 P.M.)
  • 2nd & 4th SAT (WFH)(9:00 A.M. - 2:00 P.M.)



Codemonk

at Codemonk

4 candid answers
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
7yrs+
Up to ₹42L / yr (varies)
NodeJS (Node.js)
Python
Google Cloud Platform (GCP)
RESTful APIs
SQL
+4 more

Like us, you'll be deeply committed to delivering impactful outcomes for customers.

  • 7+ years of demonstrated ability to develop resilient, high-performance, and scalable code tailored to application usage demands.
  • Ability to lead by example with hands-on development while managing project timelines and deliverables. Experience in agile methodologies and practices, including sprint planning and execution, to drive team performance and project success.
  • Deep expertise in Node.js, with experience in building and maintaining complex, production-grade RESTful APIs and backend services.
  • Experience writing batch/cron jobs using Python and Shell scripting.
  • Experience in web application development using JavaScript and JavaScript libraries.
  • Basic understanding of TypeScript, JavaScript, HTML, CSS, JSON, and REST-based applications.
  • Experience/familiarity with RDBMS and NoSQL database technologies like MySQL, MongoDB, Redis, and Elasticsearch.
  • Understanding of code versioning tools such as Git.
  • Understanding of building applications deployed on the cloud using Google Cloud Platform (GCP) or Amazon Web Services (AWS).
  • Experience in JS-based build/package tools like Grunt, Gulp, Bower, and Webpack.
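The batch/cron work mentioned above usually boils down to a unit of work wrapped with logging and a meaningful exit code for the scheduler; a sketch (job name, records, and rules are illustrative):

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("nightly_batch")

def process_batch(records):
    """Hypothetical unit of work: count the non-empty payloads handled."""
    handled = 0
    for rec in records:
        if rec:  # skip empty payloads
            handled += 1
    return handled

def main(records):
    """Run one batch; return an exit code that cron/monitoring can act on."""
    started = datetime.now(timezone.utc)
    try:
        count = process_batch(records)
    except Exception:
        log.exception("batch failed")
        return 1  # non-zero so the scheduler can alert
    log.info("processed %d records in %s", count, datetime.now(timezone.utc) - started)
    return 0

exit_code = main(["order-1", "", "order-2"])
print("exit code:", exit_code)  # exit code: 0
```

Invoked from a crontab entry, the non-zero return on failure is what lets alerting distinguish a bad night from a quiet one.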
Service Co


Agency job
via Vikash Technologies by Rishika Teja
Bengaluru (Bangalore)
8 - 13 yrs
₹15L - ₹30L / yr
Python
PySpark
SQL
CI/CD
databricks
+1 more

• Strong programming skills in Python and PySpark for large-scale data processing.

• Proficiency in SQL for data manipulation, analysis, and performance tuning.

• Experience with Dataform, Dataproc, and BigQuery for data pipeline development and orchestration.

• Hands-on experience with Kafka and Confluent for real-time data streaming.

• Knowledge of Cloud Scheduler and Dataflow for automation and workflow management.

• Familiarity with DBT, Machine Learning, and AI concepts is an advantage.

• Understanding of Data Governance principles and implementation practices.

• Experience using Git for version control and CI/CD pipelines for automated deployments.

• Working knowledge of Infrastructure as Code (IaC) for cloud resource management and automation.
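SQL performance tuning of the kind listed above can be demonstrated with the standard-library sqlite3 module standing in for BigQuery or PostgreSQL: the same filter goes from a full table scan to an index seek once an index exists (table and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, "click" if i % 2 else "view", i) for i in range(10_000)],
)

# Without an index this filter is a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = 7"
).fetchall()
print(plan)  # plan detail mentions a SCAN of events

# Adding an index lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = 7"
).fetchall()
print(plan)  # plan detail now mentions idx_events_user

count = conn.execute("SELECT COUNT(*) FROM events WHERE user_id = 7").fetchone()[0]
print(count)  # 100
```

The same habit applies at warehouse scale: read the query plan first, then change the schema or the query.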

Appsforbharat
Pooja V
Posted by Pooja V
Bengaluru (Bangalore)
6 - 13 yrs
₹30L - ₹40L / yr
Go Programming (Golang)
Python
Amazon Web Services (AWS)
SQL

About the role


We are seeking a seasoned Backend Tech Lead with deep expertise in Golang and Python to lead our backend team. The ideal candidate has 6+ years of experience in backend technologies and 2–3 years of proven engineering mentoring experience, having successfully scaled systems and shipped B2C applications in collaboration with product teams.

Responsibilities

Technical & Product Delivery

● Oversee design and development of backend systems operating at 10K+ RPM scale.

● Guide the team in building transactional systems (payments, orders, etc.) and behavioral systems (analytics, personalization, engagement tracking).

● Partner with product managers to scope, prioritize, and release B2C product features and applications.

● Ensure architectural best practices, high-quality code standards, and robust testing practices.

● Own delivery of projects end-to-end with a focus on scalability, reliability, and business impact.

Operational Excellence

● Champion observability, monitoring, and reliability across backend services.

● Continuously improve system performance, scalability, and resilience.

● Streamline development workflows and engineering processes for speed and quality.
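Observability at this level often starts with something as small as a latency recorder; a self-contained sketch (in production the samples would feed Prometheus/Grafana rather than an in-process dict, and the endpoint name is illustrative):

```python
import time
from collections import defaultdict
from contextlib import contextmanager

# In production these samples would be exported to a metrics backend;
# an in-process dict keeps the sketch self-contained.
latencies_ms = defaultdict(list)

@contextmanager
def track(endpoint):
    """Record wall-clock latency, in ms, for one request to `endpoint`."""
    start = time.perf_counter()
    try:
        yield
    finally:
        latencies_ms[endpoint].append((time.perf_counter() - start) * 1000)

def p95(samples):
    """Rough p95: the value 95% of samples fall at or under."""
    ordered = sorted(samples)
    return ordered[max(0, int(len(ordered) * 0.95) - 1)]

for _ in range(20):
    with track("/orders"):
        time.sleep(0.001)  # stand-in for real handler work

print(len(latencies_ms["/orders"]))  # 20
print(p95(latencies_ms["/orders"]))  # around 1 ms on most machines
```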

Requirements

Experience:

● 7+ years of professional experience in backend technologies.

● 2–3 years as a tech lead, driving delivery.

Technical Skills:

● Strong hands-on expertise in Golang and Python.

● Proven track record with high-scale systems (≥10K RPM).

● Solid understanding of distributed systems, APIs, SQL/NoSQL databases, and cloud platforms.

Leadership Skills:

● Demonstrated success in managing teams through 2–3 appraisal cycles.

● Strong experience working with product managers to deliver consumer-facing applications.

● Excellent communication and stakeholder management abilities.

Nice-to-Have

● Familiarity with containerization and orchestration (Docker, Kubernetes).

● Experience with observability tools (Prometheus, Grafana, OpenTelemetry).

● Previous leadership experience in B2C product companies operating at scale.

What We Offer

● Opportunity to lead and shape a backend engineering team building at scale.

● A culture of ownership, innovation, and continuous learning.

● Competitive compensation, benefits, and career growth opportunities.

Global IT Consulting company


Agency job
via AccioJob by AccioJobHiring Board
Bengaluru (Bangalore)
0 - 1 yrs
₹11.1L - ₹11.1L / yr
DSA
SQL
Object Oriented Programming (OOPs)

AccioJob is conducting a Walk-In Hiring Drive with a Global IT Consulting company for the position of Software Engineer.


To apply, register and select your slot here: https://go.acciojob.com/buAvsp


Required Skills: DSA, SQL, OOPS


Eligibility:

  • Degree: BTech./BE
  • Branch: Computer Science/CSE/Other CS related branch, IT
  • Graduation Year: 2024, 2025


Work Details:

  • Work Location: Bangalore (Onsite)
  • CTC: 11.1 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Bangalore Centre, AccioJob Chennai Centre, AccioJob Hyderabad Centre, AccioJob Noida Centre, AccioJob Pune Centre


Further Rounds (for shortlisted candidates only):

Profile Evaluation, Coding Assignment, Technical Interview 1, Technical Interview 2, Technical Interview 3


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/buAvsp


Fast slot booking: download the AccioJob app at https://go.acciojob.com/79ra5s

snabbit


Agency job
via AccioJob by AccioJobHiring Board
Bengaluru (Bangalore)
0 - 1 yrs
₹12L - ₹15L / yr
DSA
Git
Python
SQL

AccioJob is conducting a Walk-In Hiring Drive with Snabbit for the position of Java Full Stack Developer.


To apply, register and select your slot here: https://go.acciojob.com/BsqNWc


Required Skills: DSA, Git, Python, SQL


Eligibility:

  • Degree: BTech./BE, MTech./ME, MCA, BCA
  • Branch: All
  • Graduation Year: 2025, 2026


Work Details:

  • Work Location: Bangalore (Onsite)
  • CTC: ₹12 LPA to ₹15 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Bangalore Centre

Further Rounds (for shortlisted candidates only):

Resume Evaluation, Technical Interview 1, Technical Interview 2, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/BsqNWc


Fast slot booking: download the AccioJob app at https://go.acciojob.com/6dZYBw

Bidgely

at Bidgely

4 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
6yrs+
Up to ₹65L / yr (varies)
Java
Spring Boot
SQL
NOSQL Databases
Amazon Web Services (AWS)

Lead Software Engineer

Bidgely is seeking an exceptional and visionary Lead Software Engineer to join its core team in Bangalore. As a Lead Software Engineer, you will work closely with EMs and org heads to shape the roadmap and planning, set the technical direction for the team, influence architectural decisions, and mentor other engineers while delivering highly reliable, scalable products powered by large data sets, advanced machine learning models, and responsive user interfaces. Renowned for your deep technical expertise, you are capable of deconstructing any system, solving complex problems creatively, and elevating those around you. Join our innovative and dynamic team that thrives on creativity, technical excellence, and a belief that nothing is impossible with collaboration and hard work.


Responsibilities

  • Lead the design and delivery of complex, scalable web services, APIs, and backend data modules.
  • Define and drive adoption of best practices in system architecture, component reusability, and software design patterns across teams.
  • Provide technical leadership in product, architectural, and strategic engineering discussions.
  • Mentor and guide engineers at all levels, fostering a culture of learning and growth.
  • Collaborate with cross-functional teams (engineering, product management, data science, and UX) to translate business requirements into scalable, maintainable solutions.
  • Champion and drive continuous improvement initiatives for code quality, performance, security, and reliability.
  • Evaluate and implement emerging technologies, tools, and methodologies to ensure competitive advantage.
  • Present technical concepts and results clearly to both technical and non-technical stakeholders; influence organizational direction and recommend key technical investments.


Requirements

  • 6+ years of experience in designing and developing highly scalable backend and middle tier systems.
  • BS/MS/PhD in Computer Science or a related field from a leading institution.
  • Demonstrated mastery of data structures, algorithms, and system design; experience architecting large-scale distributed systems and leading significant engineering projects.
  • Deep fluency in Java, Spring, Hibernate, J2EE, RESTful services; expertise in at least one additional backend language/framework.
  • Strong hands-on experience with both SQL (e.g., MySQL, PostgreSQL) and NoSQL (e.g., MongoDB, Cassandra, Redis) databases, including schema design, optimization, and performance tuning for large data sets.
  • Experience with Distributed Systems, Cloud Architectures, CI/CD, and DevOps principles.
  • Strong leadership, mentoring, and communication skills; proven ability to drive technical vision and alignment across teams.
  • Track record of delivering solutions in fast-paced and dynamic start-up environments.
  • Commitment to quality, attention to detail, and a passion for coaching others.
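Architecting large-scale distributed systems, as asked for here, frequently involves consistent hashing to spread keys across nodes with minimal reshuffling when the cluster grows. A sketch in Python for brevity (the listing's stack is Java); class and node names are illustrative:

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Map keys to nodes so that adding a node moves only ~1/N of the keys."""

    def __init__(self, nodes, vnodes=100):
        self._ring = []  # sorted list of (hash, node) virtual-node entries
        for node in nodes:
            self.add(node, vnodes)

    @staticmethod
    def _hash(value):
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def add(self, node, vnodes=100):
        for i in range(vnodes):
            bisect.insort(self._ring, (self._hash(f"{node}#{i}"), node))

    def node_for(self, key):
        # First virtual node clockwise from the key's hash (wrapping around).
        h = self._hash(key)
        idx = bisect.bisect(self._ring, (h, "")) % len(self._ring)
        return self._ring[idx][1]

ring = ConsistentHashRing(["db-a", "db-b", "db-c"])
before = {k: ring.node_for(k) for k in (f"user:{i}" for i in range(1000))}
ring.add("db-d")
after = {k: ring.node_for(k) for k in before}
moved = sum(before[k] != after[k] for k in before)
print(moved)  # roughly a quarter of the keys, far fewer than a full reshuffle
```

The point of the technique: keys that move all move to the new node; nothing shuffles between existing nodes.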


Vola Finance

at Vola Finance

1 video
2 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
4yrs+
Up to ₹20L / yr (varies)
Python
FastAPI
RESTful APIs
GraphQL
Amazon Web Services (AWS)
+7 more

Python Backend Developer

We are seeking a skilled Python Backend Developer responsible for managing the interchange of data between the server and the users. Your primary focus will be on developing server-side logic to ensure high performance and responsiveness to requests from the front end. You will also be responsible for integrating front-end elements built by your coworkers into the application, as well as managing AWS resources.


Roles & Responsibilities

  • Develop and maintain scalable, secure, and robust backend services using Python
  • Design and implement RESTful APIs and/or GraphQL endpoints
  • Integrate user-facing elements developed by front-end developers with server-side logic
  • Write reusable, testable, and efficient code
  • Optimize components for maximum performance and scalability
  • Collaborate with front-end developers, DevOps engineers, and other team members
  • Troubleshoot and debug applications
  • Implement data storage solutions (e.g., PostgreSQL, MySQL, MongoDB)
  • Ensure security and data protection

Mandatory Technical Skill Set

  • Implementing optimal data storage (e.g., PostgreSQL, MySQL, MongoDB, S3)
  • Python backend development experience
  • Design, implement, and maintain CI/CD pipelines using tools such as Jenkins, GitLab CI/CD, or GitHub Actions
  • Implemented and managed containerization platforms such as Docker and orchestration tools like Kubernetes
  • Previous hands-on experience in:
  • EC2, S3, ECS, EMR, VPC, Subnets, SQS, CloudWatch, CloudTrail, Lambda, SageMaker, RDS, SES, SNS, IAM, AWS Backup, AWS WAF
  • SQL
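"Design and implement RESTful APIs" mechanically means dispatching method-plus-path to handlers; frameworks like FastAPI or Django do this for you, but a stripped-down sketch makes the moving parts visible (routes, store, and data here are invented):

```python
import json
import re

ROUTES = []

def route(method, pattern):
    """Register a handler for METHOD plus a path pattern like /users/{id}."""
    regex = re.compile("^" + re.sub(r"\{(\w+)\}", r"(?P<\1>[^/]+)", pattern) + "$")
    def register(fn):
        ROUTES.append((method, regex, fn))
        return fn
    return register

def handle(method, path, body=None):
    """Dispatch a request to the first matching route."""
    for m, regex, fn in ROUTES:
        match = regex.match(path)
        if m == method and match:
            return fn(body=body, **match.groupdict())
    return 404, {"error": "not found"}

USERS = {"42": {"name": "Asha"}}  # in-memory stand-in for a database

@route("GET", "/users/{user_id}")
def get_user(user_id, body=None):
    user = USERS.get(user_id)
    return (200, user) if user else (404, {"error": "no such user"})

@route("POST", "/users/{user_id}")
def create_user(user_id, body=None):
    USERS[user_id] = json.loads(body)
    return 201, USERS[user_id]

print(handle("GET", "/users/42"))                      # (200, {'name': 'Asha'})
print(handle("POST", "/users/7", '{"name": "Ravi"}'))  # (201, {'name': 'Ravi'})
```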
Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
pandas
NumPy
SQL
HTML/CSS
+4 more

Specific Knowledge/Skills


  1. 4-6 years of experience.
  2. Proficiency in Python programming.
  3. Basic knowledge of front-end development.
  4. Basic knowledge of data manipulation and analysis libraries (e.g., Pandas, NumPy).
  5. Code versioning and collaboration (Git).
  6. Knowledge of libraries for extracting data from websites (web scraping).
  7. Knowledge of SQL and NoSQL databases.
  8. Familiarity with RESTful APIs.
  9. Familiarity with cloud (Azure/AWS) technologies.
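Extracting data from websites is usually done with libraries like BeautifulSoup or Scrapy; the same idea with only the standard library's html.parser (the page snippet is made up):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href from anchor tags, as a scraping library would."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = (
    '<ul><li><a href="/jobs/1">SQL Engineer</a></li>'
    '<li><a href="/jobs/2">Data Engineer</a></li></ul>'
)

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```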
Technology Industry


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
6 - 9 yrs
₹30L - ₹48L / yr
Python
React.js
NodeJS (Node.js)
TypeScript
ReAct (Reason + Act)
+13 more

Review Criteria:

  • Strong full-stack Software Engineer profile using NodeJS / Python and React
  • 6+ YOE in Software Development using Python OR NodeJS (For backend) & React (For frontend)
  • Must have strong experience in working on Typescript
  • Must have experience in message-based systems like Kafka, RabbitMq, Redis
  • Databases - PostgreSQL & NoSQL databases like MongoDB
  • Product Companies Only
  • Tier 1 Engineering Institutes (IIT, NIT, BITS, IIIT, DTU or equivalent)

 

Preferred:

  • Experience in Fin-Tech, Payment, POS and Retail products is highly preferred
  • Experience in mentoring, coaching the team.


Role & Responsibilities:

We are currently seeking a Senior Engineer to join our Financial Services team, contributing to the design and development of scalable systems.

 

The ideal candidate will be able to:

  • Take ownership of delivering performant, scalable and high-quality cloud-based software, both frontend and backend side.
  • Mentor team members to develop in line with product requirements.
  • Collaborate with Senior Architect for design and technology choices for product development roadmap.
  • Do code reviews.


Ideal Candidate:

  • Thorough knowledge of developing cloud-based software including backend APIs and react based frontend.
  • Thorough knowledge of scalable design patterns and message-based systems such as Kafka, RabbitMq, Redis, MongoDB, ORM, SQL etc.
  • Experience with AWS services such as S3, IAM, Lambda etc.
  • Expert level coding skills in Python FastAPI/Django, NodeJs, TypeScript, ReactJs.
  • Eye for user responsive designs on the frontend.


Perks, Benefits and Work Culture:

  • We prioritize people above all else. While we're recognized for our innovative technology solutions, it's our people who drive our success. That’s why we offer a comprehensive and competitive benefits package designed to support your well-being and growth:
  • Medical Insurance with coverage up to INR 8,00,000 for the employee and their family
Tarento Group

at Tarento Group

3 candid answers
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
STOCKHOLM (Sweden), Bengaluru (Bangalore)
8yrs+
Best in industry
DevOps
Microsoft Windows Server
Microsoft IIS administration
Windows Azure
Powershell
+2 more

About Tarento:

Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions.

 

We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you’ll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.


Scope of Work:

  • Support the migration of applications from Windows Server 2008 to Windows Server 2019 or 2022 in an IaaS environment.
  • Migrate IIS websites, Windows Services, and related application components.
  • Assist with migration considerations for SQL Server connections, instances, and basic data-related dependencies.
  • Evaluate and migrate message queues (MSMQ or equivalent technologies).
  • Document the existing environment, migration steps, and post-migration state.
  • Work closely with DevOps, development, and infrastructure teams throughout the project.


Required Skills & Experience:

  • Strong hands-on experience with IIS administration, configuration, and application migration.
  • Proven experience migrating workloads between Windows Server versions, ideally legacy to modern.
  • Knowledge of Windows Services setup, configuration, and troubleshooting.
  • Practical understanding of SQL Server (connection strings, service accounts, permissions).
  • Experience with queues (IBM MQ/MSMQ or similar) and their migration considerations.
  • Ability to identify migration risks, compatibility constraints, and remediation options.
  • Strong troubleshooting and analytical skills.
  • Familiarity with Microsoft technologies (.NET, etc.).
  • Networking and Active Directory-related knowledge.

Desirable / Nice-to-Have

  • Exposure to CI/CD tools, especially TeamCity and Octopus Deploy.
  • Familiarity with Azure services and related tools (Terraform, etc)
  • PowerShell scripting for automation or configuration tasks.
  • Understanding enterprise change management and documentation practices.
  • Security

Soft Skills

  • Clear written and verbal communication.
  • Ability to work independently while collaborating with cross-functional teams.
  • Strong attention to detail and a structured approach to execution.
  • Troubleshooting
  • Willingness to learn.


Location & Engagement Details

We are looking for a Senior DevOps Consultant for an onsite role in Stockholm (Sundbyberg office). This opportunity is open to candidates currently based in Bengaluru who are willing to relocate to Sweden for the assignment.

The role will start with an initial 6-month onsite engagement, with the possibility of extension based on project requirements and performance.

CoverSelf Technologies

at CoverSelf Technologies

5 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
5 - 9 yrs
Up to ₹24L / yr (varies)
Selenium
Java
SQL
NOSQL Databases
Selenium Web driver
+1 more

Qualifications:

  • Must have a Bachelor’s degree in computer science or equivalent.
  • Must have 5+ years’ experience as an SDET.
  • At least 1 year of leadership experience or experience managing a team.

Responsibilities:

  • Design, develop and execute automation scripts using open-source tools.
  • Troubleshooting any errors and streamlining the testing procedures.
  • Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
  • Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
  • Good time-management skills and commitment to meet deadlines.
  • Stay up-to-date with new testing tools and test strategies.
  • Driving technical projects and providing leadership in an innovative and fast-paced environment.

Requirements:

  • Experience in the Automation - API and UI as well as Manual Testing on Web Application.
  • Experience in frameworks like Playwright / Selenium Web Driver / Robot Framework / Rest-Assured.
  • Must be proficient in Performance Testing tools like K6 / Gatling / JMeter.
  • Must be proficient in Core Java / TypeScript and Java 17.
  • Experience in JUnit-5.
  • Good to have TypeScript experience.
  • Good to have RPA Experience using Java or any other tools like Robot Framework / Automation Anywhere.
  • Experience in SQL (like MySQL, PG) & No-SQL Database (like MongoDB).
  • Good understanding of software & systems architecture.
  • Well acquainted with Agile Methodology, Software Development Life Cycle (SDLC), Software Test Life Cycle (STLC) and Automation Test Life Cycle.
  • Strong experience in REST-based component testing, back-end, DB, and microservices testing.
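API-test automation in the Rest-Assured style translates to Python as a client call plus assertions; this sketch stubs the HTTP client so it is self-contained (endpoints and payloads are invented):

```python
import unittest

def fake_api(method, path):
    """Stub standing in for a real HTTP client (requests / Rest-Assured)."""
    if method == "GET" and path == "/api/claims/101":
        return 200, {"claim_id": 101, "status": "APPROVED"}
    return 404, {"error": "not found"}

class ClaimApiTests(unittest.TestCase):
    def test_get_claim_ok(self):
        status, body = fake_api("GET", "/api/claims/101")
        self.assertEqual(status, 200)
        self.assertEqual(body["status"], "APPROVED")

    def test_unknown_claim_404(self):
        status, body = fake_api("GET", "/api/claims/999")
        self.assertEqual(status, 404)

suite = unittest.TestLoader().loadTestsFromTestCase(ClaimApiTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Against a live service, only `fake_api` changes; the test cases stay the same, which is the point of separating the client from the assertions.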


Work Location: Jayanagar - Bangalore.

Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Indore, Pune, Bhopal, Mumbai, Nagpur, Kolkata, Bengaluru (Bangalore), Chennai
4 - 6 yrs
₹4.5L - ₹18L / yr
Java
Spring Boot
Microservices
SQL

🚀 Hiring: Java Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai

⭐ Work Mode: Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Requirements

✅ Strong proficiency in Java (Java 8/11/17)

✅ Experience with Spring / Spring Boot

✅ Knowledge of REST APIs, Microservices architecture

✅ Familiarity with SQL/NoSQL databases

✅ Understanding of Git, CI/CD pipelines

✅ Problem-solving skills and attention to detail


Wissen Technology

at Wissen Technology

4 recruiters
Shivangi Bhattacharyya
Posted by Shivangi Bhattacharyya
Bengaluru (Bangalore)
6 - 10 yrs
Best in industry
Python
Generative AI
Machine Learning (ML)
SQL
Business Intelligence (BI)
+1 more

Job Description: 


Experience Range: 6 to 10 years


Qualifications:


  • Minimum Bachelor’s degree in Engineering, Computer Applications, or AI/Data Science.
  • Experience working in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years.
  • Prior experience in Python, NumPy, Scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred.


Require Skills: 

  • Must have: Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration.
  • Must have: Experience working with ML libraries such as Scikit-learn, TensorFlow, PyTorch, Pandas, and NumPy.
  • Must have: Experience working with models such as Random Forest, K-means clustering, and BERT.
  • Should have: Exposure to querying warehouses and APIs.
  • Should have: Experience writing moderate to complex SQL queries.
  • Should have: Experience analyzing and presenting data with BI tools or Excel.
  • Must have: Very strong communication skills to work with technical and non-technical stakeholders in a global environment.
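K-means clustering, named in the must-haves above, can be written from scratch in a few lines (scikit-learn's KMeans is the production choice; this sketch uses a naive first-k initialization rather than k-means++, and the data is invented):

```python
def kmeans(points, k, iters=10):
    """Plain K-means on 2-D points with first-k initialization."""
    centroids = list(points[:k])
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for x, y in points:
            i = min(range(k),
                    key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2)
            clusters[i].append((x, y))
        # Update step: move each centroid to its cluster mean.
        for i, cluster in enumerate(clusters):
            if cluster:
                centroids[i] = (sum(p[0] for p in cluster) / len(cluster),
                                sum(p[1] for p in cluster) / len(cluster))
    return centroids, clusters

# Two obvious blobs: near (0, 0) and near (10, 10).
data = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(data, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```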

 

Roles and Responsibilities:

  • Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
  • Analyse and present insights about the data and processes to Business Stakeholders
  • Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
  • Develop and deploy customized models on Production data sets to generate analytical insights and predictions
  • Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
  • Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
  • Share knowledge and best practices with broader teams to make everyone aware and more productive.


Industry Automation


Agency job
via Michael Page by Pramod P
Bengaluru (Bangalore)
5 - 9 yrs
₹20L - ₹30L / yr
C#
Microsoft Windows Azure
API
SQL
NOSQL Databases
+3 more

Your job:

• Develop and maintain software components, including APIs and microservices

• Optimize backend systems on Microsoft Azure using App Services, Functions, and Azure SQL

• Contribute to frontend development as needed in a full-stack capacity

• Participate in code reviews, unit testing, and bug fixing to ensure high code quality

• Collaborate with the development team, product owner, and DevOps team in agile projects

• Maintain clear and comprehensive technical documentation for all features and APIs


Your qualification:

• Master’s or bachelor’s degree in computer science

• 5 to 8yearsofexperienceinbackendwebapplicationdevelopment

• Expertise in backend technologies such as C#/.NET Core and in databases, including SQL and NoSQL (AzureSQL, Cosmos DB)

• Experience with Microsoft Azure services (App Services, Functions, SQL) and familiarity with frontend technologies (JavaScript/TypeScript and/ or Angular) would be an added advantage

• Proficiency in cloud-based backend development, full-stack development, and software optimization

• Experience with agile methodologies, unit testing, automated testing, and CI/CD pipelines would be beneficial

• Excellent written and spoken English communication skills

Read more
Aryush Infotech India Pvt Ltd
Nitin Gupta
Posted by Nitin Gupta
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹4L / yr
Fintech
Test Automation (QA)
Manual testing
skill iconPostman
JIRA
+5 more

Job Title: QA Tester – FinTech (Manual + Automation Testing)

Location: Bangalore, India

Job Type: Full-Time

Experience Required: 3 Years

Industry: FinTech / Financial Services

Function: Quality Assurance / Software Testing

 

About the Role:

We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.

 

Key Responsibilities:

  • Analyze business and functional requirements for financial products and translate them into test scenarios.
  • Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
  • Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
  • Conduct API testing using Postman, Rest Assured, or similar tools.
  • Perform functional, regression, integration, and system testing across web and mobile platforms.
  • Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
  • Log and track defects using JIRA or a similar defect management tool.
  • Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
  • Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
  • Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
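The API-testing responsibilities above can be sketched as a small response-validation routine; the field names and status values are hypothetical, and in a real suite the payload would come from Postman, Rest Assured, or a `requests` call rather than a hard-coded dict:

```python
def validate_payment_response(resp_status, payload):
    """Minimal functional checks a tester might script for a payments API:
    status code, required fields, and basic data-integrity rules."""
    errors = []
    if resp_status != 200:
        errors.append(f"expected HTTP 200, got {resp_status}")
    for field in ("transaction_id", "amount", "currency", "status"):
        if field not in payload:
            errors.append(f"missing field: {field}")
    if payload.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    if payload.get("status") not in {"SUCCESS", "PENDING", "FAILED"}:
        errors.append(f"unexpected status: {payload.get('status')}")
    return errors

# Simulated responses, standing in for real HTTP calls.
ok = validate_payment_response(200, {
    "transaction_id": "TXN-1001", "amount": 250.0,
    "currency": "INR", "status": "SUCCESS",
})
bad = validate_payment_response(500, {"amount": -5})
print(ok)        # → []
print(len(bad))  # → 6
```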

 

Required Skills and Experience:

  • 3+ years of hands-on experience in manual and automation testing.
  • Solid understanding of QA methodologies, STLC, and SDLC.
  • Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
  • Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
  • Knowledge of API testing, including RESTful services.
  • Familiarity with SQL to validate data in databases.
  • Understanding of CI/CD processes and basic scripting for automation integration.
  • Good problem-solving skills and attention to detail.
  • Excellent communication and documentation skills.
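As an illustration of the SQL-based data validation listed above, the following uses an in-memory SQLite database; the `transactions` table and its columns are invented for the example, not taken from any real schema:

```python
import sqlite3

# In-memory database standing in for the application's backend store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (id INTEGER PRIMARY KEY, user_id INT, amount REAL, status TEXT);
INSERT INTO transactions VALUES
  (1, 10, 500.0, 'SETTLED'),
  (2, 10, 250.0, 'SETTLED'),
  (3, 11, 999.0, 'FAILED');
""")

# Validation query a tester might run after a payments flow:
# settled totals per user, to compare against what the UI reports.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS settled_total
    FROM transactions
    WHERE status = 'SETTLED'
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # → [(10, 750.0)]
```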

 

Preferred Qualifications:

  • Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
  • Experience with mobile app testing (iOS/Android).
  • Working knowledge of test management tools like TestRail, Zephyr, or Xray.
  • Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
  • Basic knowledge of version control systems (e.g., Git).


Read more
AI-First Company

AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data engineering
Data architecture
SQL
Data modeling
GCS
+47 more

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data Lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Read more
LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 5 yrs
₹4L - ₹8L / yr
skill iconPython
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
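A minimal sketch of the Python automation described above: parsing application logs to count errors per component, the kind of repetitive triage an L2 engineer would script. The log format and component names are illustrative assumptions:

```python
import re
from collections import Counter

def summarize_errors(log_lines):
    """Count ERROR occurrences per component from application log lines."""
    pattern = re.compile(r"ERROR\s+\[(?P<component>[\w.-]+)\]")
    counts = Counter()
    for line in log_lines:
        m = pattern.search(line)
        if m:
            counts[m.group("component")] += 1
    return counts

sample = [
    "2024-05-01 10:00:01 INFO  [api-gateway] request ok",
    "2024-05-01 10:00:02 ERROR [billing-svc] timeout calling bank API",
    "2024-05-01 10:00:05 ERROR [billing-svc] retry exhausted",
    "2024-05-01 10:00:09 ERROR [api-gateway] 502 from upstream",
]
top = summarize_errors(sample)
print(top.most_common())  # → [('billing-svc', 2), ('api-gateway', 1)]
```

In practice the lines would be streamed from a log file or a tool like Kibana/Splunk export rather than a hard-coded list.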

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other scripting languages like Bash or Shell
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)


Read more
LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 4 yrs
₹4L - ₹10L / yr
skill iconPython
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other scripting languages like Bash or Shell
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)


Read more
Bengaluru (Bangalore)
1 - 4 yrs
₹5L - ₹15L / yr
skill iconDjango
skill iconFlask
skill iconHTML/CSS
SQL

Job Responsibilities :


- Work closely with product managers and other cross functional teams to help define, scope and deliver world-class products and high quality features addressing key user needs.


- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.


- Ability to take full ownership of the software development lifecycle from requirement to release.


- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.


- Embrace design and code reviews to deliver quality code.


- Play a key role in taking Trendlyne to the next level as a world-class engineering team


-Develop and iterate on best practices for the development team, ensuring adherence through code reviews.


- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.


- Develop and maintain scalable, robust backend systems using Python and Django framework.


- Proficient understanding of the performance of web and mobile applications.


- Mentor junior developers and foster skill development within the team.


Job Requirements :


- 1+ years of experience with Python and Django.


- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.


- (Optional) : Experience with web front-end technologies such as JavaScript, HTML, and CSS


Who are we :


Trendlyne is a Series-A product startup in the financial markets space with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.


Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.


What do we do :


We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We deliver 900 million+ API calls every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.

Read more
Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
skill iconPHP
skill iconPython
Data Visualization
Data Structures
+5 more

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill icon.NET
skill iconC#
SQL

Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.

Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Read more
Euphoric Thought Technologies
Bengaluru (Bangalore)
8 - 12 yrs
₹18L - ₹25L / yr
Dot Net
Windows Azure
SQL
skill iconC#
Web api
+2 more

Skills required:

  • Strong expertise in .NET Core / ASP.NET MVC
  • Candidate must have 8+ years of experience in Dot Net.
  • Candidate must have experience with Angular.
  • Hands-on experience with Entity Framework & LINQ
  • Experience with SQL Server (performance tuning, stored procedures, indexing)
  • Understanding of multi-tenancy architecture
  • Experience with Microservices / API development (REST, GraphQL)
  • Hands-on experience in Azure Services (App Services, Azure SQL, Blob Storage, Key Vault, Functions, etc.)
  • Experience in CI/CD pipelines with Azure DevOps
  • Knowledge of security best practices in cloud-based applications
  • Familiarity with Agile/Scrum methodologies
  • Flexible to use Copilot or other AI tools to write automated test cases and speed up code writing

Roles and Responsibilities:

- Good communication skills are a must.

- Develop features across multiple subsystems within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, and deployment.

- Understand how our applications operate, are structured, and how customers use them

- Provide engineering support (when necessary) to our technical operations staff when they are building, deploying, configuring, and supporting systems for customers.

Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
SaaS
Software implementation
Customer Success
Implementation
Tech Support
+8 more

Review Criteria

  • Strong Implementation Manager / Customer Success Implementation / Technical Solutions / Post-Sales SaaS Delivery
  • 3+ years of hands-on experience in software/tech Implementation roles within technical B2B SaaS companies, preferably working with global or US-based clients
  • Must have direct experience leading end-to-end SaaS product implementations — including onboarding, workflow configuration, API integrations, data setup, and customer training
  • Must have strong technical understanding — including ability to read and write basic SQL queries, debug API workflows, and interpret JSON payloads for troubleshooting or configuration validation.
  • Must have worked in post-sales environments, owning customer success and delivery after deal closure, ensuring product adoption, accurate setup, and smooth go-live.
  • Must have experience collaborating cross-functionally with product, engineering, and sales teams to ensure timely resolution of implementation blockers and seamless client onboarding.
  • (Company): B2B SaaS startup or growth-stage company
  • Mandatory (Note): Good growth opportunity, this role will have team leading option after a few months
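As an illustration of the SQL/JSON debugging skills listed above, here is a small JSON-payload sanity check in Python; the invoice fields (`line_items`, `qty`, `unit_price`, `total`) are hypothetical names chosen for the example, not any specific product's schema:

```python
import json

def check_invoice_payload(raw_json):
    """Sanity-check a JSON payload during onboarding or debugging:
    parse it and verify the fields a billing configuration depends on."""
    payload = json.loads(raw_json)
    issues = []
    if not payload.get("line_items"):
        issues.append("no line_items in payload")
    # Recompute the total from line items and compare to the reported total.
    computed = sum(i["qty"] * i["unit_price"] for i in payload.get("line_items", []))
    if abs(computed - payload.get("total", 0)) > 0.01:
        issues.append(f"total mismatch: computed {computed}, reported {payload.get('total')}")
    return issues

sample = ('{"invoice_id": "INV-7", "total": 300.0, '
          '"line_items": [{"qty": 2, "unit_price": 100.0}, '
          '{"qty": 1, "unit_price": 100.0}]}')
print(check_invoice_payload(sample))  # → []
```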


Preferred

  • Preferred (Experience): Previous experience in FinTech SaaS like BillingTech, finance automation, or subscription management platforms will be a strong plus


Job Specific Criteria

  • CV Attachment is mandatory
  • Are you open to work in US timings (4/5:00 PM - 3:00 AM) - to target the US market?
  • Please provide CTC Breakup (Fixed + Variable)?
  • It’s a hybrid role with 1-3 days work from office (Indiranagar), with in-office hours 3:00 pm to 10:00 pm IST; are you ok with hybrid mode?

 

Role & Responsibilities

As the new hire in this role, you'll be the voice of the customer in the company, and lead the charge in developing our customer-centric approach, working closely with our tech, design, and product teams.

 

What you will be doing:

You will be responsible for converting, onboarding, managing, and proactively ensuring success for our customers/prospective clients.

  • Implementation
  • Understand client billing models and configure company contracts, pricing, metering, and invoicing accurately.
  • Lead pilots and implementation for new customers, ensuring complete onboarding within 3–8 weeks.
  • Translate complex business requirements into structured company workflows and setup.
  • Pre-sales & Technical Discovery
  • Support sales with live demos, sandbox setups, and RFP responses.
  • Participate in technical discovery calls to map company capabilities to client needs.
  • Create and maintain demo environments showcasing relevant use cases.
  • Internal Coordination & Escalation
  • Act as the voice of the customer internally — share structured feedback with product and engineering.
  • Create clear, well-scoped handoff documents when working with technical teams.
  • Escalate time-sensitive issues appropriately and follow through on resolution.
  • Documentation & Enablement
  • Create client-specific documentation (e.g., onboarding guides, configuration references).
  • Contribute to internal wikis, training material, and product documentation.
  • Write simple, to-the-point communication — clear enough for a CXO and detailed enough for a developer.

 

Ideal Candidate

  • 3-7 years of relevant experience
  • Willing to work in US time zone (until ~4:30 am IST) on weekdays (Mon-Fri)
  • Ability to understand and shape the product at a granular level
  • Ability to empathize with the customers, and understand their pain points
  • Understanding of SaaS architecture and APIs conceptually — ability to debug API workflows and usage issues
  • Previous experience in Salesforce CRM
  • Entrepreneurial drive, and willingness to wear multiple hats as per company’s requirements
  • Strong analytical skills and a structured problem-solving approach
  • (Strongly preferred) Computer science background and basic coding experience
  • Ability to understand functional aspects related to the product e.g., accounting/revenue recognition, receivables, billing etc
  • Self-motivated and proactive in managing tasks and responsibilities, requiring minimal follow-ups.
  • Self-driven individual with high ownership and strong work ethic
  • Not taking yourself too seriously.


Read more
LogIQ Labs Pvt.Ltd.
Bengaluru (Bangalore), Pune, Hyderabad, Noida
3 - 5 yrs
₹4L - ₹10L / yr
Playwright
SQL

Functional Testing & Validation

  • Web Application Testing: Design, document, and execute comprehensive functional test plans and test cases for complex, highly interactive web applications, ensuring they meet specified requirements and provide an excellent user experience.
  • Backend API Testing: Possess deep expertise in validating backend RESTful and/or SOAP APIs. This includes testing request/response payloads, status codes, data integrity, security, and robust error handling mechanisms.
  • Data Validation with SQL: Write and execute complex SQL queries (joins, aggregations, conditional logic) to perform backend data checks, verify application states, and ensure data integrity across integration points.
  • UI Automation (Playwright & TypeScript):
  • Design, develop, and maintain robust, scalable, and reusable UI automation scripts using Playwright and TypeScript.
  • Integrate automation suites into Continuous Integration/Continuous Deployment (CI/CD) pipelines.
  • Implement advanced automation patterns and frameworks (e.g., Page Object Model) to enhance maintainability.
  • Prompt-Based Automation: Demonstrate familiarity or hands-on experience with emerging AI-driven or prompt-based automation approaches and tools to accelerate test case generation and execution.
  • API Automation: Develop and maintain automated test suites for APIs to ensure reliability and performance.
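The Page Object Model mentioned above can be sketched as follows (shown in Python for brevity; the same structure applies in TypeScript). A stub stands in for Playwright's real `page` object so the pattern runs without a browser, and all selectors and method names are illustrative:

```python
class StubPage:
    """Stand-in for a Playwright page so the pattern runs without a browser."""
    def __init__(self):
        self.actions = []

    def fill(self, selector, value):
        self.actions.append(("fill", selector, value))

    def click(self, selector):
        self.actions.append(("click", selector))


class LoginPage:
    """Page Object: selectors and user flows live here, not in the tests,
    so a UI change is fixed in one place."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, page):
        self.page = page

    def login(self, user, password):
        self.page.fill(self.USERNAME, user)
        self.page.fill(self.PASSWORD, password)
        self.page.click(self.SUBMIT)


page = StubPage()
LoginPage(page).login("qa_user", "secret")
print(page.actions[-1])  # → ('click', 'button[type=submit]')
```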

Performance & Load Testing

  • JMeter Proficiency: Utilize Apache JMeter to design, script, and execute robust API load testing and stress testing scenarios.
  • Analyse performance metrics, identify bottlenecks (e.g., response time, throughput), and provide actionable reports to development teams.


🛠️ Required Skills and Qualifications

  • Experience: 4+ years of professional experience in Quality Assurance and Software Testing, with a strong focus on automation.
  • Automation Stack: Expert-level proficiency in developing and maintaining automation scripts using Playwright and TypeScript.
  • Testing Tools: Proven experience with API testing tools (e.g., Postman, Swagger) and strong functional testing methodologies.
  • Database Skills: Highly proficient in writing and executing complex SQL queries for data validation and backend verification.
  • Performance: Hands-on experience with Apache JMeter for API performance and load testing.
  • Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams (Developers, Product Managers).
  • Problem-Solving: Strong analytical and debugging skills to efficiently isolate and report defects.


Read more
AryuPay Technologies
Bhavana Chaudhari
Posted by Bhavana Chaudhari
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹5L / yr
Search Engine Optimization (SEO)
SQL
On-page Optimization
off page seo
skill iconGoogle Analytics
+3 more

Job Description – SEO Specialist

Company: Capace Software Pvt. Ltd.

Location: Bhopal / Bangalore (On-site)

Experience: 2+ Years

Budget: Up to ₹4 LPA

Position: Full-Time


About the Role

Capace Software Pvt. Ltd. is looking for a skilled SEO Specialist with strong expertise in On-Page SEO, Off-Page SEO, and Technical SEO. The ideal candidate will be responsible for improving our search engine ranking, driving organic traffic, and ensuring technical search requirements are met across websites.


Key Responsibilities

🔹 On-Page SEO

  • Optimize meta titles, descriptions, header tags, and URLs
  • Conduct in-depth keyword research and implement strategic keyword placement
  • Optimize website content for relevancy and readability
  • Implement internal linking strategies
  • Optimize images, schema, and site structure for SEO
  • Ensure webpages follow SEO best practices

🔹 Off-Page SEO

  • Create and execute backlink strategies
  • Manage directory submissions, social bookmarking, classified listings
  • Conduct competitor backlink analysis
  • Build high-quality guest post links and outreach
  • Improve brand visibility through digital promotions


🔹 Technical SEO

  • Conduct website audits (crawl errors, index issues, technical fixes)
  • Optimize website speed and performance
  • Implement schema markup and structured data
  • Manage XML sitemaps and robots.txt
  • Resolve indexing, crawling, and canonical issues
  • Work with developers to implement technical updates


Requirements

  • Minimum 2+ years of experience in SEO
  • Strong knowledge of On-Page, Off-Page & Technical SEO
  • Experience with tools like:
  • Google Analytics
  • Google Search Console
  • Ahrefs / SEMrush / Ubersuggest
  • Screaming Frog (good to have)
  • Understanding of HTML, CSS basics (preferred)
  • Strong analytical and reporting skills
  • Good communication and documentation skills


What We Offer

  • Competitive salary up to ₹4 LPA
  • Opportunity to work on multiple SaaS products and websites
  • Supportive team & learning-focused environment
  • Career growth in digital marketing & SEO domain
Read more
Tarento Group

at Tarento Group

3 candid answers
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
4yrs+
Best in industry
skill iconJava
skill iconSpring Boot
Microservices
Windows Azure
RESTful APIs
+5 more

Job Summary:

We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.

Key Responsibilities:

  • Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
  • Implement and maintain RESTful APIs, ensuring high performance and scalability.
  • Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
  • Develop and manage Docker containers, enabling efficient development and deployment pipelines.
  • Integrate messaging services like Apache Kafka into microservice architectures.
  • Design and maintain data models using PostgreSQL or other SQL databases.
  • Implement unit testing using JUnit and mocking frameworks to ensure code quality.
  • Develop and execute API automation tests using Cucumber or similar tools.
  • Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
  • Work with Kubernetes for orchestrating containerized services.
  • Utilize Couchbase or similar NoSQL technologies when necessary.
  • Participate in code reviews, design discussions, and contribute to best practices and standards.

Required Skills & Qualifications:

  • Strong experience in Java (11 or above) and Spring Boot framework.
  • Solid understanding of microservices architecture and deployment on Azure.
  • Hands-on experience with Docker, and exposure to Kubernetes.
  • Proficiency in Kafka, with real-world project experience.
  • Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
  • Experience in writing unit tests using JUnit and mocking tools.
  • Experience with Cucumber or similar frameworks for API automation testing.
  • Exposure to CI/CD tools, DevOps processes, and Git-based workflows.

Nice to Have:

  • Azure certifications (e.g., Azure Developer Associate)
  • Familiarity with Couchbase or other NoSQL databases.
  • Familiarity with other cloud providers (AWS, GCP)
  • Knowledge of observability tools (Prometheus, Grafana, ELK)

Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication.
  • Ability to work in an agile environment and contribute to continuous improvement.

Why Join Us:

  • Work on cutting-edge microservice architectures
  • Strong learning and development culture
  • Opportunity to innovate and influence technical decisions
  • Collaborative and inclusive work environment
Bengaluru (Bangalore)
6 - 10 yrs
₹15L - ₹28L / yr
Business Analysis
Data integration
SQL
PMS
CRS
+2 more

Job Description: Business Analyst – Data Integrations

Location: Bangalore / Hybrid / Remote

Company: LodgIQ

Industry: Hospitality / SaaS / Machine Learning

About LodgIQ

Headquartered in New York, LodgIQ delivers a revolutionary B2B SaaS platform to the travel industry. By leveraging machine learning and artificial intelligence, we enable precise forecasting and optimized pricing for hotel revenue management. Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded, high-growth startup with a global presence.

About the Role

We’re looking for a skilled Business Analyst – Data Integrations who can bridge the gap between business operations and technology teams, ensuring smooth, efficient, and scalable integrations. If you’re passionate about hospitality tech and enjoy solving complex data challenges, we’d love to hear from you!

What You’ll Do

Key Responsibilities

  • Collaborate with vendors to gather requirements for API development and ensure technical feasibility.
  • Collect API documentation from vendors; document and explain business logic to use external data sources effectively.
  • Access vendor applications to create and validate sample data; ensure the accuracy and relevance of test datasets.
  • Translate complex business logic into documentation for developers, ensuring clarity for successful integration.
  • Monitor all integration activities and support tickets in Jira, proactively resolving critical issues.
  • Lead QA testing for integrations, overseeing pilot onboarding and ensuring solution viability before broader rollout.
  • Document onboarding processes and best practices to streamline future integrations and improve efficiency.
  • Build, train, and deploy machine learning models for forecasting, pricing, and optimization, supporting strategic goals.
  • Drive end-to-end execution of data integration projects, including scoping, planning, delivery, and stakeholder communication.
  • Gather and translate business requirements into actionable technical specifications, liaising with business and technical teams.
  • Oversee maintenance and enhancement of existing integrations, performing RCA and resolving integration-related issues.
  • Document workflows, processes, and best practices for current and future integration projects.
  • Continuously monitor system performance and scalability, recommending improvements to increase efficiency.
  • Coordinate closely with Operations for onboarding and support, ensuring seamless handover and issue resolution.

Desired Skills & Qualifications

  • Strong experience in API integration, data analysis, and documentation.
  • Familiarity with Jira for ticket management and project workflow.
  • Hands-on experience with machine learning model development and deployment.
  • Excellent communication skills for requirement gathering and stakeholder engagement.
  • Experience with QA test processes and pilot rollouts.
  • Proficiency in project management, data workflow documentation, and system monitoring.
  • Ability to manage multiple integrations simultaneously and work cross-functionally.

Required Qualifications

  • Experience: Minimum 4 years in hotel technology or business analytics, preferably handling data integration or system interoperability projects.
  • Technical Skills:
  • Basic proficiency in SQL or database querying.
  • Familiarity with data integration concepts such as APIs or ETL workflows (preferred but not mandatory).
  • Eagerness to learn and adapt to new tools, platforms, and technologies.
  • Hotel Technology Expertise: Understanding of systems such as PMS, CRS, Channel Managers, or RMS.
  • Project Management: Strong organizational and multitasking abilities.
  • Problem Solving: Analytical thinker capable of troubleshooting and driving resolution.
  • Communication: Excellent written and verbal skills to bridge technical and non-technical discussions.
  • Attention to Detail: Methodical approach to documentation, testing, and deployment.

Preferred Qualifications

  • Exposure to debugging tools and troubleshooting methodologies.
  • Familiarity with cloud environments (AWS).
  • Understanding of data security and privacy considerations in the hospitality industry.

Why LodgIQ?

  • Join a fast-growing, mission-driven company transforming the future of hospitality.
  • Work on intellectually challenging problems at the intersection of machine learning, decision science, and human behavior.
  • Be part of a high-impact, collaborative team with the autonomy to drive initiatives from ideation to production.
  • Competitive salary and performance bonuses.
  • For more information, visit https://www.lodgiq.com

Global digital transformation solutions provider.


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹28L / yr
Databricks
Python
SQL
PySpark
Amazon Web Services (AWS)
+9 more

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.


Skill Examples:

  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage retrieval and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.

 

Knowledge Examples:

  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  2. Proficient in SQL for analytics and windowing functions.
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.


 

Additional Comments:

# of Resources: 22 | Role(s): Technical Role | Location(s): India | Planned Start Date: 1/1/2026 | Planned End Date: 6/30/2026

Project Overview:

Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the 1st week of Dec 2025.

The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.

  • Design, build, and maintain scalable data pipelines using Databricks and PySpark.
  • Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
  • Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
  • Ensure data quality, performance, and reliability across data workflows.
  • Participate in code reviews, data architecture discussions, and performance optimization initiatives.
  • Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.


Key Skills:

  • Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
  • Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
  • Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
  • Experience with data modeling, schema design, and performance optimization.
  • Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
  • Excellent problem-solving, communication, and collaboration skills.
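Since the skills above single out CTEs and window functions, here is a minimal, self-contained sketch of that pattern. It uses Python's built-in sqlite3 module as a stand-in for a Databricks SQL engine, and the orders table and its values are invented purely for illustration.

```python
import sqlite3

# Hypothetical sales data, used only to illustrate the CTE + window-function
# pattern; the table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('APAC', 100), ('APAC', 250), ('EMEA', 300), ('EMEA', 80);
""")

query = """
WITH regional AS (                      -- CTE: aggregate once, reuse below
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rnk   -- window function
FROM regional;
"""
for row in conn.execute(query):
    print(row)
```

The CTE keeps the aggregation readable and reusable, while `RANK() OVER (...)` adds a per-row ranking that a plain GROUP BY cannot express on its own.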

 

Skills: Databricks, PySpark & Python, SQL, AWS Services

 

Must-Haves

  • Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
  • Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.
  • Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
  • Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
  • Experience with data modeling, schema design, and performance optimization.
  • Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).


******

Notice period - Immediate to 15 days

Location: Bangalore

Loyalytics

at Loyalytics

2 recruiters
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 7 yrs
Upto ₹22L / yr (Varies)
SQL
PowerBI
Data Analytics
Customer Relationship Management (CRM)

In this role, you will drive and support customer analytics for HP’s online store business across the APJ region. You will lead campaign performance analytics, customer database intelligence, and enable data-driven targeting for automation and trigger programs. Your insights will directly shape customer engagement, marketing strategy, and business decision-making.


You will be part of the International Customer Management team, which focuses on customer strategy, base value, monetization, and brand consideration. As part of HP’s Digital Direct organization, you will support the company’s strategic transformation toward direct-to-customer excellence.


Join HP—a US$50B global technology leader known for innovation and being #1 in several business domains.


Key Responsibilities

Customer Insights & Analytics

  • Design and deploy customer success and engagement metrics across APJ.
  • Analyze customer behavior and engagement to drive data-backed marketing decisions.
  • Apply statistical techniques to translate raw data into meaningful insights.

Campaign Performance & Optimization

  • Elevate marketing campaigns across APJ by enabling advanced targeting criteria, performance monitoring, and test-and-learn frameworks.
  • Conduct campaign measurement, identifying trends, patterns, and optimization opportunities.

Data Management & Reporting

  • Develop a deep understanding of business data across markets.
  • Build and maintain SQL-based data assets: tables, stored procedures, scripts, queries, and SQL views.
  • Provide reporting and dashboards for marketing, sales, and CRM teams using Tableau or Power BI.
  • Measure and monitor strategic initiatives against KPIs and provide uplift forecasts for prioritization.

Required Experience

  • 4+ years of relevant experience (flexible for strong profiles).
  • Proficiency in SQL, including:
  • Database design principles
  • Query optimization
  • Data integrity checks
  • Building SQL views, stored procedures, and analytics-ready datasets
  • Experience translating analytics into business outcomes.
  • Hands-on experience analyzing campaign performance.
  • Expertise with data visualization tools such as Tableau or Power BI.
  • Experience with campaign management/marketing automation platforms (preferably Salesforce Marketing Cloud).
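As an illustration of the "SQL views and analytics-ready datasets" requirement above, the sketch below builds a small reporting view with Python's built-in sqlite3 module. The sends table, its metrics, and the campaign_kpis view name are all hypothetical.

```python
import sqlite3

# Illustrative only: a tiny campaign table and a reusable SQL view of the
# kind the role describes; all names and numbers are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sends (campaign TEXT, sent INTEGER, opened INTEGER, converted INTEGER);
    INSERT INTO sends VALUES
        ('spring_promo', 1000, 300, 30),
        ('clearance',     500, 100,  5);

    -- A view gives BI tools (Tableau / Power BI) a stable, pre-computed dataset.
    CREATE VIEW campaign_kpis AS
    SELECT campaign,
           ROUND(100.0 * opened    / sent, 1) AS open_rate_pct,
           ROUND(100.0 * converted / sent, 1) AS conversion_rate_pct
    FROM sends;
""")

for row in conn.execute("SELECT * FROM campaign_kpis ORDER BY campaign"):
    print(row)
```

Because the rate arithmetic lives in the view, every dashboard that selects from `campaign_kpis` computes open and conversion rates identically, which is the practical point of an analytics-ready dataset.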

About You

  • Strong advocate of customer data–driven marketing.
  • Comfortable working hands-on with data and solving complex problems.
  • Confident communicator who can work with multiple cross-functional stakeholders.
  • Passionate about experimentation (test & learn) and continuous improvement.
  • Self-driven, accountable, and motivated by ownership.
  • Thrive in a diverse, international, dynamic environment.


Global digital transformation solutions provider.


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Kochi (Cochin), Trivandrum, Hyderabad, Thiruvananthapuram
8 - 10 yrs
₹10L - ₹25L / yr
Business Analysis
Data Visualization
PowerBI
SQL
Tableau
+18 more

Job Description – Senior Technical Business Analyst

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST

 

About the Role

We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for candidates who have a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.

As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.

 

Key Responsibilities

Business & Analytical Responsibilities

  • Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
  • Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
  • Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
  • Break down business needs into concise, actionable, and development-ready user stories in Jira.

Data & Technical Responsibilities

  • Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
  • Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
  • Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
  • Validate and ensure data quality, consistency, and accuracy across datasets and systems.

Collaboration & Execution

  • Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
  • Assist in development, testing, and rollout of data-driven solutions.
  • Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.

 

Required Skillsets

Core Technical Skills

  • 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
  • Data Analytics: SQL, descriptive analytics, business problem framing.
  • Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
  • Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
  • Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.
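As a small illustration of the descriptive-analytics skill listed above, the sketch below summarizes a metric with Python's standard statistics module before any business-problem framing; the booking numbers are invented.

```python
import statistics

# Hypothetical daily booking counts for one week, used only to show the
# "describe the distribution first" step of exploratory analysis.
daily_bookings = [42, 38, 51, 47, 39, 60, 44]

summary = {
    "mean": statistics.mean(daily_bookings),
    "median": statistics.median(daily_bookings),
    "stdev": round(statistics.stdev(daily_bookings), 2),  # sample std dev
}
print(summary)
```

In practice the same summary would come from an aggregate SQL query or a notebook; the point is to describe the distribution before framing a model around it.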

 

Soft Skills

  • Strong analytical thinking and structured problem-solving capability.
  • Ability to convert business problems into clear technical requirements.
  • Excellent communication, documentation, and presentation skills.
  • High curiosity, adaptability, and eagerness to learn new tools and techniques.

 

Educational Qualifications

  • BE/B.Tech or equivalent in:
  • Computer Science / IT
  • Data Science

 

What We Look For

  • Demonstrated passion for data and analytics through projects and certifications.
  • Strong commitment to continuous learning and innovation.
  • Ability to work both independently and in collaborative team environments.
  • Passion for solving business problems using data-driven approaches.
  • Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.

 

Why Join Us?

  • Exposure to modern data platforms, analytics tools, and AI technologies.
  • A culture that promotes innovation, ownership, and continuous learning.
  • Supportive environment to build a strong career in data and analytics.

 

Skills: Data Analytics, Business Analysis, SQL


Must-Haves

Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R

 

******

Notice period - 0 to 15 days (Max 30 Days)

Educational Qualifications: BE/B.Tech or equivalent in Computer Science / IT or Data Science

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST

Albert Invent

at Albert Invent

4 candid answers
3 recruiters
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹30L / yr (Varies)
Python
AWS Lambda
Amazon Redshift
Snowflake schema
SQL

To design, build, and optimize scalable data infrastructure and pipelines that enable efficient data collection, transformation, and analysis across the organization. The Senior Data Engineer will play a key role in driving data architecture decisions, ensuring data quality and availability, and empowering analytics, product, and engineering teams with reliable, well-structured data to support business growth and strategic decision-making.


Responsibilities:

  • Develop and maintain SQL and NoSQL databases, ensuring high performance, scalability, and reliability.
  • Collaborate with the API team and Data Science team to build robust data pipelines and automations.
  • Work closely with stakeholders to understand database requirements and provide technical solutions.
  • Optimize database queries and performance tuning to enhance overall system efficiency.
  • Implement and maintain data security measures, including access controls and encryption.
  • Monitor database systems and troubleshoot issues proactively to ensure uninterrupted service.
  • Develop and enforce data quality standards and processes to maintain data integrity.
  • Create and maintain documentation for database architecture, processes, and procedures.
  • Stay updated with the latest database technologies and best practices to drive continuous improvement.
  • Expertise in SQL queries and stored procedures, with the ability to optimize and fine-tune complex queries for performance and efficiency.
  • Experience with monitoring and visualization tools such as Grafana to monitor database performance and health.

Requirements:

  • 4+ years of experience in data engineering, with a focus on large-scale data systems.
  • Proven experience designing data models and access patterns across SQL and NoSQL ecosystems.
  • Hands-on experience with technologies like PostgreSQL, DynamoDB, S3, GraphQL, or vector databases.
  • Proficient in SQL stored procedures with extensive expertise in MySQL schema design, query optimization, and resolvers, along with hands-on experience in building and maintaining data warehouses.
  • Strong programming skills in Python or JavaScript, with the ability to write efficient, maintainable code.
  • Familiarity with distributed systems, data partitioning, and consistency models.
  • Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and debugging production bottlenecks.
  • Deep understanding of cloud infrastructure (preferably AWS), including networking, IAM, and cost optimization.
  • Prior experience building multi-tenant systems with strict performance and isolation guarantees.
  • Excellent communication and collaboration skills to influence cross-functional technical decisions.

Financial Services Company


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Delhi
3 - 6 yrs
₹10L - ₹25L / yr
Project Management
SQL
JIRA
SQL Query Analyzer
Confluence
+23 more

Required Skills: Excellent Communication Skills, Project Management, SQL queries, Expertise with Tools such as Jira, Confluence etc.


Criteria:

  • Candidate must have Project management experience.
  • Candidate must have strong experience in accounting principles, financial workflows, and R2R (Record to Report) processes.
  • Candidate should have an academic background in Commerce or MBA Finance.
  • Candidates must be from a Fintech / Financial Services company only.
  • Good experience with SQL and must have MIS experience.
  • Must have experience in Treasury Module.
  • 3+ years of implementation experience is required.
  • Candidate should have Hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
  • Need candidate from Bangalore and Delhi/NCR ONLY.
  • Need Immediate joiner or candidate with up to 30 Days’ Notice period.

 

Description

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy company solutions for R2R, treasury, and reporting workflows.
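The responsibilities above can be illustrated with a hedged sketch of one such complex validation query, using Python's built-in sqlite3 module: a LEFT JOIN that reconciles a ledger against bank records. All table names, transaction IDs, and amounts are invented.

```python
import sqlite3

# Illustrative reconciliation check of the kind an implementation lead might
# run during data validation; every name and value here is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ledger (txn_id TEXT, amount REAL);
    CREATE TABLE bank   (txn_id TEXT, amount REAL);
    INSERT INTO ledger VALUES ('T1', 100.0), ('T2', 250.0), ('T3', 75.0);
    INSERT INTO bank   VALUES ('T1', 100.0), ('T2', 240.0);
""")

# LEFT JOIN keeps every ledger row, so both amount mismatches and entries
# missing from the bank feed surface in one query.
unreconciled = conn.execute("""
    SELECT l.txn_id, l.amount, b.amount
    FROM ledger l
    LEFT JOIN bank b ON b.txn_id = l.txn_id
    WHERE b.txn_id IS NULL OR b.amount <> l.amount
    ORDER BY l.txn_id
""").fetchall()
print(unreconciled)
```

The same shape of query, scaled up, underpins R2R reconciliation reports: matched rows drop out, and only exceptions reach the close checklist.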


Required Qualifications

  • Bachelor’s degree with a Commerce background / MBA Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success


Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications



Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

 

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

 

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture
Wissen Technology

at Wissen Technology

4 recruiters
Posted by Swet Patel
Bengaluru (Bangalore)
5 - 13 yrs
Best in industry
Databricks
Python
SQL
PySpark
Spark

Key Responsibilities

We are seeking an experienced Data Engineer with a strong background in Databricks, Python, Spark/PySpark and SQL to design, develop, and optimize large-scale data processing applications. The ideal candidate will build scalable, high-performance data engineering solutions and ensure seamless data flow across cloud and on-premise platforms.

Key Responsibilities:

  • Design, develop, and maintain scalable data processing applications using Databricks, Python, and PySpark/Spark.
  • Write and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Collaborate with data engineers, data scientists, and other stakeholders to understand business requirements and deliver high-quality solutions.
  • Ensure data integrity, performance, and reliability across all data processing pipelines.
  • Perform data analysis and implement data validation to ensure high data quality.
  • Implement and manage CI/CD pipelines for automated testing, integration, and deployment.
  • Contribute to continuous improvement of data engineering processes and tools.

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Databricks developer, with strong expertise in Python, SQL, and Spark/PySpark.
  • Strong proficiency in SQL, including working with relational databases and writing optimized queries.
  • Solid programming experience in Python, including data processing and automation.


Financial Services


Agency job
via Jobdost by Saida Pathan
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 6 yrs
₹20L - ₹25L / yr
Project Management
SQL
JIRA
Confluence

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy Bluecopa solutions for R2R, treasury, and reporting workflows.


Required Qualifications

  • Bachelor’s degree with a Commerce background / MBA Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success

Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications


Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture


Capace Software Private Limited
Bhopal, Bengaluru (Bangalore)
7 - 13 yrs
₹9L - ₹12L / yr
Android
Android Development
Frontend
Backend Testing
Fintech
+16 more

Job Description -Technical Project Manager

Job Title: Technical Project Manager

Location: Bhopal / Bangalore (On-site)

Experience Required: 7+ Years

Industry: Fintech / SaaS / Software Development

Role Overview

We are looking for a Technical Project Manager (TPM) who can bridge the gap between management and developers. The TPM will manage Android, Frontend, and Backend teams, ensure smooth development processes, track progress, evaluate output quality, resolve technical issues, and deliver timely reports.

Key Responsibilities

Project & Team Management

  • Manage daily tasks for Android, Frontend, and Backend developers
  • Conduct daily stand-ups, weekly planning, and reviews
  • Track progress, identify blockers, and ensure timely delivery
  • Maintain sprint boards, task estimations, and timelines

Technical Requirement Translation

  • Convert business requirements into technical tasks
  • Communicate requirements clearly to developers
  • Create user stories, flow diagrams, and PRDs
  • Ensure requirements are understood and implemented correctly

Quality & Build Review

  • Validate build quality, UI/UX flow, functionality
  • Check API integrations, errors, performance issues
  • Ensure coding practices and architecture guidelines are followed
  • Perform preliminary QA before handover to testing or clients

Issue Resolution

  • Identify development issues early
  • Coordinate with developers to fix bugs
  • Escalate major issues to founders with clear insights

Reporting & Documentation

  • Daily/weekly reports to management
  • Sprint documentation, release notes
  • Maintain project documentation & version control processes

Cross-Team Communication

  • Act as the single point of contact for management
  • Align multiple tech teams with business goals
  • Coordinate with HR and operations for resource planning

Required Skills

  • Strong understanding of Android, Web (Frontend/React), Backend development flows
  • Knowledge of APIs, Git, CI/CD, basic testing
  • Experience with Agile/Scrum methodologies
  • Ability to review builds and suggest improvements
  • Strong documentation skills (Jira, Notion, Trello, Asana)
  • Excellent communication & leadership
  • Ability to handle pressure and multiple projects

Good to Have

  • Prior experience in Fintech projects
  • Basic knowledge of UI/UX
  • Experience in preparing FSD/BRD/PRD
  • QA experience or understanding of test cases

Salary Range: 9 to 12 LPA

Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
skill iconData Analytics
SQL
Relational Database (RDBMS)
skill iconJava
skill iconPython
+1 more

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have strong SQL skills (queries, optimization, procedures, triggers)

· Must have advanced Excel skills

· Should have 3+ years of relevant experience

· Should have reporting and dashboard creation experience

· Should have database development & maintenance experience

· Must have strong communication skills for client interactions

· Should be able to work independently

· Willingness to work from client locations

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance through SQL query tuning and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency
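To picture the reporting work described above, here is a small, purely illustrative sketch: the `sales` table and its columns are hypothetical, and SQLite stands in for whatever RDBMS a given client runs. It shows the kind of window-function query such dashboard and reporting tasks involve.

```python
import sqlite3

# Hypothetical reporting table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month INTEGER, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("South", 1, 100), ("South", 2, 150), ("North", 1, 80)],
)

# Running total per region -- a typical window-function reporting query.
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()
for row in rows:
    print(row)
```

In a production report the same `OVER (PARTITION BY ... ORDER BY ...)` pattern would typically live inside a stored procedure or a scheduled extract rather than an ad-hoc script.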


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations

 

Read more
Intellipro
Arthy R
Posted by Arthy R
Bengaluru (Bangalore), Chennai
3 - 7 yrs
₹10L - ₹18L / yr
Delphi
SQL

📢 Hiring: Delphi Developer – 6 Months Contract

Locations: Chennai & Bangalore | Immediate Joiners | Service-Based Project

We are hiring experienced Delphi Developers for a 6-month contractual role with a reputed service-based IT organization. Candidates with strong Delphi expertise who can contribute independently in a fast-paced environment are encouraged to apply.


🔧 Key Highlights

3–7 years of experience in software development

Strong hands-on experience in Delphi

Proficiency in SQL, ADO, and understanding of OOP, data structures, and design patterns

Exposure to JavaScript frameworks (Knockout/Angular) and modern UI concepts

Good communication, analytical, and problem-solving skills

Ability to work independently and multitask effectively

Preferred: Experience in Payments, Retail, EMV, C-Store, or Logistics domains


📍 Locations: Chennai & Bangalore

⏳ Contract Duration: 6 Months

🚀 Start Date: Immediate


Read more
Ladera Technology
Bengaluru (Bangalore)
7 - 11 yrs
₹10L - ₹22L / yr
skill iconJava
skill iconSpring Boot
Spring Security
APM
AWS Lambda
+9 more

Job Title: Software Developer (7-10 Years Experience)


Job Summary: We are seeking an experienced Software Developer with 7-10 years of hands-on development expertise in designing, building, and maintaining enterprise-level applications and scalable APIs.


Key Responsibilities:

• Design, develop, and maintain microservices based applications using the Spring framework.

• Build secure, scalable REST and SOAP web services.

• Implement API security protocols including OAuth, JWT, SSL/TLS, X.509 certificates, SAML, and mTLS.

• Develop and deploy applications by leveraging AWS services such as EC2, Lambda, API Gateway, SQS, S3, SNS.

• Work with Azure cloud services and OpenShift for deployment and orchestration.

• Integrate JMS/messaging systems and work with middleware technologies such as MQ.

• Utilize SQL and NoSQL databases, including MySQL, PostgreSQL, and DynamoDB.

• Work with Netflix Conductor or Zuul for orchestration and routing.

• Collaborate with cross-functional teams to deliver robust solutions in an Agile setup.


Required Skills:

• Strong Java OOP fundamentals.

• Strong proficiency in Spring Framework (Spring Boot, Spring Cloud, Spring Security).

• Solid experience in microservices architecture.

• Hands-on experience with the AWS cloud and OpenShift ecosystem.

• Familiarity with Azure services.

• Strong understanding of API security mechanisms.

• Expertise in building RESTful APIs.

• Experience working with SQL/NoSQL databases.

• Should have worked on integration with AppDynamics or similar APM tools

• Strong analytical and problem-solving skills.

Good to have skills:

• SOAP web services and GraphQL

• Experience with JMS, messaging middleware, and MQ.


Qualifications:

• Bachelor's or Master's degree in Computer Science or a related field.

• 7-10 years of experience in backend or full-stack development roles.

Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Kochi (Cochin), Mumbai, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹17L / yr
Project Management
skill iconData Analytics
Program Management
SQL
Client Management
+7 more

Required Skills: Project Management, Data Analysis, SQL queries, Client Engagement

 

Criteria:

  • Must have 3+ years of project/program management experience in Financial Services/Banking/NBFC/Fintech companies only.
  • Hands-on proficiency in data analysis and SQL querying, with ability to work on large datasets
  • Ability to lead end-to-end implementation projects and manage cross-functional teams effectively.
  • Experience in process analysis, optimization, and mapping for operational efficiency.
  • Strong client-facing communication and stakeholder management capabilities.
  • Good expertise in financial operations processes and workflows with proven implementation experience.

 

Description

Position Overview:

We are seeking a dynamic and experienced Technical Program Manager to join our team. The successful candidate will be responsible for managing the implementation of the company's solutions at existing and new clients. This role requires a deep understanding of financial operations processes, exceptional problem-solving skills, and the ability to analyze large volumes of data. The Technical Program Manager will drive process excellence and ensure outstanding customer satisfaction throughout the implementation lifecycle and beyond.

 

Key Responsibilities:

● Client Engagement: Serve as the primary point of contact for assigned clients, understanding their unique operation processes and requirements. Build and maintain strong relationships to facilitate successful implementations.

● Project Management: Lead the end-to-end implementation of company’s solutions, ensuring projects are delivered on time, within scope, and within budget. Coordinate with cross-functional teams to align resources and objectives.

● Process Analysis and Improvement: Evaluate clients' existing operation workflows, identify inefficiencies, and recommend optimized processes leveraging company’s platform. Utilize process mapping and data analysis to drive continuous improvement.

● Data Analysis: Analyze substantial datasets to ensure accurate configuration and integration of company’s solutions. Employ statistical tools and SQL-based queries to interpret data and provide actionable insights.

● Problem Solving: Break down complex problems into manageable components, developing effective solutions in collaboration with clients and internal teams.

● Process Excellence: Advocate for and implement best practices in process management, utilizing methodologies such as Lean Six Sigma to enhance operational efficiency.

● Customer Excellence: Ensure a superior customer experience by proactively addressing client needs, providing training and support, and promptly resolving any issues that arise.

 

Qualifications:

● Minimum of 3 years of experience in project management, preferably in financial services, software implementation, consulting or analytics.

● Strong analytical skills with experience in data analysis, SQL querying, and handling large datasets.

● Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.

● Demonstrated ability to lead cross-functional teams and manage multiple projects concurrently.

● Proven expertise in financial operation processes and related software solutions is a plus

● Proficiency in developing business intelligence solutions or with low-code tools is a plus

 

Why Join company?

● Opportunity to work with a cutting-edge financial technology company.

● Collaborative and innovative work environment.

● Competitive compensation and benefits package.

● Professional development and growth opportunities.

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
skill iconJava
skill iconSpring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience: 4+ Years

Required Skills -

  1. 3+ years of Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹40L / yr
Oracle
Oracle Data Integrator
Oracle ERP
Implementation
Process automation
+30 more

Review Criteria

  • Strong Oracle Integration Cloud (OIC) Implementation profile
  • 5+ years in enterprise integration / middleware roles, with minimum 3+ years of hands-on Oracle Integration Cloud (OIC) implementation experience
  • Strong experience designing and delivering integrations using OIC Integrations, Adapters (File, FTP, DB, SOAP/REST, Oracle ERP), Orchestrations, Mappings, Process Automation, Visual Builder (VBCS), and OIC Insight/Monitoring
  • Proven experience building integrations across Oracle Fusion/ERP/HCM, Salesforce, on-prem systems (AS/400, JDE), APIs, file feeds (FBDI/HDL), databases, and third-party SaaS.
  • Strong expertise in REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema, and web-service–based integrations
  • Good working knowledge of OCI components (API Gateway, Vault, Autonomous DB) and hybrid integration patterns
  • Strong SQL & PL/SQL skills for debugging, data manipulation, and integration troubleshooting
  • Hands-on experience owning end-to-end integration delivery including architecture reviews, deployments, versioning, CI/CD of OIC artifacts, automated testing, environment migrations (Dev→Test→Prod), integration governance, reusable patterns, error-handling frameworks, and observability using OIC/OCI monitoring & logging tools
  • Experience providing technical leadership, reviewing integration designs/code, and mentoring integration developers; must be comfortable driving RCA, performance tuning, and production issue resolution
  • Strong stakeholder management, communication (written + verbal), problem-solving, and ability to collaborate with business/product/architect teams

 

Preferred

  • Preferred (Certification) – Oracle OIC or Oracle Cloud certification
  • Preferred (Domain Exposure) – Experience with Oracle Fusion functional modules (Finance, SCM, HCM), business events/REST APIs, SOA/OSB background, or multi-tenant/API-governed integration environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience you have with Oracle Integration Cloud (OIC)?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

Company is seeking an experienced OIC Lead to own the design, development, and deployment of enterprise integrations. The ideal candidate will have at least 6 years of prior experience across various integration technologies, with strong experience implementing OIC integration capabilities. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.

 

Responsibilities:

  • Lead the design and delivery of integration solutions using Oracle Integration Cloud (Integration, Process Automation, Visual Builder, Insight) and related Oracle PaaS components.
  • Build and maintain integrations between Oracle Fusion/ERP/HCM, Salesforce, on-prem applications (e.g., AS/400, JDE), APIs, file feeds (FBDI/HDL), databases and third-party SaaS.
  • Own end-to-end integration delivery - from architecture/design reviews through deployment, monitoring, and post-production support.
  • Create reusable integration patterns, error-handling frameworks, security patterns (OAuth2, client credentials), and governance for APIs and integrations.
  • Own CI/CD, versioning and migration of OIC artifacts across environments (Dev → Test → Prod); implement automated tests and promotion pipelines.
  • Define integration architecture standards and reference patterns for hybrid (cloud/on-prem) deployments.
  • Ensure security, scalability, and fault tolerance are built into all integration designs.
  • Drive performance tuning, monitoring and incident response for integrations; implement observability using OIC/OCI monitoring and logging tools.
  • Provide technical leadership and mentorship to a team of integration developers; review designs and code; run hands-on troubleshooting and production support rotations.
  • Work with business stakeholders, product owners and solution architects to translate requirements into integration designs, data mappings and runbooks

 

Ideal Candidate

  • 5+ years in integration/enterprise middleware roles with at least 3+ years hands-on OIC (Oracle Integration Cloud) implementations.
  • Strong experience with OIC components: Integrations, Adapters (File, FTP, Database, SOAP, REST, Oracle ERP), Orchestrations/Maps, OIC Insight/Monitoring, Visual Builder (VBCS) or similar
  • Expert in web services and message formats: REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema
  • Good knowledge of Oracle Cloud stack / OCI (API Gateway, Vault, Autonomous DB) and on-prem integration patterns
  • SQL & PL/SQL skills for data manipulation and troubleshooting; exposure to FBDI/HDL (for bulk loads) is desirable
  • Strong problem-solving, stakeholder management, written/verbal communication and team mentoring experience

 

Nice-to-have / Preferred:

  • Oracle OIC certification(s) or Oracle Cloud certifications
  • Exposure to OCI services (API Gateway, Vault, Monitoring) and Autonomous Database
  • Experience with Oracle Fusion functional areas (Finance, Supply Chain, HCM) and business events/REST APIs preferred.
  • Background with SOA Suite/Oracle Service Bus (useful if migrating legacy SOA to OIC)
  • Experience designing multi-tenant integrations, rate limiting/throttling and API monetization strategies.


Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS
+21 more

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate

  • Bachelor’s or master’s in computer science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
Read more
Banking Industry

Banking Industry

Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
skill iconData Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?


As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance through SQL query tuning and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?


For the SQL/Oracle Developer role, we are seeking candidates with the following skills and expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations


Read more
ONEPOS RETAIL SOLUTIONS PVT LTD
Bengaluru (Bangalore)
8 - 12 yrs
₹15L - ₹18L / yr
POS
Payment gateways
Selenium
JIRA
API
+6 more

Role Overview


The Automation Lead for the Point of Sale (POS) business is responsible for driving end-to-end automation strategy, framework development, and quality governance across POS applications, devices, and integrations. This role ensures high-quality releases by designing scalable automation solutions tailored to payment systems, in-store hardware, peripherals, and complex retail workflows.


You will lead a team of automation engineers, collaborate closely with product, development, and operations teams, and play a key role in accelerating delivery through optimized test coverage and robust automation pipelines.


Key Responsibilities


1. Automation Strategy & Leadership

•          Define and own the automation roadmap for POS systems (frontend UI, backend services, device interactions).

•          Lead, mentor, and upskill a team of automation engineers.

•          Establish automation KPIs (coverage, stability, execution time) and ensure continuous improvement.

•          Identify opportunities to improve automation maturity across the POS ecosystem.


2. Framework Architecture & Development

•          Design and build scalable, reusable automation frameworks for web, mobile (iOS & Android), and device-level POS testing.

•          Integrate automation with CI/CD pipelines (Jenkins, GitHub Actions, Azure DevOps, etc.).

•          Implement best practices in coding standards, version control, and documentation.

•          Ensure automation solutions support multi-platform POS devices (payment terminals, printers, scanners, cash drawers, tablets).


3. Functional & Non-Functional Test Automation

•          Automate regression, smoke, and integration test suites for POS workflows (transactions, refunds, offline mode, sync, etc.).

•          Collaborate with performance and security teams to enable load, stress, and penetration testing automation.

•          Drive automation for API, UI, database, and hardware integration layers.


4. Quality Governance & Cross-Functional Collaboration

•          Work closely with product owners, business analysts, and developers to understand POS requirements.

•          Define test strategy, test plans, and automation coverage for each release.

•          Advocate for early testing, shift-left practices, and robust quality gates.

•          Manage defect triage and root cause analysis for automation-related issues.


5. POS Hardware & Integration Expertise

•          Ensure validation of POS peripherals (MSR, NCR, Verifone, barcode scanners, EMV payment terminals, printers).

•          Support automation for cloud-hosted and on-prem POS systems.

•          Collaborate with vendors on device certifications and compliance (PCI, EMV, L3, etc.).


Required Skills & Experience


Technical Skills

•          Strong experience in automation tools/frameworks:

•          Selenium, Appium, Playwright, Cypress, TestNG, JUnit, or similar

•          REST API automation (Postman/Newman, RestAssured, Karate, Swagger, etc.)

•          Python/Java/JavaScript/C# for automation scripting

•          Experience in retail/POS/fintech/payment systems.

•          Experience with CI/CD tools and version control (Git).

•          Knowledge of POS hardware and device interaction automation.

•          Good understanding of microservices architecture and system integrations.

•          Experience working with SQL for data validation and backend testing.

•          Experience with bug-tracking tools such as JIRA and Azure DevOps.


Leadership & Soft Skills

•          8–12 years of overall experience, with at least 1–2 years in a lead or senior automation role.

•          Ability to lead distributed teams.

•          Strong problem-solving, debugging, and analytical skills.

•          Excellent communication and stakeholder management.

•          Ability to work in a fast-paced, release-driven retail environment.

Preferred Qualifications

•          Experience in cloud-based POS platforms (AWS/Azure/GCP).

•          Exposure to payment certification testing (EMV L2/L3, PCI).

•          Knowledge of performance testing tools (JMeter, k6).

•          Experience with containerization (Docker, Kubernetes).

•          ISTQB, CSTE or other QA/Automation certifications.


What You Will Drive

•          Faster releases through automation-first delivery.

•          Improved POS reliability across devices and store environments.

•          Highly stable regression suites enabling continuous deployment.

•          A culture of quality across the POS engineering organization.


Why Join Us?

  • Work on industry-leading POS and payment systems.
  • Collaborative, inclusive, and innovative team culture.
  • Competitive compensation and benefits package.
  • Opportunities for growth and learning in a dynamic environment.


Read more
Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Bengaluru (Bangalore), Hyderabad, Mumbai, Gurugram
5 - 10 yrs
₹10L - ₹40L / yr
skill iconPython
SQL
Google Cloud Platform (GCP)
Dataform

Responsibilities:

  • Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow)
  • Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
  • Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
  • Implement SQL-based transformations using Dataform (or dbt)
  • Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
  • Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
  • Partner with solution architects and product teams to translate data requirements into technical designs
  • Mentor junior data engineers and support knowledge-sharing across the team
  • Contribute to documentation, code reviews, sprint planning, and agile ceremonies



Requirements


  • 5+ years of hands-on experience in data engineering, with at least 2 years on GCP
  • Proven expertise in BigQuery, Dataflow (Apache Beam), and Cloud Composer (Airflow)
  • Strong programming skills in Python and/or Java
  • Experience with SQL optimization, data modeling, and pipeline orchestration
  • Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
  • Exposure to Dataform, dbt, or similar tools for ELT workflows
  • Solid understanding of data architecture, schema design, and performance tuning
  • Excellent problem-solving and collaboration skills

Bonus Skills:

  • GCP Professional Data Engineer certification
  • Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
  • Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
  • Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)


Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune, Gurugram, Bhopal, Jaipur, Bengaluru (Bangalore)
2 - 4 yrs
₹5L - ₹12L / yr
Windows Azure
SQL
Data Structures
databricks

Hiring: Azure Data Engineer

⭐ Experience: 2+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

Passport: Mandatory & Valid

(Only immediate joiners & candidates serving notice period)


Mandatory Skills:

Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.


Responsibilities:

  • Build and maintain data pipelines using ADF, Databricks, and Synapse.
  • Develop ETL/ELT workflows and optimize SQL queries.
  • Implement Delta Lake for scalable lakehouse architecture.
  • Create Synapse data models and Spark/Databricks notebooks.
  • Ensure data quality, performance, and security.
  • Collaborate with cross-functional teams on data requirements.


Nice to Have:

Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).


Read more
lulu international

lulu international

Agency job
via Episeio Business Solutions by Praveen Saulam
Bengaluru (Bangalore)
2.5 - 3 yrs
₹7L - ₹9L / yr
SQL
PySpark
databricks
Hypothesis testing
ANOVA gauge R&R

Role Overview

As a Lead Data Scientist / Data Analyst, you’ll combine analytical thinking, business acumen, and technical expertise to design and deliver impactful data-driven solutions. You’ll lead analytical problem-solving for retail clients — from data exploration and visualisation to predictive modelling and actionable business insights.

 

Key Responsibilities

  • Partner with business stakeholders to understand problems and translate them into analytical solutions.
  • Lead end-to-end analytics projects — from hypothesis framing and data wrangling to insight delivery and model implementation.
  • Drive exploratory data analysis (EDA), identify patterns/trends, and derive meaningful business stories from data.
  • Design and implement statistical and machine learning models (e.g., segmentation, propensity, CLTV, price/promo optimisation).
  • Build and automate dashboards, KPI frameworks, and reports for ongoing business monitoring.
  • Collaborate with data engineering and product teams to deploy solutions in production environments.
  • Present complex analyses in a clear, business-oriented way, influencing decision-making across retail categories.
  • Promote an agile, experiment-driven approach to analytics delivery.

 

Common Use Cases You’ll Work On

  • Customer segmentation (RFM, mission-based, behavioural)
  • Price and promo effectiveness
  • Assortment and space optimisation
  • CLTV and churn prediction
  • Store performance analytics and benchmarking
  • Campaign measurement and targeting
  • In-depth category reviews and presentations to the L1 leadership team
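As one illustration, the RFM (Recency, Frequency, Monetary) segmentation use case above can be sketched in plain Python; the scoring thresholds and transaction data here are invented for the example, and a production version would typically run in SQL or PySpark over the full transaction log:

```python
# Toy RFM segmentation: each customer gets a 1-3 score per dimension.
# Transaction data and score thresholds are illustrative assumptions.
from datetime import date

# Hypothetical transaction log: (customer_id, order_date, amount)
transactions = [
    ("c1", date(2024, 6, 1), 50.0),
    ("c1", date(2024, 6, 20), 70.0),
    ("c2", date(2024, 1, 5), 500.0),
    ("c3", date(2024, 6, 25), 20.0),
]
today = date(2024, 7, 1)

# Aggregate last purchase date, order count, and total spend per customer
agg = {}
for cid, d, amt in transactions:
    last, freq, mon = agg.get(cid, (date.min, 0, 0.0))
    agg[cid] = (max(last, d), freq + 1, mon + amt)

def rfm_score(cid):
    last, freq, mon = agg[cid]
    recency_days = (today - last).days
    r = 3 if recency_days <= 14 else (2 if recency_days <= 90 else 1)
    f = 3 if freq >= 3 else (2 if freq == 2 else 1)
    m = 3 if mon >= 300 else (2 if mon >= 100 else 1)
    return r, f, m

segments = {cid: rfm_score(cid) for cid in agg}
print(segments["c1"])  # (3, 2, 2): recent, repeat, mid-value customer
```

The resulting (R, F, M) triples are what downstream campaign targeting or churn models would consume.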

 

Required Skills and Experience

  • 3+ years of experience in data science, analytics, or consulting (preferably in the retail domain)
  • Proven ability to connect business questions to analytical solutions and communicate insights effectively
  • Strong SQL skills for data manipulation and querying large datasets
  • Advanced Python for statistical analysis, machine learning, and data processing
  • Intermediate PySpark / Databricks skills for working with big data
  • Comfortable with data visualisation tools (Power BI, Tableau, or similar)
  • Knowledge of statistical techniques (Hypothesis testing, ANOVA, regression, A/B testing, etc.)
  • Familiarity with agile project management tools (JIRA, Trello, etc.)
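For instance, the hypothesis-testing and A/B-testing skills listed above often reduce to a two-sample comparison; below is a minimal Welch's t-statistic computed with the stdlib only (the sample data is invented, and in practice one would use `scipy.stats.ttest_ind(equal_var=False)` and inspect the p-value):

```python
# Welch's t-statistic for two independent samples, e.g. an A/B test of
# average basket value. Sample data is illustrative.
import math
from statistics import mean, variance

control = [10.2, 9.8, 11.0, 10.5, 9.9, 10.1]
variant = [11.1, 11.4, 10.9, 11.6, 11.2, 11.0]

def welch_t(a, b):
    # Standard-error terms use the sample variance of each group
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(b) - mean(a)) / math.sqrt(va + vb)

t = welch_t(control, variant)
print(round(t, 2))  # 4.54 - large |t| suggests the variant lift is unlikely to be noise
```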

 

Good to Have

  • Experience designing data pipelines or analytical workflows in cloud environments (Azure preferred)
  • Strong understanding of retail KPIs (sales, margin, penetration, conversion, ATV, UPT, etc.)
  • Prior exposure to Promotion or Pricing analytics 
  • Dashboard development or reporting automation expertise


Hyderabad, Bengaluru (Bangalore)
5 - 12 yrs
₹25L - ₹35L / yr
C#
SQL
Amazon Web Services (AWS)
.NET
Java
+3 more

Senior Software Engineer

Location: Hyderabad, India


Who We Are:

Since our inception back in 2006, Navitas has grown to be an industry leader in the digital transformation space, and we’ve served as trusted advisors supporting our client base within the commercial, federal, and state and local markets.


What We Do:

At our very core, we're a group of problem solvers providing award-winning technology solutions to drive digital acceleration for our customers. With proven solutions and a team of expert problem solvers, Navitas has consistently empowered customers to use technology as a competitive advantage and deliver cutting-edge transformative solutions.


What You’ll Do:

Build, Innovate, and Own:

  • Design, develop, and maintain high-performance microservices in a modern .NET/C# environment.
  • Architect and optimize data pipelines and storage solutions that power our AI-driven products.
  • Collaborate closely with AI and data teams to bring machine learning models into production systems.
  • Build integrations with external services and APIs to enable scalable, interoperable solutions.
  • Ensure robust security, scalability, and observability across distributed systems.
  • Stay ahead of the curve — evaluating emerging technologies and contributing to architectural decisions for our next-gen platform.

Responsibilities will include but are not limited to:

  • Provide technical guidance and code reviews that raise the bar for quality and performance.
  • Help create a growth-minded engineering culture that encourages experimentation, learning, and accountability.

What You’ll Need:

  • Bachelor’s degree in Computer Science or equivalent practical experience.
  • 8+ years of professional experience, including 5+ years designing and maintaining scalable backend systems using C#/.NET and microservices architecture.
  • Strong experience with SQL and NoSQL data stores.
  • Solid hands-on knowledge of cloud platforms (AWS, GCP, or Azure).
  • Proven ability to design for performance, reliability, and security in data-intensive systems.
  • Excellent communication skills and ability to work effectively in a global, cross-functional environment.

Set Yourself Apart With:

  • Startup experience, specifically building a product from 0 to 1.
  • Exposure to AI/ML-powered systems, data engineering, or large-scale data processing.
  • Experience in healthcare or fintech domains.
  • Familiarity with modern DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes).

Equal Employer/Veterans/Disabled

Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.

Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
