
6+ JDBC Jobs in Mumbai | JDBC Job openings in Mumbai

Apply to 6+ JDBC Jobs in Mumbai on CutShort.io. Explore the latest JDBC Job opportunities across top companies like Google, Amazon & Adobe.

AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data engineering
Data architecture
SQL
Data modeling
GCS
+47 more

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
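The JDBC integration mentioned in the responsibilities above can be sketched in a few lines. This is a minimal illustration, not the employer's actual setup: the host, port, credentials, and dataset names are placeholders, and the commented-out query requires the Dremio JDBC driver on the classpath plus a running Dremio instance.

```java
// Minimal sketch of connecting to Dremio over JDBC.
// Host, port, credentials, and dataset names are placeholders.
public class DremioJdbcSketch {
    // Dremio's direct-connect JDBC URL format; 31010 is Dremio's default client port.
    static String buildUrl(String host, int port) {
        return "jdbc:dremio:direct=" + host + ":" + port;
    }

    public static void main(String[] args) {
        String url = buildUrl("localhost", 31010);
        System.out.println(url);
        // With the Dremio JDBC driver on the classpath, querying a
        // (possibly reflection-accelerated) dataset would look like:
        // try (Connection conn = DriverManager.getConnection(url, "user", "pass");
        //      Statement st = conn.createStatement();
        //      ResultSet rs = st.executeQuery("SELECT * FROM space.dataset LIMIT 10")) {
        //     while (rs.next()) { /* consume rows */ }
        // }
    }
}
```

Building the URL is side-effect free, which keeps the sketch runnable without a live cluster.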


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS
+21 more

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • The virtual interview requires video to be on; are you okay with that?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
A global provider of Business Process Management services

Agency job
via Jobdost by Saida Pathan
Pune, Mumbai, Bengaluru (Bangalore), Chennai, Gurugram, Nashik
5 - 10 yrs
₹15L - ₹25L / yr
Appian
BPM
SQL
Sails.js
SQL Server Integration Services (SSIS)
+13 more
· Extensive experience in Appian BPM application development
· Knowledge of Appian architecture and its objects' best practices
· Participate in analysis, design, and new development of Appian-based applications
· Team leadership: provide technical leadership to Scrum teams
· Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
· Build applications: interfaces, process flows, expressions, data types, sites, integrations, etc.
· Proficient with SQL queries and with accessing data present in DB tables and views
· Experience in analysis, designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services, and web services
· Experience working with different Appian object types, query rules, constant rules, and expression rules
Qualifications
· At least 3 years of experience in implementing BPM solutions using Appian 19.x or higher
· Over 8 years in implementing IT solutions using BPM or integration technologies
· Certification mandatory: L1 and L2
· Experience in Scrum/Agile methodologies with enterprise-level application development projects
· Good understanding of database concepts and strong working knowledge of any one of the major databases, e.g., Oracle, SQL Server, MySQL
Additional information
Skills Required
· Appian BPM application development on version 19.x or higher
· Experience with integrations using web services, e.g., XML, REST, WSDL, SOAP API, JDBC, JMS
· Good leadership skills and the ability to lead a team of software engineers technically
· Experience working in Agile Scrum teams
· Good communication skills
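The web-services integration experience listed above can be sketched with the JDK's built-in HTTP client (Java 11+). This is only an illustration of the REST side of such integrations; the endpoint URL is a hypothetical placeholder, and the actual send (commented out) needs a live service.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Sketch of a REST integration call; the endpoint URL is hypothetical.
// Building the request is side-effect free, so this runs without a network.
public class RestIntegrationSketch {
    public static void main(String[] args) {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/records")) // placeholder endpoint
                .header("Accept", "application/json")
                .GET()
                .build();
        System.out.println(request.method() + " " + request.uri());
        // Sending it would look like:
        // HttpClient client = HttpClient.newHttpClient();
        // HttpResponse<String> resp =
        //         client.send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```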
A global provider of Business Process Management services

Agency job
via Jobdost by Saida Pathan
Pune, Mumbai, Bengaluru (Bangalore), Chennai
8 - 15 yrs
₹25L - ₹30L / yr
Appian
BPM
Agile/Scrum
Agile management
Oracle
+9 more
  • Position: Appian Tech Lead

Job Description:

  • Extensive experience in Appian BPM application development
  • Knowledge of Appian architecture and its objects' best practices
  • Participate in analysis, design, and new development of Appian-based applications
  • Mandatory: team leadership, providing technical leadership to Scrum teams
  • Certification mandatory: L1, L2, or L3
  • Must be able to multi-task, work in a fast-paced environment, and resolve problems faced by the team
  • Build applications: interfaces, process flows, expressions, data types, sites, integrations, etc.
  • Proficient with SQL queries and with accessing data present in DB tables and views
  • Experience in analysis, designing process models, Records, Reports, SAIL, forms, gateways, smart services, integration services, and web services
  • Experience working with different Appian object types, query rules, constant rules, and expression rules

 Qualifications:

  • At least 6 years of experience in implementing BPM solutions using Appian 19.x or higher
  • Over 8 years in implementing IT solutions using BPM or integration technologies
  • Experience in Scrum/Agile methodologies with enterprise-level application development projects
  • Good understanding of database concepts and strong working knowledge of any one of the major databases, e.g., Oracle, SQL Server, MySQL

Additional Information: Skills Required

  • Appian BPM application development on version 19.x or higher
  • Experience with integrations using web services, e.g., XML, REST, WSDL, SOAP API, JDBC, JMS
  • Good leadership skills and the ability to lead a team of software engineers technically
  • Experience working in Agile Scrum teams
  • Good communication skills

 

A leading company for Banks and Public Transport Operators

Agency job
via Tridat Technologies Pvt. Ltd. by Aasiya Waghoo
Mumbai
3 - 8 yrs
₹4L - ₹10L / yr
Java
Spring MVC
Spring Boot
Hibernate (Java)
JDBC
+2 more

Hello,

Greetings for the day!

Tridat Technologies is hiring a "Java Developer" for one of its clients based in Mumbai!

Experience: 3+yrs

Role: Java Developer

Desired Candidate Profile:

  - Engineering / MCA / Graduate

  - Good communication skills

  - Immediate joining preferable; max 15 days

 

Role Requirements:

  • 3+ years of experience developing software as an engineer.
  • Experience with developing, debugging, and shipping software products on large code bases that span platforms and tools.
  • Significant experience building and operating critical high-scale systems.
  • Good software engineering methodology: meaningful and deep-rooted opinions about testing and code quality, and the ability to make sound quality/speed trade-offs.
  • Good technical skills in Java, Spring MVC, Spring Boot, Hibernate, JDBC, JSP, Bootstrap, relational databases, JMS, ActiveMQ.
  • Lead from the front when the situation calls for it.

Skills:

  • Significant experience building and operating critical high-scale systems.
  • Architecture: knowledge of data structures and an eye for architecture. Can discuss the trade-offs between architectural choices, both on a theoretical level and on an applied level.
  • Strong coding/debugging abilities: should have advanced knowledge of Java, Spring MVC, Spring Boot, Hibernate, JDBC, JSP, Bootstrap, relational databases, JMS, ActiveMQ.
  • Fast learner: thrives on learning new technologies. Should be able to adapt easily to meet the needs of our massive growth and rapidly evolving business environment.
  • Understands requirements beyond the written word: whether working on an API used by other developers, an internal tool consumed by our operations teams, or a feature used by millions of customers, attention to detail is important.
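The JDBC skills listed above boil down to a small, idiomatic access pattern: try-with-resources so connections and statements are always closed, and a `PreparedStatement` so parameters are bound rather than concatenated into SQL. A minimal sketch, with a hypothetical table and column; the method is not invoked here because it needs a live `Connection`.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch of the idiomatic JDBC access pattern. The users/email
// schema is hypothetical.
public class JdbcPatternSketch {
    static final String QUERY = "SELECT email FROM users WHERE id = ?";

    static String findEmail(Connection conn, long id) throws SQLException {
        // try-with-resources closes the statement and result set even on error
        try (PreparedStatement ps = conn.prepareStatement(QUERY)) {
            ps.setLong(1, id); // bind the parameter instead of concatenating SQL
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(QUERY);
    }
}
```

The same shape applies whether the connection comes from `DriverManager` or a pooled `DataSource`.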

Employment Mode: Contract to hire

Location: Mumbai (pan-India candidates will do)

Joining Period: Immediate to 15 days
Future Group
3 recruiters
Posted by Kitty Basumatary
Bengaluru (Bangalore), Mumbai
3 - 5 yrs
₹12L - ₹18L / yr
Java
J2EE
Spring
Hibernate (Java)
Spring Boot
+26 more

Required Qualifications and Skills:

  • 3-5 years of work experience in a development background, with at least 2 years' experience in Java, Spring, Spring Boot, Hibernate or JPA, MySQL, Oracle, and Spring MVC.
  • B.E. degree in Computer Science, Graduate in Software Engineering or equivalent
  • Experience in Core JAVA, Spring, Spring Boot Frameworks.
  • Experience with ORMs like Hibernate.
  • Good knowledge of developing RESTful web services using Spring Boot, Java 1.x, Servlet 2.4, JSP 2.0, JDBC 3.0, JavaMail, Struts 2.x, HTML, HTML5, Angular 7+, JavaScript, JSF, Bootstrap 2.x-3.x, jQuery & CSS 3.x, Maven 3.x, Apache Tomcat 7.
  • Knowledge of Cloud AWS.
  • Experience in any Messaging Queue e.g. Apache Kafka, ActiveMQ, etc.
  • Experience with web services (REST and SOAP).
  • Experience working with toolsets like Eclipse IDE and SQL clients.
  • Experience using application servers like JBoss, Tomcat, WildFly, and GlassFish.
  • Experience using tools like SoapUI and Postman.
  • Ability to write SQL queries to fetch data.
  • Knowledge of microservices, Redis cache, and MongoDB (or any other NoSQL database) is good to have.
Read more