Data security Jobs in Mumbai


Apply to 6+ Data security Jobs in Mumbai on CutShort.io. Explore the latest Data security Job opportunities across top companies like Google, Amazon & Adobe.

AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data engineering
Data architecture
SQL
Data modeling
GCS

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
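The SQL optimization and performance tuning skills listed above revolve around reading query plans. A minimal sketch using Python's built-in sqlite3 (standing in for a lakehouse engine such as Dremio, which is not shown here) illustrates the basic workflow of inspecting a plan before and after adding an index; the table and column names are invented for illustration:

```python
import sqlite3

# Toy analytical table; in a lakehouse this would be a curated dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "APAC" if i % 2 else "EMEA", i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE region = 'APAC'"

# Before indexing: the engine must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# After indexing the filter column, the plan switches to an index search.
conn.execute("CREATE INDEX idx_region ON orders(region)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

print(plan_before)  # e.g. "SCAN orders"
print(plan_after)   # e.g. "SEARCH orders USING INDEX idx_region (region=?)"
```

The same habit of comparing plans carries over to engines like Dremio, Presto, or Trino, where reflections or partitioning play the role the index plays here.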


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
Alliance Recruitment Agency
Posted by Raveena Korani
Mumbai, Navi Mumbai
2 - 5 yrs
₹2L - ₹5L / yr
Data security
Vendor Management
Technical support

1. IT Operations - 

• Manage and maintain computer systems, servers, networks, printers, CCTV, and internet connectivity

• Provide technical support to employees for hardware, software, and network-related issues

• Ensure uptime and performance of all IT infrastructure

2. System & Software Management -

• Install, configure, and maintain operating systems, ERP/CRM tools, HRMS, accounting software, and licensed applications

• Coordinate with software vendors for implementation, updates, and troubleshooting

• Maintain user access controls and system permissions

3. Data Security & Compliance -

• Ensure data backup, recovery, and cybersecurity best practices

• Monitor antivirus, firewall, and system security protocols

• Maintain confidentiality and protection of employee and customer data
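The backup-and-recovery duty above can be illustrated with a small Python sketch (file names and paths here are hypothetical) that copies a file and verifies the copy by checksum before it is trusted for recovery:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file so the backup copy can be verified bit-for-bit."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, dest_dir: Path) -> Path:
    """Copy source into dest_dir and verify the copy's checksum."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / source.name
    shutil.copy2(source, dest)
    if sha256(source) != sha256(dest):  # a real job would alert and retry here
        raise IOError(f"backup of {source} failed verification")
    return dest

# Demo with a temporary file standing in for real business data.
work = Path(tempfile.mkdtemp())
data = work / "payroll.csv"
data.write_text("emp_id,salary\n1,50000\n")
copy = backup(data, work / "backups")
print(copy.read_text() == data.read_text())  # True
```

In practice this logic lives inside scheduled backup tooling rather than ad-hoc scripts, but the verify-before-trust step is the part worth automating everywhere.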

4. IT Asset & Vendor Management -

• Maintain the IT asset inventory (laptops, desktops, routers, licenses, etc.)

• Coordinate with external IT service providers, ISPs, and hardware vendors

• Manage AMC agreements and service renewals

5. Process Improvement & Reporting -

• Support digitalization initiatives and automation projects

• Document IT SOPs, troubleshooting guides, and system workflows

• Prepare periodic IT reports for management

Key Performance Indicators (KPIs)

1) System uptime and response time

2) Resolution time for IT Issues

3) Data security compliance

4) User satisfaction feedback

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 17 yrs
₹30L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with at least 3 years of hands-on Dremio experience
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
A cyber technology company in Navi Mumbai

Agency job
Mumbai, Navi Mumbai, Kharghar, Panvel
2 - 9 yrs
₹4L - ₹21L / yr
ISO 27001
ISO 22301
GRC
Data security
Cyber Security
Risk Management

Job Description:

We are seeking a highly skilled and motivated GRC Consultant to play a pivotal role in delivering projects for the implementation of the Governance, Risk, and Compliance framework. The ideal candidate will take ownership of risk management and compliance monitoring, and contribute to strategic enhancements for clients.

Key Responsibilities:

● Take a lead role in the ongoing development and enhancement of the GRC framework.

● Drive the implementation of policies and procedures required by various information security, privacy, and data security frameworks.

● Implement frameworks such as ISO 27001 and ISO 22301, and achieve client certification.

Risk Management:

● Lead the identification, assessment, and management of risks across diverse business units.

● Conduct thorough risk assessments and provide strategic recommendations.

● Understand compliance requirements of laws and regulations concerning information security and privacy.

Training and Leadership:

● Conduct training and awareness sessions for end users and client SPOCs on information and cybersecurity requirements.

Qualifications:

● Bachelor’s degree in IT or a related field.

● Excellent communication and leadership abilities.

● Candidates with a cybersecurity background only.

● Minimum 2 years of experience in cybersecurity.

A global IT risk management (IRM) company

Agency job
via Selective Global Search by Khyati Dhall
Mumbai
5 - 7 yrs
₹5L - ₹11L / yr
DLP
Data loss prevention
Zscaler
Data security
  • Security Engineer (5 to 7 years of experience)
  • Experience in managing and administering Zscaler DLP
  • Primary responsibility is to manage the Data Loss Prevention solution: analyzing DLP logs, defining relevant policies, understanding data exfiltration techniques, etc.
  • Solution: Zscaler
  • Location: Mumbai
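Conceptually, a DLP policy is a set of content-matching rules applied to outbound data. A toy Python sketch (not Zscaler's actual policy engine, which is configured through its admin portal; rule names and patterns here are illustrative) shows the idea behind pattern-based detection:

```python
import re

# Two illustrative DLP-style rules: card-number-like digit runs and email
# addresses. Real policies also use dictionaries, fingerprints, and ML.
RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text: str) -> list[str]:
    """Return the names of the DLP rules the text triggers."""
    return [name for name, pattern in RULES.items() if pattern.search(text)]

print(scan("Card 4111 1111 1111 1111 sent to ops@example.com"))
# ['credit_card', 'email']
print(scan("quarterly roadmap attached"))
# []
```

Analyzing DLP logs is largely the inverse exercise: working out which rule fired, on what content, and whether the match was a true exfiltration attempt or a false positive to be tuned out.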
A global provider of Business Process Management services

Agency job
via Jobdost by Mamatha A
Bengaluru (Bangalore), Mumbai, Gurugram, Nashik, Pune, Visakhapatnam, Chennai, Noida
3 - 5 yrs
₹8L - ₹12L / yr
Oracle Analytics
OAS
OAC
Oracle OAS
Oracle
+8 more

Oracle OAS Developer

Senior OAS/OAC (Oracle Analytics) designer and developer with 3+ years of experience. Has worked on the new Oracle Analytics platform, used its latest features and custom plug-ins, and designed new ones using Java. Has a good understanding of the various graph data points and their usage for appropriate financial data display. Has worked on performance tuning and built complex data security requirements.

Qualifications



Bachelor’s degree in Engineering/Computer Science.

Additional information

Knowledge of financial and HR dashboards.

 
