Data Archiving Jobs in Mumbai
Job Description – Dremio Lakehouse Architect
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture in the cloud (AWS/Azure, including the Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a minimal client-side sketch follows this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
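For illustration only, here is a minimal sketch of querying Dremio over Arrow Flight, one of the consumption paths mentioned above. The host, credentials, and the sales.orders dataset are placeholders invented for this example (32010 is Dremio's default Flight port), not details from this posting:

```python
# Minimal Dremio Arrow Flight client sketch. Host, credentials, and the
# dataset path below are placeholders/assumptions.
from pyarrow import flight

client = flight.FlightClient("grpc+tcp://dremio-coordinator:32010")

# Basic auth yields a bearer-token header to attach to subsequent calls.
bearer = client.authenticate_basic_token("analyst", "secret")
options = flight.FlightCallOptions(headers=[bearer])

# An aggregate query like this is the typical beneficiary of a Dremio
# aggregate reflection keyed on the GROUP BY column (reflections are
# defined separately, in Dremio's UI or its reflection DDL).
sql = "SELECT region, SUM(amount) AS total FROM sales.orders GROUP BY region"
info = client.get_flight_info(flight.FlightDescriptor.for_command(sql), options)

reader = client.do_get(info.endpoints[0].ticket, options)
table = reader.read_all()  # results arrive as an Arrow table
print(table.num_rows, table.column_names)
```

Arrow Flight is generally preferred over JDBC for large result sets because rows are transferred as columnar Arrow batches rather than row by row.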
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning; a small partitioned-Parquet sketch follows this list.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
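As a concrete illustration of the Parquet-on-object-storage skills above, this small PyArrow sketch writes a Hive-partitioned dataset and reads it back with a filter. A local directory stands in for an S3/ADLS prefix, and the columns are invented for the example:

```python
# Minimal sketch: write Hive-partitioned Parquet, then read with a filter.
import pyarrow as pa
import pyarrow.dataset as ds

table = pa.table({
    "region": ["EU", "EU", "US", "US"],
    "amount": [10.0, 20.0, 5.0, 7.5],
})

# Hive-style partitioning: one directory per region value (region=EU, ...).
ds.write_dataset(
    table, "sales_parquet", format="parquet",
    partitioning=ds.partitioning(
        pa.schema([("region", pa.string())]), flavor="hive"
    ),
)

# A reader that pushes the filter down only touches the region=EU
# directory, which is what partition pruning means to a planner.
dataset = ds.dataset("sales_parquet", format="parquet", partitioning="hive")
eu_rows = dataset.to_table(filter=ds.field("region") == "EU")
print(eu_rows.to_pydict())
```

Engines such as Dremio, Trino, or Athena prune directories that cannot match the filter at planning time, which is why partition layout matters as much as file format.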
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Job Description – Developer (ETL + Database)
- Develop, document, and support ETL mappings, database structures, and BI reports.
- Perform unit testing of their own development work.
- Participate in the UAT process and ensure quick resolution of any UAT issues.
- Manage different environments and take responsibility for proper deployment of code in all client environments.
- Prepare release documents.
- Prepare and maintain project documents as advised by Team Leads.
Skill-sets:
- 3+ years of hands-on experience with ETL tools (Pentaho Spoon, Talend) and MS SQL Server, Oracle, and Sybase databases.
- Ability to write complex SQL and database procedures.
- Good knowledge and understanding of data warehouse concepts, ETL concepts, ETL loading strategies, data archiving, data reconciliation, ETL error handling, etc. (a simple batched-archiving sketch follows this list).
- Strong problem-solving skills.
- Good communication skills, written and verbal.
- Self-motivated team player, action- and result-oriented.
- Ability to work successfully under tight project schedules.
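As a small illustration of the archiving and loading-strategy skills above, the sketch below moves aged rows into an archive table in batches on SQL Server. The DSN, table names, and cutoff are assumptions made for the example, not a prescribed process:

```python
# Minimal batched data-archiving sketch for SQL Server via pyodbc.
# The DSN "etl_target" and tables dbo.orders / dbo.orders_archive
# (with identical schemas) are hypothetical.
import pyodbc

BATCH_SIZE = 10_000   # small batches bound lock time and log growth
CUTOFF_DAYS = 365     # assumption: archive rows older than one year

conn = pyodbc.connect("DSN=etl_target")  # hypothetical ODBC DSN
cur = conn.cursor()

while True:
    # DELETE ... OUTPUT DELETED.* INTO moves each batch atomically:
    # the deleted rows land in the archive table in the same statement.
    cur.execute(
        f"""
        DELETE TOP ({BATCH_SIZE}) FROM dbo.orders
        OUTPUT DELETED.* INTO dbo.orders_archive
        WHERE order_date < DATEADD(day, ?, GETDATE())
        """,
        -CUTOFF_DAYS,
    )
    moved = cur.rowcount
    conn.commit()  # commit per batch; a failure loses at most one batch
    if moved == 0:
        break

conn.close()
```

Batching keeps lock duration and transaction-log growth bounded, and the single DELETE ... OUTPUT INTO statement moves each batch atomically, which simplifies reconciliation between source and archive.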

