3+ Dimensional modeling Jobs in India
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); see the sketch after this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
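
To make the storage-integration item above concrete, here is a minimal Python sketch that scans a curated Parquet dataset on S3 with PyArrow, pushing column projection and a row filter down into the scan much as a lakehouse query engine does before results ever reach a semantic layer. The bucket, prefix, region, and column names (analytics-lake/curated/sales, region, amount, order_date) are hypothetical placeholders, not anything from the posting.

```python
from datetime import date

import pyarrow.compute as pc
import pyarrow.dataset as ds
import pyarrow.fs as pafs

# Credentials are resolved from the environment (AWS_ACCESS_KEY_ID, etc.);
# the region and bucket/prefix below are hypothetical.
s3 = pafs.S3FileSystem(region="us-east-1")

# Treat the prefix as one logical dataset; the schema comes from Parquet footers.
sales = ds.dataset("analytics-lake/curated/sales", filesystem=s3, format="parquet")

# Push column projection and the row filter down into the Parquet scan,
# pruning row groups and columns instead of reading whole files.
table = sales.to_table(
    columns=["region", "amount"],
    filter=pc.field("order_date") >= date(2024, 1, 1),
)

# Aggregate over the projected columns.
print(table.group_by("region").aggregate([("amount", "sum")]).to_pandas())
```

The pushdown shown here is the same property that Dremio reflections and columnar formats exploit: the less data a scan materializes, the cheaper every downstream query becomes.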
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, dbt, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
Required skills: advanced SQL and data modeling (designing dimensional layers, 3NF structures, denormalized views, and semantic layers), plus expertise in GCP services.
Role & Responsibilities:
● Design and implement robust semantic layers for data systems on Google Cloud Platform (GCP)
● Develop and maintain complex data models, including dimensional models, 3NF structures, and denormalized views (see the sketch after this list)
● Write and optimize advanced SQL queries for data extraction, transformation, and analysis
● Utilize GCP services to create scalable and efficient data architectures
● Collaborate with cross-functional teams to translate business requirements (specified in mapping sheets or legacy DataStage jobs) into effective data models
● Implement and maintain data warehouses and data lakes on GCP
● Design and optimize ETL/ELT processes for large-scale data integration
● Ensure data quality, consistency, and integrity across all data models and semantic layers
● Develop and maintain documentation for data models, semantic layers, and data flows
● Participate in code reviews and implement best practices for data modeling and database design
● Optimize database performance and query execution on GCP
● Provide technical guidance and mentorship to junior team members
● Stay updated with the latest trends and advancements in data modeling, GCP services, and big data technologies
● Collaborate with data scientists and analysts to enable efficient data access and analysis
● Implement data governance and security measures within the semantic layer and data model
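
To illustrate the modeling duties above, here is a minimal sketch that publishes a denormalized semantic view over a star schema in BigQuery via the google-cloud-bigquery Python client. The project and credentials come from the environment, and the dataset and table names (sales_dw.fact_orders, sales_dw.dim_date, sales_dw.dim_customer) are hypothetical assumptions for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client()  # project and credentials come from the environment

# Join the fact table to its dimensions once, so BI tools and analysts
# query a single governed view instead of re-deriving the joins.
semantic_view_sql = """
CREATE OR REPLACE VIEW `sales_dw.vw_orders_semantic` AS
SELECT
  f.order_id,
  d.calendar_date,
  c.customer_name,
  c.customer_segment,
  f.order_amount
FROM `sales_dw.fact_orders` AS f
JOIN `sales_dw.dim_date` AS d ON f.date_key = d.date_key
JOIN `sales_dw.dim_customer` AS c ON f.customer_key = c.customer_key
"""

client.query(semantic_view_sql).result()  # blocks until the DDL completes
```

Keeping the joins in one governed view is a common way to implement a semantic layer: consumers get consistent names and grain, while the underlying dimensional tables stay normalized for loading.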
Responsibilities
- Design and implement Azure BI infrastructure and ensure the overall quality of the delivered solution (see the sketch after this list)
- Develop analytical & reporting tools, and promote and drive adoption of the developed BI solutions
- Actively participate in the BI community
- Establish and enforce technical standards and documentation
- Participate in daily scrums
- Record progress daily in assigned DevOps items
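
As a flavor of the quality-assurance item above, here is a minimal Python sketch that connects to an Azure SQL database with pyodbc and reports row counts per table from the system catalog, the kind of post-deployment sanity check a BI developer might script. The server, database, and login are hypothetical placeholders; in practice they would come from Azure Key Vault or pipeline variables rather than being hard-coded.

```python
import pyodbc

# Hypothetical server/database/login; never hard-code real credentials.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:mybiserver.database.windows.net,1433;"
    "Database=bi_dwh;"
    "Uid=bi_reader;Pwd=<from-key-vault>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Row counts per table from the catalog views (heap or clustered index).
    cursor.execute(
        "SELECT t.name, SUM(p.rows) AS row_count "
        "FROM sys.tables AS t "
        "JOIN sys.partitions AS p ON t.object_id = p.object_id "
        "WHERE p.index_id IN (0, 1) "
        "GROUP BY t.name ORDER BY t.name"
    )
    for table_name, row_count in cursor.fetchall():
        print(f"{table_name}: {row_count} rows")
```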
Ideal Candidates should have
- 5+ years of experience in a similar senior business intelligence development position
- To be successful in the role, you will need a high level of expertise across all facets of the Microsoft BI stack and prior experience designing and developing well-performing data warehouse solutions
- Demonstrated experience with development tools such as Azure SQL Database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps
- Experience with development methodologies including Agile, DevOps, and CI/CD patterns
- Strong oral and written communication skills in English
- Ability and willingness to learn quickly and continuously
- Bachelor's Degree in computer science


