8+ Snowflake Schema Jobs in Mumbai | Snowflake Schema Job Openings in Mumbai
Apply to 8+ Snowflake schema jobs in Mumbai on CutShort.io. Explore the latest Snowflake schema job opportunities across top companies like Google, Amazon & Adobe.
Strong Snowflake Data Architect profile (Cloud Data Platform / AI-led Data Transformation)
Mandatory (Experience 1) – Must have 8+ years of experience in Data Engineering / Data Architecture, with strong focus on building enterprise-scale data platforms
Mandatory (Experience 2) – Must have 3+ years of deep hands-on experience in Snowflake architecture, including designing and implementing scalable data warehouse solutions
Mandatory (Experience 3) – Strong expertise in Snowflake features including Resource Monitors, RBAC, Virtual Warehouses, Time Travel, Zero Copy Clone, and query performance optimization
Mandatory (Experience 4) – Proven experience building and managing data ingestion pipelines using Snowpipe, handling structured, semi-structured (JSON, XML), and columnar data formats (Parquet)
Mandatory (Experience 5) – Strong experience in cloud ecosystem, preferably AWS, including S3, Lambda, EC2, Redshift, and integration with Snowflake-based architectures
Mandatory (Experience 6) – Proven experience in migrating data from on-premise or legacy systems to Snowflake, including data modeling, transformation, and validation
Mandatory (Experience 7) – Hands-on experience in SQL, SnowSQL, Python, or PySpark for data transformation, automation, and monitoring
Mandatory (Experience 8) – Experience in data modeling, partitioning, micro-partitions, and re-clustering strategies in Snowflake
Mandatory (Experience 9) – Must have experience working in client-facing or consulting roles, including requirement gathering, solution design, and stakeholder communication
Mandatory (Skill 1) – Strong understanding of end-to-end data architecture including ETL/ELT pipelines, data lakes, and warehouse integration
Mandatory (Skill 2) – Experience in designing monitoring and automation frameworks using Python, Bash, or similar tools
Mandatory (Skill 3) – Ability to translate business requirements into scalable technical solutions and define future-state data architecture roadmaps
Mandatory (Note) – Only immediate joiners or candidates who can join within 15 days
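The ingestion requirements above call for handling semi-structured JSON alongside tabular formats (e.g., via Snowpipe, and typically Snowflake's FLATTEN on the warehouse side). As a rough, stdlib-only illustration of what flattening a nested record into columns means, here is a minimal Python sketch (the `flatten` helper and the sample record are hypothetical, for illustration only):

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested JSON object into dotted column names."""
    items = {}
    for key, value in record.items():
        col = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, col, sep))
        else:
            items[col] = value
    return items

# A nested record like one arriving via a Snowpipe-style JSON feed:
raw = json.loads('{"id": 1, "customer": {"name": "Asha", "city": "Mumbai"}}')
row = flatten(raw)  # {"id": 1, "customer.name": "Asha", "customer.city": "Mumbai"}
```

In a real pipeline this step is usually done in SQL (LATERAL FLATTEN) rather than in Python; the sketch only shows the shape of the transformation.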
JD -
We are looking for a strong Data Engineer with hands-on experience building pipelines using Snowflake and DBT.
Key Responsibilities:
- Develop, maintain, and optimize data pipelines using DBT and SQL on Snowflake DB.
- Collaborate with data analysts, QA and business teams to build scalable data models.
- Implement data transformations, testing, and documentation within the DBT framework.
- Work on Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning.
- Use Python (preferred) for automation, scripting, and additional data processing as needed.
Required Skills:
- 6+ years of experience in building data engineering pipelines.
- Strong hands-on expertise with DBT and advanced SQL.
- Experience working with modern columnar/MPP data warehouses, preferably Snowflake.
- Knowledge of Python for data manipulation and workflow automation (preferred).
- Good understanding of data modeling concepts, ETL/ELT processes, and best practices.
Strong Enterprise Data Modeller profile (Modern Data Platforms)
Mandatory (Experience 1) – Must have 7+ years of experience in Data Modeling or Enterprise Data Architecture, with strong hands-on expertise in designing conceptual, logical, and physical data models for enterprise data platforms
Mandatory (Experience 2) – Must have strong hands-on experience with enterprise data modeling tools such as Erwin, ER/Studio, PowerDesigner, SQLDBM, or similar
Mandatory (Experience 3) – Must have a deep understanding of dimensional modeling (Kimball / Inmon methodologies), normalization techniques, and schema design for modern data warehouse environments.
Mandatory (Experience 4) – Proven experience designing data models for modern data platforms such as Snowflake, Databricks, Redshift, Dremio, or similar cloud data warehouse / lakehouse systems.
Mandatory (Experience 5) – Must have strong SQL expertise and schema design skills, with the ability to validate data model implementations and collaborate closely with data engineering teams
Mandatory (Education) – Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
Mandatory (Note) – Total experience should not be greater than 14 years
Role Overview:
We are seeking a talented and experienced Data Architect with strong data visualization capabilities to join our dynamic team in Mumbai. As a Data Architect, you will be responsible for designing, building, and managing our data infrastructure, ensuring its reliability, scalability, and performance. You will also play a crucial role in transforming complex data into insightful visualizations that drive business decisions. This role requires a deep understanding of data modeling, database technologies (particularly Oracle Cloud), data warehousing principles, and proficiency in data manipulation and visualization tools, including Python and SQL.
Responsibilities:
- Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
- Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
- Define data governance policies and procedures to ensure data quality, security, and compliance.
- Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
- Develop and execute data migration strategies to Oracle Cloud.
- Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
- Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
- Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
- Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
- Ensure the performance and reliability of data visualization dashboards and reports.
- Stay up-to-date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
- Troubleshoot data-related issues and provide timely resolutions.
- Document data architectures, data flows, and data visualization solutions.
- Participate in the evaluation and selection of new data technologies and tools.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role.
- Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
- Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
- Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
- Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
- Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
- Excellent analytical and problem-solving skills with the ability to interpret complex data and translate it into actionable insights.
- Strong communication and presentation skills, with the ability to effectively communicate technical concepts to non-technical audiences.
- Experience with data governance and data quality principles.
- Familiarity with agile development methodologies.
- Ability to work independently and collaboratively within a team environment.
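The Python responsibilities above (cleaning, transformation, analysis) are typically done with Pandas and NumPy; for a self-contained illustration, here is a stdlib-only sketch of the same pattern — parse, drop rows with missing values, and aggregate per group (the sample CSV data is invented):

```python
import csv
import io
from statistics import mean

raw_csv = "region,revenue\nWest, 120\nEast,\nWest,80\nEast,200\n"

# Parse, strip whitespace, and drop rows with missing revenue (basic cleaning).
reader = csv.DictReader(io.StringIO(raw_csv))
rows = [
    {"region": r["region"].strip(), "revenue": float(r["revenue"])}
    for r in reader
    if r["revenue"].strip()
]

# Aggregate: mean revenue per region (roughly what a Pandas
# groupby("region")["revenue"].mean() would produce).
by_region = {}
for r in rows:
    by_region.setdefault(r["region"], []).append(r["revenue"])
summary = {region: mean(values) for region, values in by_region.items()}
```

With Pandas the same flow collapses to `read_csv`, `dropna`, and `groupby`; the sketch just makes each step explicit.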
Application Link- https://forms.gle/km7n2WipJhC2Lj2r5
- Database developer with strong Snowflake cloud database experience.
- Knowledge of Spark and Databricks is desirable.
- Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures
- Familiar with technologies relevant to data lakes such as Snowflake
- Candidate should have strong ETL & database design/modelling skills.
- Experience creating data pipelines
- Strong SQL and debugging skills, with performance tuning experience.
- Experience with Databricks / Azure is an add-on / good to have.
- Experience working with global teams and global application environments
- Strong understanding of SDLC methodologies, with a track record of high-quality deliverables and data quality, including detailed technical design documentation (desired)
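The performance tuning and debugging skills listed above usually start with measuring where time actually goes. A minimal sketch of a query-timing wrapper in Python (the `run_query` stub and the threshold are hypothetical; a real version would execute against Snowflake, e.g. via the Snowflake connector):

```python
import functools
import time

SLOW_THRESHOLD_SECONDS = 0.5  # hypothetical tuning budget per query

def timed(fn):
    """Log the wall-clock time of each call and flag calls over the threshold."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        status = "SLOW" if elapsed > SLOW_THRESHOLD_SECONDS else "ok"
        print(f"{fn.__name__}: {elapsed:.3f}s [{status}]")
        return result
    return wrapper

@timed
def run_query(sql):
    # Placeholder: a real implementation would execute `sql` against the warehouse.
    time.sleep(0.01)
    return []
```

In Snowflake itself this role of measurement is played by the query profile and `QUERY_HISTORY` views; the wrapper only shows the client-side instrumentation pattern.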
- 2+ years of experience working as a React.js developer.
- In-depth knowledge of JavaScript, CSS, HTML, and front-end languages.
- Knowledge of React tools including React.js, Redux, and Material UI.
- Experience with user interface design & user experience design
- Knowledge of testing frameworks including Mocha and Jest.
- Experience with browser-based debugging and performance testing software.
- Excellent troubleshooting skills.
- Good project management skills.
- Developing applications in React including component design and state management for specific use cases
- Experience working with at least one SQL and NoSQL Database (MongoDB, SQL Server, Snowflake, Postgres preferred)
- Basic experience with AWS platform
We are looking for a Snowflake developer for one of our premium clients, for a PAN India location.




