5+ Snowflake Schema Jobs in Delhi, NCR and Gurgaon
Job Title: Data Engineer
Location City: Gurugram
Industry: Research and Advisory Services
Role Overview
We are looking for a Senior Data Engineer (7–10 years) to play a foundational role in building Everest Group’s greenfield, Snowflake-based enterprise data platform.
This role is hands-on and ownership-driven, with a strong focus on:
Ingesting data from enterprise SaaS platforms
Building scalable Snowflake ELT pipelines
Designing analytics-ready data models
Owning the initial Snowflake platform foundations in collaboration with architecture leadership
The ideal candidate has deep experience integrating CRM and marketing systems via APIs, is comfortable operating production-grade data pipelines, and can make sound decisions around performance, cost, and reliability.
Key Responsibilities
Build Robust Data Ingestion Pipelines From Enterprise SaaS Platforms, Including
Salesforce (CRM)
NetSuite (Finance)
Marketing and RevOps tools such as Marketo, 6sense, Gong
SharePoint (files, metadata, permissions)
Develop API-based Ingestion Frameworks Handling
Authentication and authorization
Pagination, rate limits, retries, and failures
Incremental loads, soft deletes, and historical tracking
Schema evolution and upstream source changes
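The pagination, retry, and incremental-load concerns listed above can be sketched as a small, generic ingestion helper. This is a minimal illustration only, not any vendor's SDK: the `fetch_page` response shape (`{"records": ..., "next_cursor": ...}`) is an assumption, and a real Salesforce or Marketo connector would wrap its own API in an adapter with this shape.

```python
import time
from typing import Callable, Iterator, Optional

def paginate_with_retries(
    fetch_page: Callable[[Optional[str]], dict],
    max_retries: int = 3,
    backoff_seconds: float = 0.0,
) -> Iterator[dict]:
    """Yield records from a cursor-paginated API, retrying failed pages.

    fetch_page(cursor) is assumed (illustrative, not a real SDK) to return
    {"records": [...], "next_cursor": str | None}.
    """
    cursor = None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break
            except Exception:
                if attempt == max_retries - 1:
                    raise
                # Simple exponential backoff for rate limits / transient failures.
                time.sleep(backoff_seconds * (2 ** attempt))
        yield from page["records"]
        cursor = page.get("next_cursor")
        if cursor is None:
            return

def incremental_records(records, watermark_field, last_watermark):
    """Keep only records modified after the saved watermark (incremental load)."""
    return [r for r in records if r[watermark_field] > last_watermark]
```

A production framework would persist the watermark between runs and distinguish retryable errors (HTTP 429/5xx) from permanent ones, but the control flow is the same.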
Design and build ELT pipelines within Snowflake
Write high-quality, optimized SQL for complex transformations
Build and manage data layers including raw, staged, and curated datasets
Optimize Snowflake warehouses, storage, and query performance with a strong focus on cost efficiency
Design Analytics-Ready Data Models, Including
Fact and dimension tables
Star and snowflake schemas
Slowly Changing Dimensions (SCD Type 1 and Type 2)
Ensure data models support reporting, dashboards, and research analytics
Partner with analytics and research teams to deliver analytics-ready, well-documented datasets
Ensure pipeline reliability including scheduling, monitoring, alerting, and recovery
Implement data quality checks for accuracy, completeness, and freshness
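To make the SCD Type 2 requirement concrete, here is a minimal Python sketch of the merge logic. In a Snowflake warehouse this would normally be a SQL MERGE or a dbt snapshot rather than Python; the column names (`valid_from`, `valid_to`, the business key) are illustrative assumptions.

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked_cols, today=None):
    """Apply Slowly Changing Dimension Type 2 updates to a dimension table.

    Rows in `dimension` carry valid_from / valid_to (None = current version).
    A changed row is "closed out" (valid_to set) and a new current version
    is appended; brand-new keys are inserted as current rows.
    Note: existing rows are mutated in place for brevity.
    """
    today = today or date.today().isoformat()
    result = list(dimension)
    current = {row[key]: row for row in result if row["valid_to"] is None}
    for new in incoming:
        old = current.get(new[key])
        if old is None:
            # New business key: insert as the first (current) version.
            result.append({**new, "valid_from": today, "valid_to": None})
        elif any(old[c] != new[c] for c in tracked_cols):
            # Tracked attribute changed: expire old version, add a new one.
            old["valid_to"] = today
            result.append({**new, "valid_from": today, "valid_to": None})
    return result
```

SCD Type 1, by contrast, would simply overwrite the tracked columns in place with no history rows.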
Support Snowflake Platform Foundations Including
Warehouse and environment strategy (dev/test/prod)
Role-based access control (RBAC)
Secure handling of sensitive HR and finance data (PII)
Troubleshoot and resolve data issues across ingestion, transformation, and consumption layers
Collaborate with research, product, and technology stakeholders to translate business needs into data solutions
Contribute to data platform architecture discussions and continuous improvement initiatives
Maintain clear documentation for pipelines, data models, and data flows
Follow modern engineering practices including Git-based version control and CI/CD workflows
Education And Experience
Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, Mathematics, or a related field
7–10 years of hands-on experience in data engineering or similar roles
Strong hands-on expertise with Snowflake, including ingestion, transformations, and performance optimization
Proven experience ingesting data from SaaS platforms via APIs (HR, CRM, or Marketing systems)
Advanced SQL skills and strong understanding of relational databases and data modeling
Strong Python experience for API integration, data ingestion, and automation
Experience with cloud platforms (Azure preferred; AWS/GCP acceptable)
Experience with orchestration or transformation tools such as dbt, Azure Data Factory, or similar
Strong problem-solving skills, ownership mindset, and attention to detail
About the Role:
We are seeking a talented Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Ability to lead and mentor a team of data engineers.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
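The data quality responsibility above (validation, cleansing, freshness) can be illustrated with a small check that a pipeline might run before publishing a curated dataset. This is a hedged sketch, not a reference to any specific tool; the field names and thresholds are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def quality_report(rows, required_fields, timestamp_field, max_age_hours, now=None):
    """Run basic data-quality checks: row count, completeness, freshness.

    Returns a summary dict a pipeline could log or alert on.
    Field names (e.g. a load timestamp column) are illustrative.
    """
    now = now or datetime.now(timezone.utc)
    # Completeness: share of rows where every required field is populated.
    complete = sum(all(r.get(f) is not None for f in required_fields) for r in rows)
    # Freshness: newest load timestamp must fall within the allowed window.
    newest = max((r[timestamp_field] for r in rows), default=None)
    return {
        "row_count": len(rows),
        "completeness_pct": 100.0 * complete / len(rows) if rows else 0.0,
        "fresh": newest is not None and (now - newest) <= timedelta(hours=max_age_hours),
    }
```

In practice such checks are often expressed declaratively (e.g. dbt tests or Great Expectations suites), but the underlying assertions are of this shape.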
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
Job Title: Tableau BI Developer
Years of Experience: 4–8 years
Rate: $12 per hour (FTE engagement)
Working Hours: 8 hours per day
Required Skills & Experience:
✅ 4–8 years of experience in BI development and data engineering
✅ Expertise in BigQuery and/or Snowflake for large-scale data processing
✅ Strong SQL skills with experience writing complex analytical queries
✅ Experience in creating dashboards in tools like Power BI, Looker, or similar
✅ Hands-on experience with ETL/ELT tools and data pipeline orchestration
✅ Familiarity with cloud platforms (GCP, AWS, or Azure)
✅ Strong understanding of data modeling, data warehousing, and analytics best practices
✅ Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders
We are looking for a Snowflake developer for one of our premium clients; the role is open across PAN India locations.

