3+ Snowflake schema Jobs in Kochi (Cochin) | Snowflake schema Job openings in Kochi (Cochin)
Apply to 3+ Snowflake schema Jobs in Kochi (Cochin) on CutShort.io. Explore the latest Snowflake schema Job opportunities across top companies like Google, Amazon & Adobe.
JOB DETAILS:
* Job Title: Associate III - Azure Data Engineer
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4–6 years
* Location: Trivandrum, Kochi
Job Description: Azure Data Engineer (4–6 Years Experience)
Job Type: Full-time
Locations: Kochi, Trivandrum
Must-Have Skills
Azure & Data Engineering
- Azure Data Factory (ADF)
- Azure Databricks (PySpark)
- Azure Synapse Analytics
- Azure Data Lake Storage Gen2
- Azure SQL Database
Programming & Querying
- Python (PySpark)
- SQL / Spark SQL
Data Modelling
- Star & Snowflake schemas
- Dimensional modelling
Source Systems
- SQL Server
- Oracle
- SAP
- REST APIs
- Flat files (CSV, JSON, XML)
CI/CD & Version Control
- Git
- Azure DevOps / GitHub Actions
Monitoring & Scheduling
- ADF triggers
- Databricks jobs
- Log Analytics
Security
- Managed Identity
- Azure Key Vault
- Azure RBAC / Access Control
Soft Skills
- Strong analytical & problem-solving skills
- Good communication and collaboration
- Ability to work in Agile/Scrum environments
- Self-driven and proactive
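The star and snowflake schemas listed under Data Modelling differ in how far the dimension tables are normalized. A toy sketch of that difference, assuming hypothetical table and column names (not from any specific warehouse):

```python
# Star schema: the fact row joins to one denormalized dimension table.
dim_product_star = {
    101: {"name": "Laptop", "category": "Electronics", "category_manager": "Asha"},
}

# Snowflake schema: the dimension is normalized into sub-dimensions.
dim_product_snow = {101: {"name": "Laptop", "category_id": 7}}
dim_category = {7: {"category": "Electronics", "category_manager": "Asha"}}

fact_sales = [{"product_id": 101, "qty": 2, "amount": 1800.0}]

def category_star(fact):
    # One lookup: the dimension already carries the category attributes.
    return dim_product_star[fact["product_id"]]["category"]

def category_snow(fact):
    # Two lookups: product -> category (an extra join in SQL terms).
    cat_id = dim_product_snow[fact["product_id"]]["category_id"]
    return dim_category[cat_id]["category"]

print(category_star(fact_sales[0]))  # Electronics
print(category_snow(fact_sales[0]))  # Electronics
```

The trade-off candidates are typically asked about: the star form reads faster (fewer joins), the snowflake form avoids duplicating category attributes across products.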
Good-to-Have Skills
- Power BI basics
- Delta Live Tables
- Synapse Pipelines
- Real-time processing (Event Hub / Stream Analytics)
- Infrastructure as Code (Terraform / ARM templates)
- Data governance tools like Azure Purview
- Azure Data Engineer Associate (DP-203) certification
Educational Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Skills: Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake Storage
Must-Haves
Azure Data Factory (4-6 years), Azure Databricks/PySpark (4-6 years), Azure Synapse Analytics (4-6 years), SQL/Spark SQL (4-6 years), Git/Azure DevOps (4-6 years)
Skills: Azure, Azure data factory, Python, Pyspark, Sql, Rest Api, Azure Devops
Relevant experience: 4–6 years
Python is mandatory.
******
Notice period: 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F interview: 7th Feb
JOB DETAILS:
* Job Title: Associate III - Data Engineering
* Industry: Global digital transformation solutions provider
* Salary: Best in Industry
* Experience: 4-6 years
* Location: Trivandrum, Kochi
Job Description
Job Title: Data Services Engineer – AWS & Snowflake
Job Summary:
As a Data Services Engineer, you will be responsible for designing, developing, and maintaining robust data solutions using AWS cloud services and Snowflake.
You will work closely with cross-functional teams to ensure data is accessible, secure, and optimized for performance.
Your role will involve implementing scalable data pipelines, managing data integration, and supporting analytics initiatives.
Responsibilities:
• Design and implement scalable and secure data pipelines on AWS and Snowflake (Star/Snowflake schema).
• Optimize query performance using clustering keys, materialized views, and caching.
• Develop and maintain Snowflake data warehouses and data marts.
• Build and maintain ETL/ELT workflows using Snowflake-native features (Snowpipe, Streams, Tasks).
• Integrate Snowflake with cloud platforms (AWS, Azure, GCP) and third-party tools (Airflow, dbt, Informatica).
• Utilize Snowpark and Python/Java for complex transformations.
• Implement RBAC, data masking, and row-level security.
• Optimize data storage and retrieval for performance and cost-efficiency.
• Collaborate with stakeholders to gather data requirements and deliver solutions.
• Ensure data quality, governance, and compliance with industry standards.
• Monitor, troubleshoot, and resolve data pipeline and performance issues.
• Document data architecture, processes, and best practices.
• Support data migration and integration from various sources.
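The Streams/Tasks-style ETL/ELT workflows listed above come down to incremental change capture: each run picks up only rows newer than a stored high-watermark. A minimal plain-Python sketch of that pattern, assuming illustrative names (these are not Snowflake API calls):

```python
source_rows = [
    {"id": 1, "updated_at": 10, "value": "a"},
    {"id": 2, "updated_at": 20, "value": "b"},
    {"id": 3, "updated_at": 30, "value": "c"},
]

def incremental_load(rows, state):
    """Return rows changed since the last run and advance the watermark."""
    new_rows = [r for r in rows if r["updated_at"] > state["watermark"]]
    if new_rows:
        # Advance the watermark so these rows are not reprocessed next run.
        state["watermark"] = max(r["updated_at"] for r in new_rows)
    return new_rows

state = {"watermark": 0}
print(len(incremental_load(source_rows, state)))  # 3 rows on the first run
print(len(incremental_load(source_rows, state)))  # 0 rows: nothing changed since
```

In Snowflake, a Stream tracks the changed rows and a scheduled Task consumes them, so the watermark bookkeeping shown here is handled by the platform.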
Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 3 to 4 years of hands-on experience in data engineering or data services.
• Proven experience with AWS data services (e.g., S3, Glue, Redshift, Lambda).
• Strong expertise in Snowflake architecture, development, and optimization.
• Proficiency in SQL and Python for data manipulation and scripting.
• Solid understanding of ETL/ELT processes and data modeling.
• Experience with data integration tools and orchestration frameworks.
• Excellent analytical, problem-solving, and communication skills.
Preferred Skills:
• AWS Glue, AWS Lambda, Amazon Redshift
• Snowflake Data Warehouse
• SQL & Python
Skills: Aws Lambda, AWS Glue, Amazon Redshift, Snowflake Data Warehouse
Must-Haves
AWS data services (4-6 years), Snowflake architecture (4-6 years), SQL (proficient), Python (proficient), ETL/ELT processes (solid understanding)
Skills: AWS, AWS lambda, Snowflake, Data engineering, Snowpipe, Data integration tools, orchestration framework
Relevant experience: 4–6 years
Python is mandatory.
******
Notice period: 0 to 15 days only (Feb joiners’ profiles only)
Location: Kochi
F2F interview: 7th Feb
We are looking for a Snowflake developer for one of our premium clients, for PAN-India locations.

