Snowflake schema Jobs in Chennai


Apply to 8+ Snowflake schema jobs in Chennai on CutShort.io. Explore the latest Snowflake schema job opportunities across top companies like Google, Amazon & Adobe.

Pluginlive


1 recruiter
Harsha Saggi
Posted by Harsha Saggi
Chennai, Mumbai
4 - 6 yrs
₹10L - ₹20L / yr
Python
SQL
NOSQL Databases
Data architecture
Data modeling
+7 more

Role Overview:

We are seeking a talented and experienced Data Architect with strong data visualization capabilities to join our dynamic team in Mumbai. As a Data Architect, you will be responsible for designing, building, and managing our data infrastructure, ensuring its reliability, scalability, and performance. You will also play a crucial role in transforming complex data into insightful visualizations that drive business decisions. This role requires a deep understanding of data modeling, database technologies (particularly Oracle Cloud), data warehousing principles, and proficiency in data manipulation and visualization tools, including Python and SQL.


Responsibilities:

  • Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
  • Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
  • Define data governance policies and procedures to ensure data quality, security, and compliance.
  • Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
  • Develop and execute data migration strategies to Oracle Cloud.
  • Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
  • Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
  • Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
  • Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
  • Ensure the performance and reliability of data visualization dashboards and reports.
  • Stay up-to-date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
  • Troubleshoot data-related issues and provide timely resolutions.
  • Document data architectures, data flows, and data visualization solutions.
  • Participate in the evaluation and selection of new data technologies and tools.
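
The Python-based cleaning and transformation duties above can be sketched with a minimal, standard-library-only example (the column names and date format are hypothetical; a real pipeline would typically reach for Pandas):

```python
import csv
import io
from datetime import datetime

def clean_rows(raw_csv: str):
    """Parse CSV text, strip whitespace, normalise dates, and
    drop rows that are missing the mandatory customer_id key."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        customer_id = (row.get("customer_id") or "").strip()
        if not customer_id:
            continue  # reject rows without a primary key
        cleaned.append({
            "customer_id": customer_id,
            "name": (row.get("name") or "").strip().title(),
            # normalise dd/mm/yyyy into ISO-8601
            "signup_date": datetime.strptime(
                row["signup_date"].strip(), "%d/%m/%Y"
            ).date().isoformat(),
        })
    return cleaned

raw = """customer_id,name,signup_date
 101 , alice smith ,01/02/2024
 , bob jones ,03/04/2024
102,carol white , 05/06/2024
"""
print(clean_rows(raw))
```

The second source row is dropped because its key is blank; the survivors come out with trimmed, title-cased names and ISO dates.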


Qualifications:

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
  • Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role. 

  • Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
  • Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
  • Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
  • Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
  • Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
  • Excellent analytical and problem-solving skills with the ability to interpret complex data and translate it into actionable insights. 
  • Strong communication and presentation skills, with the ability to effectively communicate technical concepts to non-technical audiences. 
  • Experience with data governance and data quality principles.
  • Familiarity with agile development methodologies.
  • Ability to work independently and collaboratively within a team environment.

Application Link- https://forms.gle/km7n2WipJhC2Lj2r5

Cymetrix Software


2 candid answers
Netra Shettigar
Posted by Netra Shettigar
Chennai, Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹26L / yr
Google Cloud Platform (GCP)
BigQuery
Data modeling
Snowflake schema
OLTP
+1 more

1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery (including BQ optimization), Airflow/Composer, Python (preferred)/Java

2. ETL on GCP: building pipelines (Python/Java) plus scripting; best practices and common challenges

3. Knowledge of batch and streaming data ingestion; ability to build end-to-end data pipelines on GCP

4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs NoSQL trade-offs; types of NoSQL databases (at least 2)

5. Data warehouse concepts: beginner to intermediate level

6. Data modeling, GCP databases, DB schema design (or similar)

7. Hands-on data modelling for OLTP and OLAP systems

8. In-depth knowledge of conceptual, logical, and physical data modelling

9. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same

10. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction

11. Working experience with at least one data modelling tool, preferably DBSchema or Erwin

12. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery

13. Functional knowledge of the mutual fund industry is a plus

Candidates should be willing to work from Chennai; office presence is mandatory.
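
The partitioning and sharding requirement can be sketched as a toy hash-sharding router in Python; the shard count and keys below are arbitrary examples:

```python
import hashlib

NUM_SHARDS = 4

def shard_for(key: str) -> int:
    """Deterministically map a key to a shard using a stable hash
    (md5 here, so the mapping survives process restarts)."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# Route rows to shards; every row with the same key lands together,
# which is what keeps single-key lookups on one node.
shards = {i: [] for i in range(NUM_SHARDS)}
for order_id in ["ord-1", "ord-2", "ord-3", "ord-1"]:
    shards[shard_for(order_id)].append(order_id)
```

Real systems (BigQuery partitioned tables, Spanner, etc.) make different choices, but the invariant is the same: the routing function must be deterministic so reads find the shard that writes used.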


Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Ability to design and implement Dimensional and Fact tables

● Identify and implement data transformation/cleansing requirements

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse

● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions

● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information, data from ERP, CRM, HRIS applications to model, extract and transform into reporting & analytics.

● Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.

● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.
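
The extract/transform/load workflow described in these responsibilities can be condensed into a toy pipeline against an in-memory SQLite "warehouse"; the source rows and table names are made up:

```python
import sqlite3

def extract():
    """Extract: raw rows as they arrive from an upstream system."""
    return [("2024-01-05", "in", "  1200 "), ("2024-01-06", "us", "800")]

def transform(rows):
    """Transform: cast amounts to numbers, standardise country codes."""
    return [(d, country.upper(), float(amount)) for d, country, amount in rows]

def load(rows, conn):
    """Load: append into the warehouse fact table."""
    conn.executemany("INSERT INTO fact_revenue VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_revenue (day TEXT, country TEXT, amount REAL)")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM fact_revenue").fetchone()[0]
print(total)
```

Production pipelines add staging tables, idempotent re-runs, and lineage metadata around this same three-step skeleton.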

Indian Based IT Service Organization


Agency job
via People First Consultants by Aishwarya KA
Chennai, Tirunelveli
5 - 7 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Greetings!


We are looking for a data engineer for one of our premium clients for their Chennai and Tirunelveli locations.


Required Education/Experience


● Bachelor’s degree in Computer Science or a related field

● 5-7 years’ experience in the following:

● Snowflake and Databricks management

● Python and AWS Lambda

● Scala and/or Java

● Data integration services, SQL, and Extract, Transform, Load (ETL/ELT)

● Azure or AWS for development and deployment

● Jira or similar tool during SDLC

● Experience managing codebase using Code repository in Git/GitHub or Bitbucket

● Experience working with a data warehouse.

● Familiarity with structured and semi-structured data formats including JSON, Avro, ORC, Parquet, or XML

● Exposure to working in an agile work environment
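
Working with the semi-structured formats listed above often starts with flattening nested records into warehouse-friendly columns. A minimal sketch using only the standard library (the record shape is hypothetical):

```python
import json

def flatten(obj, prefix=""):
    """Flatten nested JSON objects into dotted column names, the way
    many warehouse loaders project semi-structured records."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

record = json.loads('{"id": 7, "customer": {"name": "Asha", "city": "Chennai"}}')
print(flatten(record))
```

Avro, ORC, and Parquet carry an explicit schema alongside the data, but the same nesting-to-columns projection applies when landing them in a relational warehouse.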


AxionConnect Infosolutions Pvt Ltd
Shweta Sharma
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snowflake schema
SQL
+4 more

Job Location: Hyderabad/Bangalore/ Chennai/Pune/Nagpur

Notice period: Immediate - 15 days

 

1. Python Developer with Snowflake

 

Job Description:


  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL and the ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loads using Python.
  7. Experience in creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning is an added advantage.
  10. Good understanding of data warehouse (DWH) concepts.
  11. Ability to interpret and analyze business requirements and functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
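
Snowflake bulk loads from Python typically follow a stage-then-copy pattern: `PUT` a local file onto a stage, then `COPY INTO` the target table. The sketch below only constructs those statements as strings rather than executing them against a real account (which would need `snowflake-connector-python` and credentials); the file, table, and stage names are placeholders:

```python
def snowflake_copy_statements(local_file: str, table: str, stage: str = "@~"):
    """Build the PUT + COPY INTO statements used in the typical
    Snowflake bulk-load pattern: upload to a stage, then copy.
    "@~" is the current user's stage."""
    put = f"PUT file://{local_file} {stage} AUTO_COMPRESS=TRUE;"
    copy = (
        f"COPY INTO {table} FROM {stage} "
        f"FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);"
    )
    return [put, copy]

for stmt in snowflake_copy_statements("/tmp/sales.csv", "SALES_FACT"):
    print(stmt)
```

In a live connection these strings would be passed to `cursor.execute()`; staging first (rather than row-by-row inserts) is what makes Snowflake loads fast.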
Top IT MNC


Agency job
Chennai, Bengaluru (Bangalore), Kochi (Cochin), Coimbatore, Hyderabad, Pune, Kolkata, Noida, Gurugram, Mumbai
5 - 13 yrs
₹8L - ₹20L / yr
Snowflake schema
Python
Snowflake
Greetings,

We are looking for a Snowflake developer for one of our premium clients for their PAN India locations.
There is an urgent opening for Snowflake in MNC company.


Agency job
via Volibits by Ankita Mishra
Pune, Mumbai, Bengaluru (Bangalore), Chennai, Noida, Hyderabad
7 - 12 yrs
₹5L - ₹15L / yr
Snowflake schema
SnowSQL
Snowpipe
There is an urgent requirement for a Snowflake Developer in an MNC company. The notice period should allow joining in April or the 2nd week of May. Required skills include Snowpipe, SnowSQL, Snowflake schema, and Snowflake development. The budget for this profile is up to 22 LPA.
Enterprise-grade, streaming integration with intelligence platform


Agency job
via Jobdost by Mamatha A
Chennai
5 - 15 yrs
₹15L - ₹30L / yr
Java
C++
Data Structures
SQL
Amazon RDS
+15 more

Striim (pronounced “stream” with two i’s for integration and intelligence) was founded in 2012 with a simple goal of helping companies make data useful the instant it’s born.

Striim’s enterprise-grade, streaming integration with intelligence platform makes it easy to build continuous, streaming data pipelines – including change data capture (CDC) – to power real-time cloud integration, log correlation, edge processing, and streaming analytics.

Strong Core Java / C++ experience

· Excellent understanding of logical and object-oriented design patterns, algorithms, and data structures.

· Sound knowledge of application access methods, including authentication mechanisms, API quota limits, and different endpoints (REST, Java, etc.)

· Strong experience with databases: not just SQL programming, but knowledge of DB internals.

· Sound knowledge of cloud databases available as a service is a plus (RDS, Cloud SQL, Google BigQuery, Snowflake).

· Experience working in any cloud environment and microservices-based architecture utilizing GCP, Kubernetes, Docker, CircleCI, Azure, or similar technologies.

· Experience in application verticals such as ERP, CRM, Sales, with applications such as Salesforce, Workday, SAP < Not Mandatory - added advantage >

· Experience in building distributed systems < Not Mandatory - added advantage >

· Expertise in data warehousing < Not Mandatory - added advantage >

· Experience in developing and delivering a product as SaaS < Not Mandatory - added advantage >
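
The change data capture (CDC) idea at the heart of the platform can be illustrated with a toy in-memory table that appends every mutation to a change log a streaming consumer could read. This is a conceptual sketch, not Striim's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class CdcTable:
    """Toy change-data-capture: every mutation is appended to a change
    log that a downstream streaming pipeline could consume."""
    rows: dict = field(default_factory=dict)
    change_log: list = field(default_factory=list)

    def upsert(self, key, value):
        op = "UPDATE" if key in self.rows else "INSERT"
        before = self.rows.get(key)
        self.rows[key] = value
        self.change_log.append({"op": op, "key": key,
                                "before": before, "after": value})

table = CdcTable()
table.upsert("acct-1", 100)
table.upsert("acct-1", 250)
print([e["op"] for e in table.change_log])
```

Real CDC tools read the database's own transaction log instead of instrumenting writes, but the payload is the same: ordered before/after change events rather than table snapshots.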

netmedscom


3 recruiters
Vijay Hemnath
Posted by Vijay Hemnath
Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Machine Learning (ML)
Software deployment
CI/CD
Cloud Computing
Snow flake schema
+19 more

We are looking for an outstanding ML Architect (Deployments) with expertise in deploying Machine Learning solutions/models into production and scaling them to serve millions of customers, with an adaptable and productive working style that fits a fast-moving environment.

 

Skills:

- 5+ years deploying Machine Learning pipelines in large enterprise production systems.

- Experience developing end to end ML solutions from business hypothesis to deployment / understanding the entirety of the ML development life cycle.
- Expert in modern software development practices; solid experience using source control management (CI/CD).
- Proficient in designing relevant architecture / microservices to fulfil application integration, model monitoring, training / re-training, model management, model deployment, model experimentation/development, alert mechanisms.
- Experience with public cloud platforms (Azure, AWS, GCP).
- Serverless services like Lambda, Azure Functions, and/or Cloud Functions.
- Orchestration services like Data Factory, Data Pipeline, and/or Dataflow.
- Data science workbench/managed services like Azure Machine Learning, SageMaker, and/or AI Platform.
- Data warehouse services like Snowflake, Redshift, BigQuery, and Azure SQL DW.
- Distributed computing services like PySpark, EMR, and Databricks.
- Data storage services like Cloud Storage, S3, Blob Storage, and S3 Glacier.
- Data visualization tools like Power BI, Tableau, QuickSight, and/or Qlik.
- Proven experience serving up predictive algorithms and analytics through batch and real-time APIs.
- Solid working experience with software engineers, data scientists, product owners, business analysts, project managers, and business stakeholders to design the holistic solution.
- Strong technical acumen around automated testing.
- Extensive background in statistical analysis and modeling (distributions, hypothesis testing, probability theory, etc.)
- Strong hands-on experience with statistical packages and ML libraries (e.g., Python scikit-learn, Spark MLlib, etc.)
- Experience in effective data exploration and visualization (e.g., Excel, Power BI, Tableau, Qlik, etc.)
- Experience in developing and debugging in one or more of the languages Java, Python.
- Ability to work in cross functional teams.
- Apply Machine Learning techniques in production including, but not limited to, neural nets, regression, decision trees, random forests, ensembles, SVM, Bayesian models, K-Means, etc.
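
One deployment pattern implied by the skills above is separating training from serving via a versioned, plain-data model artifact. A deliberately tiny sketch (the model, coefficients, and file name are all invented for illustration):

```python
import json
import os
import tempfile

def train():
    """Pretend training step: return a tiny linear model y = 2x + 1
    as a plain-data artifact (coefficients only, no code)."""
    return {"version": "v1", "slope": 2.0, "intercept": 1.0}

def save_artifact(model, path):
    """Persist the artifact; in production this would go to a model
    registry or object storage, not a temp directory."""
    with open(path, "w") as fh:
        json.dump(model, fh)

def serve(path, x):
    """Serving side: load the versioned artifact and score a request."""
    with open(path) as fh:
        model = json.load(fh)
    return model["slope"] * x + model["intercept"]

path = os.path.join(tempfile.gettempdir(), "model_v1.json")
save_artifact(train(), path)
print(serve(path, 3.0))
```

Keeping the artifact as plain data (rather than pickled code) is what lets training and serving evolve, scale, and roll back independently.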

 

Roles and Responsibilities:

Deploying ML models into production, and scaling them to serve millions of customers.

Technical solutioning skills with deep understanding of technical API integrations, AI / Data Science, BigData and public cloud architectures / deployments in a SaaS environment.

Strong stakeholder relationship management skills - able to influence and manage the expectations of senior executives.
Strong networking skills with the ability to build and maintain strong relationships with both business, operations and technology teams internally and externally.

Provide software design and programming support to projects.

 

 Qualifications & Experience:

Engineering graduates and postgraduates, preferably in Computer Science from premier institutions, with proven work experience as a Machine Learning Architect (Deployments) or in a similar role for 5-7 years.

 
