Senior Data Architect

at Our Client company is into Computer Software. (EC1)
Agency job
14 - 18 yrs
₹36.4L - ₹38.7L / yr
Bengaluru (Bangalore)
Skills
ADF
Snowflake
Data Architect
Data architecture
Data engineering
  • Establish and maintain a trusted advisor relationship within the company’s IT, Commercial Digital Solutions, Functions, and Businesses you interact with
  • Establish and maintain close working relationships with teams responsible for delivering solutions to the company’s businesses and functions
  • Provide management and thought leadership in the areas of advanced data techniques, including data modeling, data access, data integration, data visualization, big data solutions, text mining, data discovery, statistical methods, and database design
  • Work with business partners to define ways to leverage data to develop platforms and solutions to drive business growth
  • Engage collaboratively with project teams to support project objectives through the application of sound data architectural principles; support the project with knowledge of existing data assets and provide guidance on reusable data structures
  • Share knowledge of external and internal data capabilities and trends, provide leadership, and facilitate the evaluation of vendors and products
  • Utilize advanced data analysis, including statistical analysis and data mining techniques
  • Collaborate with others to set an enterprise data vision with solid recommendations, and work to gain business and IT consensus

Basic Qualifications

  • 15+ years of overall IT environment experience
  • Solid background as a data architect/cloud architect with a minimum of 5 years as a core architect
  • Architecting experience with cloud data warehouse solutions (Snowflake preferred; BigQuery, Redshift, or Synapse Analytics nice to have)
  • Strong architecture and design skills using Azure services (such as ADF, Data Flows, Event Grid, IoT Hub, Event Hubs, ADLS Gen2, serverless Azure Functions, Logic Apps, Azure Analysis Services cube design patterns, Azure SQL DB)
  • Working knowledge of Lambda/Kappa frameworks within data lake designs & architecture solutions
  • Deep understanding of the DevOps/DataOps patterns
  • Experience architecting semantic models
  • Data modeling experience with Data Vault principles
  • Cloud-native batch & real-time ELT/ETL patterns
  • Familiarity with Lucid Chart/Visio
  • Logging & monitoring pattern designs in the data lake/data warehouse context
  • Metadata-driven design patterns & solutioning expertise
  • Data Catalog integration experience in the data lake designs
  • 3+ years of experience partnering with business managers to develop technical strategies and architectures to support their objectives
  • 2+ years of hands-on experience with analytics deployment in the cloud (prefer Azure but AWS knowledge is acceptable)
  • 5+ years of delivering analytics in modern data architecture (Hadoop, Massively Parallel Processing Database Platforms, and Semantic Modeling)
  • Demonstrable knowledge of ETL and ELT patterns and when to use either one (see the sketch after this list); experience selecting among different tools that could be leveraged to accomplish this (Talend, Informatica, Azure Data Factory, SSIS, SAP Data Services)
  • Demonstrable knowledge of and experience with different scripting languages (Python, JavaScript, Pig, or object-oriented programming like Java or .NET)
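To make the ETL-versus-ELT distinction above concrete, here is a minimal, hedged Python sketch using the standard-library sqlite3 module as a stand-in warehouse; the table names, sample rows, and validation rule are illustrative assumptions, not details from this posting.

```python
# ETL vs. ELT in miniature; sqlite3 stands in for a real warehouse.
import sqlite3

rows = [("2024-01-01", "100.5"), ("2024-01-02", "bad")]  # raw extract

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_raw (day TEXT, amount TEXT)")
conn.execute("CREATE TABLE sales_clean (day TEXT, amount REAL)")

# ETL: transform/validate in application code, load only clean rows.
clean = []
for day, amount in rows:
    try:
        clean.append((day, float(amount)))
    except ValueError:
        pass  # reject bad records before they reach the warehouse
conn.executemany("INSERT INTO sales_clean VALUES (?, ?)", clean)

# ELT: load raw data as-is, push the transformation down into the engine.
conn.executemany("INSERT INTO sales_raw VALUES (?, ?)", rows)
conn.execute(
    "INSERT INTO sales_clean "
    "SELECT day, CAST(amount AS REAL) FROM sales_raw "
    "WHERE amount GLOB '[0-9]*'"  # SQL-side validation instead of Python
)
print(conn.execute("SELECT * FROM sales_clean").fetchall())
```

ETL suits heavy pre-load cleansing; ELT leans on the warehouse's own compute (e.g., Snowflake) and is the pattern most of the cloud-native stacks above assume.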

Preferred Qualifications

  • Bachelor’s degree in Computer Science, MIS, related field, or equivalent experience
  • Experience working with solutions delivery teams using Agile/Scrum or similar methodologies
  • 2+ years of experience designing solutions leveraging Microsoft Cortana Intelligence Suite of Products [Azure SQL, Azure SQL DW, Cosmos DB, HDInsight, Databricks]
  • Experience with enterprise systems, like CRM, ERP, Field Services Management, Supply Chain solutions, HR systems
  • Ability to work independently, establishing strategic objectives, project plans, and milestones
  • Exceptional written, verbal & presentation skills


Similar jobs

Sahaj AI Software
Posted by Pooja Mandlekar
Bengaluru (Bangalore), Pune
10 - 20 yrs
₹10L - ₹35L / yr
HDFS
Apache HBase
Databricks
Snowflake

Job Overview:

We are looking for a highly skilled Senior Data Engineer with extensive software development experience and deep expertise in large data platforms. The ideal candidate will have hands-on experience with HDFS, HBase, Databricks, and Snowflake, and will play a pivotal role in designing, implementing, and optimizing our data infrastructure.



Client of Merito
Agency job
via Merito by Merito Talent
Mumbai
3 - 8 yrs
Best in industry
Python
SQL
Tableau
PowerBI
PHP

Our client is the world’s largest media investment company and a part of WPP. In fact, they are responsible for one in every three ads you see globally. We are currently looking for a Senior Software Engineer to join us. In this role, you will be responsible for coding and implementing the custom marketing applications that the Tech COE builds for its customers, and for managing a small team of developers.

 

What your day job looks like:

  • Serve as a Subject Matter Expert on data usage – extraction, manipulation, and inputs for analytics
  • Develop data extraction and manipulation code based on business rules (see the sketch after this list)
  • Develop automated and manual test cases for the code written
  • Design and construct data store and procedures for their maintenance
  • Perform data extract, transform, and load activities from several data sources.
  • Develop and maintain strong relationships with stakeholders
  • Write high quality code as per prescribed standards.
  • Participate in internal projects as required
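As a hedged illustration of "data extraction and manipulation code based on business rules", the sketch below uses pandas over an in-memory SQLite table; the table, columns, and the refund rule are hypothetical placeholders, not details from this listing.

```python
# Extract rows with SQL, then apply a business rule in pandas.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
pd.DataFrame(
    {"campaign": ["A", "B"], "spend": [1200.0, -50.0]}
).to_sql("ad_spend", conn, index=False)

df = pd.read_sql("SELECT campaign, spend FROM ad_spend", conn)
# Assumed rule: negative spend represents a refund, reported separately.
df["kind"] = df["spend"].apply(lambda s: "refund" if s < 0 else "spend")
print(df)
```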

 
Minimum qualifications:

  • B.Tech./MCA or equivalent preferred
  • 3 years of excellent hands-on experience in big data, ETL development, and data processing.


What you’ll bring:

  • Strong experience in working with Snowflake, SQL, PHP/Python.
  • Strong experience in writing complex SQL queries
  • Good communication skills
  • Good experience working with BI tools like Tableau and Power BI.
  • Sqoop, Spark, EMR, Hadoop/Hive are good to have.

Consulting & implementation services in the area of Oil & Gas, Mining and Manufacturing Industry
Agency job
via Jobdost by Sathish Kumar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
Python
PySpark
Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages: Python & PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of big data technologies
  • Process/transform data using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of the calculations (see the sketch after this list)
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring best industry practices 
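A hedged PySpark sketch of the ingestion-plus-quality-check responsibilities above; a production AWS Glue job would typically run this through the awsglue/GlueContext wrappers, and the S3 paths, key column, and 1% threshold are assumptions for illustration only.

```python
# Read a raw Parquet drop, gate it on a simple quality check, then publish.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_quality_gate").getOrCreate()

df = spark.read.parquet("s3://example-bucket/raw/orders/")  # placeholder path

total = df.count()
missing = df.filter(F.col("order_id").isNull()).count()  # assumed key column
if total == 0 or missing / total > 0.01:
    raise ValueError(f"quality gate failed: {missing}/{total} null order_id")

df.write.mode("append").parquet("s3://example-bucket/curated/orders/")
```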

QUALIFICATIONS 

  • 5-7+ years of experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of one language: Java, Scala, Python, or C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with
  • Data mining/programming tools (e.g. SAS, SQL, R, Python)
  • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  • Data visualization (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Mobile Programming LLC
Posted by Sukhdeep Singh
Chennai
4 - 7 yrs
₹13L - ₹15L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview

Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16 - 18 LPA

Responsibilities:

  • Parse data using Python and create dashboards in Tableau.
  • Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
  • Migrate DataStage jobs to Snowflake and optimize performance.
  • Work with HDFS, Hive, Kafka, and basic Spark.
  • Develop Python scripts for data parsing, quality checks, and visualization.
  • Conduct unit testing and web application testing.
  • Implement Apache Airflow and handle production migration (see the sketch after this list).
  • Apply data warehousing techniques for data cleansing and dimension modeling.
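A minimal Apache Airflow sketch of the parse-then-schedule pattern these responsibilities describe; the dag_id, schedule, and parse function are hypothetical placeholders.

```python
# Daily DAG with a single Python parsing task (Airflow 2.x style).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def parse_data():
    print("parsing source files")  # placeholder for real parsing/quality logic

with DAG(
    dag_id="daily_parse_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    parse = PythonOperator(task_id="parse_data", python_callable=parse_data)
```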

Requirements:

  • 4+ years of experience as a Platform Engineer.
  • Strong Python skills, knowledge of Tableau.
  • Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
  • Proficient in Unix Shell Scripting and SQL.
  • Familiarity with ETL tools like DataStage and DMExpress.
  • Understanding of Apache Airflow.
  • Strong problem-solving and communication skills.

Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. Budget for this position is 16 - 18 LPA.

Mobile Programming LLC
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snowflake schema
Snowflake

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning (see the sketch after this list).
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
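As a hedged sketch of item 3 above, the snippet below loads staged Parquet files into a Snowflake table through snowflake-connector-python; the account, credentials, stage, and table names are placeholders only, not values from this posting.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",   # placeholder account locator
    user="etl_user",
    password="...",      # in practice, pull from a secrets manager
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
conn.cursor().execute("""
    COPY INTO raw_orders
    FROM @azure_stage/orders/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
```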

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise scale projects. You will also be responsible for preparing estimates and defining technical solutions to proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ years total in IT, including 2+ years of experience working as a Snowflake data architect and 4+ years in data warehouse, ETL, and BI projects.
• Must have at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake – data modelling, ELT using Snowflake SQL, implementing complex stored Procedures and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding of how to use these features (see the sketch after this list)
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PaaS)
• Certified Snowflake cloud data warehouse Architect (desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience with Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication
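A hedged illustration of the advanced Snowflake features named above (zero-copy clone, time travel, and resource monitors), driven through the Python connector; every object name and the credit quota are hypothetical.

```python
import snowflake.connector

# Placeholder credentials; a real setup would use key-pair auth or SSO.
conn = snowflake.connector.connect(account="xy12345", user="arch", password="...")
for stmt in [
    "CREATE TABLE sales_dev CLONE prod.sales",                 # zero-copy clone
    "SELECT COUNT(*) FROM prod.sales AT(OFFSET => -1800)",     # time travel, 30 min back
    "CREATE RESOURCE MONITOR rm_etl WITH CREDIT_QUOTA = 100",  # cap warehouse spend
    "ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = rm_etl",
]:
    conn.cursor().execute(stmt)
```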

Nice to have Skills/Qualifications:
• Bachelor's and/or master’s degree in computer science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

Talent folks
Agency job
via Talent folks by Rijooshri Saikia
Remote only
3 - 6 yrs
₹8L - ₹10L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Platform Operations
Remote Work, US shift

General Scope and Summary

The Data and Analytics Team sits in the Digital and Enterprise Capabilities Group and is responsible for driving the strategy, implementation and delivery of Data, Analytics and Automation capabilities across the Enterprise. This global team will deliver “Next-Gen Value” by establishing core Data and Analytics capabilities needed to effectively manage and exploit Data as an Enterprise Asset. Data Platform Operations will be responsible for implementing and supporting Enterprise Data Operations tools and capabilities which will enable teams to answer strategic and business questions through data.

Roles and Responsibilities

● Manage overall data operations ensuring adherence to data quality metrics by establishing standard operating procedures and best practices/playbooks.
● Champion the advocacy and adoption of enterprise data assets for analytics through optimal operating models.
● Provide day-to-day ownership and project management of data operations activities, including data quality/data management support cases and other ad-hoc requests.
● Create standards and frameworks for CI/CD pipelines and DevOps.
● Collaborate cross-functionally to develop and implement data operations policies, balancing centralized control and standardization with decentralized speed and flexibility.
● Identify areas for improvement. Create procedures, teams, and policies to support near real-time clean data, where applicable, or in a batch and close process, where applicable.
● Improve processes by tactically focusing on business outcomes. Drive prioritization based on business needs and strategy.
● Lead and control workflow operations by driving critical issues and discussions with partners to identify and implement improvements.
● Responsible for defining, measuring, monitoring, and reporting of key SLA metrics to support its vision.

Experience, Education and Specialized Knowledge and Skills

Must thrive working in a fast-paced, innovative environment while remaining flexible, proactive, resourceful, and efficient. Strong interpersonal skills, ability to understand stakeholder pain points, ability to analyze complex issues to develop relevant and realistic solutions and recommendations. Demonstrated ability to translate strategy into action; excellent technical skills and an ability to communicate complex issues in a simple way and to orchestrate solutions to resolve issues and mitigate risks.
They provide both wholesale and retail funding. (PM1)
Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
Teradata
Vertica
Python
DBA
Redshift
  • Key responsibility is to design, develop & maintain efficient data models for the organization, maintained to ensure optimal query performance by the consumption layer.
  • Developing, deploying & maintaining a repository of UDXs written in Java / Python (see the sketch after this list).
  • Develop optimal data model designs, analyzing complex distributed data deployments, and making recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
  • Periodic database health checks and maintenance
  • Designing collections in a NoSQL database for efficient performance
  • Document & maintain a data dictionary from various sources to enable data governance
  • Coordination with business teams, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
  • Data governance process implementation and ensuring data security
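As a hedged sketch of the UDX work above, here is a scalar Python UDx modeled on Vertica's documented vertica_sdk pattern (Teradata's UDF mechanism differs); the class names are illustrative, and deployment would happen via CREATE LIBRARY / CREATE FUNCTION in SQL.

```python
# Scalar UDx that adds one to an integer column (Vertica Python SDK style).
import vertica_sdk

class AddOne(vertica_sdk.ScalarFunction):
    def processBlock(self, server_interface, arg_reader, res_writer):
        while True:
            res_writer.setInt(arg_reader.getInt(0) + 1)
            res_writer.next()
            if not arg_reader.next():
                break

class AddOneFactory(vertica_sdk.ScalarFunctionFactory):
    def createScalarFunction(self, srv_interface):
        return AddOne()

    def getPrototype(self, srv_interface, arg_types, return_type):
        arg_types.addInt()
        return_type.addInt()
```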

Requirements

  • Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
  • Programming experience using Python / Java.
  • Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
  • Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
  • Strong knowledge in Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model.
  • Extensive technical experience in SQL including code optimization techniques.
  • Strong knowledge of database performance, tuning, and troubleshooting.
  • Knowledge of collection design in any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc), along with implementation of best practices.
  • Ability to understand business functionality, processes, and flows.
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
  • Any OLAP DWH DBA experience and user management will be an added advantage.
  • Knowledge of financial industry-specific data models such as FSLDM, IBM Financial Data Model, etc. will be an added advantage.
  • Experience in Snowflake will be an added advantage.
  • Working experience in BFSI/NBFC and data understanding of loan/mortgage data will be an added advantage.

Functional knowledge

  • Data Governance & Quality Assurance
  • Modern OLAP Database Architecture & Design
  • Linux
  • Data structures, algorithm & data modeling techniques
  • NoSQL database architecture
  • Data Security

Our Client company is into Computer Software. (EC1)
Agency job
via Multi Recruit by Manjunath Multirecruit
Bengaluru (Bangalore)
12 - 14 yrs
₹34.8L - ₹35L / yr
Data Architect
Data Science
Windows Azure
ADF
  • Establish and maintain a trusted advisor relationship within the company’s IT, Commercial Digital Solutions, Functions, and Businesses you interact with
  • Establish and maintain close working relationships with teams responsible for delivering solutions to the company’s businesses and functions
  • Provide management and thought leadership in the areas of advanced data techniques, including data modeling, data access, data integration, data visualization, big data solutions, text mining, data discovery, statistical methods, and database design
  • Work with business partners to define ways to leverage data to develop platforms and solutions to drive business growth
  • Engage collaboratively with project teams to support project objectives through the application of sound data architectural principles; support a project with knowledge of existing data assets and provide guidance on reusable data structures
  • Share knowledge of external and internal data capabilities and trends, provide leadership, and facilitate the evaluation of vendors and products
  • Utilize advanced data analysis, including statistical analysis and data mining techniques
  • Collaborate with others to set an enterprise data vision with solid recommendations, and work to gain business and IT consensus

 

Basic Qualifications

 

  • 10+ years of overall IT environment experience
  • 3+ years of experience partnering with business managers to develop technical strategies and architectures to support their objectives
  • 3+ years in Azure Data Factory.
  • 2+ years in Azure Databricks, Azure Cosmos DB, multi-factor authentication, Event Hubs, Azure Active Directory, and Logic Apps
  • 2+ years of hands-on experience with analytics deployment in the cloud (prefer Azure)
  • 5+ years of delivering analytics in modern data architecture (Hadoop, Massively Parallel Processing Database Platforms, and Semantic Modeling)
  • Demonstrable knowledge of ETL and ELT patterns and when to use either one; experience selecting among different tools that could be leveraged to accomplish this (Talend, Informatica, Azure Data Factory, SSIS, SAP Data Services)
  • Demonstrable knowledge of and experience with different scripting languages (Python, JavaScript, Pig, or object-oriented programming like Java or .NET)

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, MIS, related field, or equivalent experience
  • Experience working with solutions delivery teams using Agile/Scrum or similar methodologies
  • 2+ years of experience designing solutions leveraging Microsoft Cortana Intelligence Suite of Products [Azure SQL, Azure SQL DW, Cosmos DB, HDInsight, Databricks]
  • Experience with enterprise systems, like CRM, ERP, Field Services Management, Supply Chain solutions, HR systems
  • Ability to work independently, establishing strategic objectives, project plans, and milestones
  • Exceptional written, verbal & presentation skills

Our Client company is into Computer Software. (EC1)
Agency job
via Multi Recruit by Fiona RKS
Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹15L / yr
ETL
Snowflake
Data engineering
SQL
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Snowflake Cloud Datawarehouse as well as SQL and Azure ‘big data’ technologies (see the sketch after this list)
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.
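A hedged sketch of landing a file in ADLS Gen2 ahead of the Snowflake/Azure pipeline work above, using the azure-storage-file-datalake SDK; the account URL, filesystem, and paths are placeholders, not values from this posting.

```python
# Upload a local Parquet file into an ADLS Gen2 filesystem.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("raw")
file_client = fs.get_file_client("orders/2024-01-01/orders.parquet")
with open("orders.parquet", "rb") as f:
    file_client.upload_data(f, overwrite=True)
```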

Basic Qualifications

  • 3+ years of experience in a Data Engineer or Software Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with “Snowflake Cloud Datawarehouse”
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with data pipeline and workflow management tools
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of Datawarehouse (DWH) systems, and migration from DWH to data lakes/Snowflake
  • Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Experience supporting and working with cross-functional teams in a dynamic environment.