Responsibilities and Expectations of the Role
Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on Azure.
Extensive experience designing and developing data integration solutions using ETL tools such as DBT.
Hands-on experience implementing cloud data warehouses using Snowflake and Azure Data Factory.
Solid MS SQL Server skills, including reporting experience.
Work closely with product managers and engineers to design, implement, test, and continually improve scalable data solutions and services running on the DBT and Snowflake cloud platforms.
Implement critical and non-critical system data integration and ingestion fixes for the data platform and environment.
Ensure root-cause resolution of identified problems.
Monitor and support the Data Solutions jobs and processes to meet the daily SLA.
Analyze the current analytics environment and make recommendations for appropriate data warehouse modernization and migration to the cloud.
Develop Snowflake deployment (using Azure DevOps or a similar CI/CD tool) and usage best practices.
Follow best practices and standards around data governance, security, and privacy.
Comfortable working in a fast-paced team environment, coordinating multiple projects.
Effective software development life cycle management skills and experience with GitHub.
Leverage tools such as Fivetran, DBT, Snowflake, and GitHub to drive ETL, data modeling, and analytics (see the sketch after this list).
Document data transformation and data analytics processes.
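As a rough illustration of the DBT-on-Snowflake workflow named above, here is a minimal sketch that invokes dbt from Python by shelling out to its CLI; the model selector "stg_orders+" and the "dev" target are hypothetical placeholders, not part of this posting.

```python
# Minimal sketch: building dbt models against a Snowflake target from
# Python via the dbt CLI. Selector and target names are hypothetical.
import subprocess

def run_dbt(select: str, target: str = "dev") -> None:
    """Build the selected dbt models; a '+' suffix also builds downstream models."""
    result = subprocess.run(
        ["dbt", "run", "--select", select, "--target", target],
        capture_output=True,
        text=True,
    )
    # dbt exits non-zero if any model fails to build.
    if result.returncode != 0:
        raise RuntimeError(f"dbt run failed:\n{result.stdout}")

if __name__ == "__main__":
    run_dbt("stg_orders+")
```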


● 5+ years of experience as a data engineer or in a related role.
● 5+ years of experience in application development using Python.
● Strong experience with SQL; NoSQL experience is good to have.
● Experience with Agile engineering practices.
● Preferred: experience writing queries for RDBMS and cloud-based data warehousing solutions such as Snowflake (see the sketch after this list).
● Ability to work independently or as part of a team.
● Experience with cloud platforms, preferably AWS, is good to have.
● Experience with ETL/ELT tools and methodologies.
● Experience working with real-time data streaming and data streaming platforms.
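As a hedged illustration of querying Snowflake from Python, the sketch below uses the snowflake-connector-python package; every connection parameter and the "orders" table are hypothetical placeholders.

```python
# Minimal sketch: querying Snowflake with snowflake-connector-python.
# All connection parameters and the "orders" table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # prefer a secrets manager over literals
    warehouse="ANALYTICS_WH",  # placeholder
    database="ANALYTICS",      # placeholder
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_date, COUNT(*) FROM orders GROUP BY order_date")
    for order_date, order_count in cur:
        print(order_date, order_count)
finally:
    conn.close()
```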
Responsibilities include:
- Develop and maintain data validation logic in our proprietary Control Framework tool (a generic sketch follows this list)
- Actively participate in business requirement elaboration and functional design sessions to develop an understanding of our Operational teams’ analytical needs, key data flows and sources
- Assist Operational teams in the buildout of Checklists and event monitoring workflows within our Enterprise Control Framework platform
- Build effective working relationships with Operational users, Reporting and IT development teams and business partners across the organization
- Conduct interviews, generate user stories, develop scenarios and workflow analyses
- Contribute to the definition of reporting solutions that empower Operational teams to make immediate decisions as to the best course of action
- Perform some business user acceptance testing
- Provide production support and troubleshooting for existing operational dashboards
- Conduct regular demos and training of new features for the stakeholder community
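Because the Control Framework tool is proprietary, the following is only a generic sketch of what a data-validation rule might look like; every name in it is hypothetical and not drawn from the actual tool.

```python
# Hypothetical sketch of a data-validation rule; the real Control
# Framework tool is proprietary, so all names here are illustrative.
from dataclasses import dataclass

@dataclass
class ValidationResult:
    rule: str
    passed: bool
    detail: str

def check_no_nulls(rows: list[dict], column: str) -> ValidationResult:
    """Flag records where a required column is missing or null."""
    bad = [r for r in rows if r.get(column) is None]
    return ValidationResult(
        rule=f"no_nulls:{column}",
        passed=not bad,
        detail=f"{len(bad)} of {len(rows)} rows failed",
    )

rows = [{"trade_id": 1, "amount": 100.0}, {"trade_id": 2, "amount": None}]
print(check_no_nulls(rows, "amount"))  # passed=False, "1 of 2 rows failed"
```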
Qualifications
- Bachelor’s degree or equivalent in Business, Accounting, Finance, MIS, Information Technology or related field of study
- Minimum 5 years of SQL experience required
- Experience querying data on cloud platforms (AWS/Azure/Snowflake) required
- Exceptional problem solving and analytical skills, attention to detail and organization
- Able to independently troubleshoot and gather supporting evidence
- Prior experience developing within a BI reporting tool (e.g. Spotfire, Tableau, Looker, Information Builders) a plus
- Database Management and ETL development experience a plus
- Self-motivated, self-assured, and self-managed
- Able to multi-task to meet time-driven goals
- Asset management experience, including investment operations, a plus
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner, and will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE), and the UK, and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on the team. This person will lead the Data Platform architecture: streaming and bulk, with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: scalable, resilient, event-driven, secure multi-tenant microservices architecture
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite (a minimal Kafka sketch follows)
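Although this stack is Java-based, here is a minimal Python sketch (using the kafka-python client) of the event-driven publishing pattern the architecture describes; the broker address and "order-events" topic are hypothetical placeholders.

```python
# Minimal sketch of event-driven publishing with the kafka-python client.
# The broker address and "order-events" topic are hypothetical.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish an order event for downstream microservices to consume.
producer.send("order-events", {"order_id": "42", "status": "CREATED"})
producer.flush()  # block until the event is actually sent
```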

Striim (pronounced “stream” with two i’s for integration and intelligence) was founded in 2012 with a simple goal of helping companies make data useful the instant it’s born.
Striim’s enterprise-grade, streaming integration with intelligence platform makes it easy to build continuous, streaming data pipelines – including change data capture (CDC) – to power real-time cloud integration, log correlation, edge processing, and streaming analytics.
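As a generic, non-Striim illustration of what change data capture does, the sketch below applies insert/update/delete change events to an in-memory target table; the event shape is an assumption for illustration only, not Striim's actual format.

```python
# Generic CDC sketch: applying insert/update/delete change events to an
# in-memory target keyed by primary key. The event shape is assumed for
# illustration and is not Striim's actual format.
target: dict[int, dict] = {}  # primary key -> current row

def apply_cdc_event(event: dict) -> None:
    """Apply a single change event to the target table."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        target[key] = event["row"]
    elif op == "delete":
        target.pop(key, None)

for e in [
    {"op": "insert", "key": 1, "row": {"name": "alice"}},
    {"op": "update", "key": 1, "row": {"name": "alicia"}},
    {"op": "delete", "key": 1, "row": None},
]:
    apply_cdc_event(e)
print(target)  # {} after the insert, update, and delete are replayed
```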
· Strong Core Java / C++ experience
· Excellent understanding of logical and object-oriented design patterns, algorithms, and data structures.
· Sound knowledge of application access methods, including authentication mechanisms and API quota limits, as well as different endpoint types (REST, Java, etc.).
· Strong experience with databases: not just a SQL programmer, but someone with knowledge of DB internals.
· Sound knowledge of cloud databases available as a service (RDS, CloudSQL, Google BigQuery, Snowflake) is a plus.
· Experience working in any cloud environment and microservices-based architecture utilizing GCP, Kubernetes, Docker, CircleCI, Azure, or similar technologies.
· Experience in application verticals such as ERP, CRM, and Sales, with applications such as Salesforce, Workday, and SAP < Not Mandatory - added advantage >
· Experience in building distributed systems < Not Mandatory - added advantage >
· Expertise in data warehousing < Not Mandatory - added advantage >
· Experience in developing and delivering products as SaaS < Not Mandatory - added advantage >

