5+ Snowflake schema Jobs in Pune | Snowflake schema Job openings in Pune
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have
- Exposure to the financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaboration across regions (APAC, EMEA, and NA) to efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus
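The SQL-driven ETL testing described above typically comes down to source-to-target reconciliation checks. A minimal sketch of one such check, using Python's built-in sqlite3 as a stand-in for an actual Snowflake/DWH connection (the table and column names are hypothetical):

```python
import sqlite3

# In-memory database standing in for the source and target of an ETL load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, amount REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.5), (3, 75.0);
""")

def reconcile(cur, src, tgt, key):
    """Row-count and key-level reconciliation between source and target."""
    src_count = cur.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    # Keys present in the source but missing from the target.
    missing = cur.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}"
    ).fetchall()
    return {"src_count": src_count, "tgt_count": tgt_count, "missing": missing}

result = reconcile(cur, "src_trades", "tgt_trades", "trade_id")
```

In a real engagement the same counts and EXCEPT/MINUS comparisons would run directly against the warehouse, with failures logged as defects.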
Key Attributes include:
- Team player with a professional and positive approach
- Creative, innovative, and able to think outside the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Skills & Experience:
❖ At least 5 years of experience as a Data Engineer
❖ Hands-on and in-depth experience with Star / Snowflake schema design, data modeling,
data pipelining and MLOps.
❖ Experience with data warehouse technologies (e.g., Snowflake, AWS Redshift)
❖ Experience with AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.)
❖ Proficient in SQL
❖ At least one major programming language (Python / Java)
❖ Experience with Data Analysis Tools such as Looker or Tableau
❖ Experience with pandas, NumPy, scikit-learn, and Jupyter notebooks preferred
❖ Familiarity with Git, GitHub, and JIRA.
❖ Ability to locate & resolve data quality issues
❖ Ability to demonstrate end-to-end data platform support experience
Other Skills:
❖ Individual contributor
❖ Hands-on with programming
❖ Strong analytical and problem solving skills with meticulous attention to detail
❖ A positive mindset and can-do attitude
❖ To be a great team player
❖ To have an eye for detail
❖ Looking for opportunities to simplify, automate tasks, and build reusable components.
❖ Ability to judge suitability of new technologies for solving business problems
❖ Build strong relationships with analysts, business, and engineering stakeholders
❖ Task Prioritization
❖ Familiar with agile methodologies.
❖ Fintech or Financial services industry experience
❖ Eagerness to learn about the Private Equity/Venture Capital ecosystem and the associated secondary market
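Star and snowflake schemas, called out in the skills above, organize a warehouse into fact tables that reference dimension tables by surrogate key. A toy illustration of the pattern using sqlite3 (the schema and data are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# One dimension and one fact table of a minimal star schema.
cur.executescript("""
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT,
        category TEXT
    );
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Fund A', 'Equity'), (2, 'Fund B', 'Debt');
    INSERT INTO fact_sales VALUES (1, 1, 100.0), (2, 1, 150.0), (3, 2, 200.0);
""")

# A typical star-schema query: join the fact to its dimension and aggregate.
rows = cur.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
```

A snowflake schema would further normalize `dim_product` (e.g., splitting `category` into its own table); the fact table stays the same.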
Responsibilities:
o Design, develop and maintain a data platform that is accurate, secure, available, and fast.
o Engineer efficient, adaptable, and scalable data pipelines to process data.
o Integrate and maintain a variety of data sources: different databases, APIs, SaaS products, files, logs, events, etc.
o Create standardized datasets to service a wide variety of use cases.
o Develop subject-matter expertise in tables, systems, and processes.
o Partner with product and engineering to ensure product changes integrate well with the
data platform.
o Partner with diverse stakeholder teams, understand their challenges and empower them
with data solutions to meet their goals.
o Perform data quality checks on data sources, and automate and maintain a quality-control capability.
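The quality-control responsibility above is often automated as a set of rule-based checks applied to each incoming dataset. A dependency-free sketch, where the check names, fields, and sample records are all illustrative:

```python
def run_quality_checks(rows, key_field, required_fields):
    """Run basic data quality checks over a list of record dicts.

    Returns a dict mapping check name -> True (pass) / False (fail).
    """
    keys = [r.get(key_field) for r in rows]
    return {
        # Dataset must not be empty.
        "non_empty": len(rows) > 0,
        # The key must be present and unique across all rows.
        "key_unique": None not in keys and len(set(keys)) == len(keys),
        # Required fields must be populated on every row.
        "no_missing_fields": all(
            r.get(f) not in (None, "") for r in rows for f in required_fields
        ),
    }

records = [
    {"trade_id": 1, "amount": 100.0},
    {"trade_id": 2, "amount": None},   # fails the completeness check
]
results = run_quality_checks(records, "trade_id", ["amount"])
```

In practice such checks would run on a schedule against each source, with failing rule names surfaced to the stakeholder teams mentioned above.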
At AxionConnect Infosolutions Pvt Ltd
Job Location: Hyderabad/Bangalore/Chennai/Pune/Nagpur
Notice period: Immediate - 15 days
1. Python Developer with Snowflake
Job Description :
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file
- Development of Data Analysis, Data Processing engines using Python
- Good Experience in Data Transformation using Python.
- Experience in Snowflake data load using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning is an added advantage.
- Good understanding of data warehouse (DWH) concepts.
- Interpret/analyze business requirements & functional specifications
- Good to have: dbt, Fivetran, and AWS knowledge.
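Loading data into Snowflake from Python, as the bullets above describe, usually pairs the snowflake-connector-python package with staged PUT/COPY INTO statements, and user-defined functions are created with CREATE FUNCTION. Because a live connection requires account credentials, this sketch only assembles the SQL; the stage, table, and function names are hypothetical, and the strings would be executed through a cursor on a real connection:

```python
def copy_file_sql(local_path, stage, table,
                  file_format="(TYPE = CSV SKIP_HEADER = 1)"):
    """Build the PUT + COPY INTO statements to load a local file into a table."""
    return [
        # Upload the file to an internal stage...
        f"PUT file://{local_path} @{stage} AUTO_COMPRESS=TRUE",
        # ...then bulk-load the staged file into the target table.
        f"COPY INTO {table} FROM @{stage} FILE_FORMAT = {file_format}",
    ]

def create_udf_sql(name, args, returns, body):
    """Build a CREATE FUNCTION statement for a SQL user-defined function."""
    return (
        f"CREATE OR REPLACE FUNCTION {name}({args}) "
        f"RETURNS {returns} AS $$ {body} $$"
    )

stmts = copy_file_sql("/tmp/trades.csv", "my_stage", "trades")
udf = create_udf_sql("pct_change", "prev FLOAT, curr FLOAT", "FLOAT",
                     "(curr - prev) / prev * 100")
# With snowflake-connector-python installed, each statement would run as:
#   with snowflake.connector.connect(account=..., user=..., password=...) as con:
#       con.cursor().execute(stmt)
```

For large or continuous loads, Snowpipe or the connector's `write_pandas` helper are common alternatives to hand-built COPY statements.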
We are looking for a Snowflake developer for one of our premium clients, for their PAN India locations.