Job Responsibilities/KRAs:

Responsibilities
- Understand business requirements and actively provide input from a data perspective.
- Experience in SSIS development.
- Experience migrating SSIS packages to the Azure-SSIS Integration Runtime.
- Experience in data warehouse / data mart development and migration.
- Good knowledge of and experience with Azure Data Factory.
- Expert-level knowledge of SQL databases and data warehouses.
- Should know at least one programming language (Python or PowerShell).
- Should be able to analyse and understand complex data flows in SSIS.
- Knowledge of Control-M.
- Knowledge of Azure Data Lake is required.
- Excellent interpersonal/communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision.
- Build simple to complex pipelines and dataflows.
- Work with other Azure stack modules such as Azure Data Lake and SQL DW.

Requirements
- Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field.
- A minimum of 5 years’ experience in a similar role.
- Strong knowledge of database structures and data mining.
- Excellent organizational and analytical abilities.
- Outstanding problem solver.
- Good written and verbal communication skills.
Must Have Skills:
- Solid knowledge of DWH, ETL and Big Data concepts
- Excellent SQL skills (with knowledge of SQL analytic functions)
- Working experience with an ETL tool, e.g. SSIS / Informatica
- Working experience with Azure or AWS big data tools
- Experience implementing data jobs (batch / real-time streaming)
- Excellent written and verbal communication skills in English; self-motivated, with a strong sense of ownership and a readiness to learn new tools and technologies

Preferred Skills:
- Experience with PySpark / Spark SQL
- AWS data tools (AWS Glue, AWS Athena)
- Azure data tools (Azure Databricks, Azure Data Factory)

Other Skills:
- Knowledge of Azure Blob, Azure File Storage, AWS S3, Elasticsearch / RediSearch
- Domain/functional knowledge (across pricing, promotions and assortment)
- Implementation experience with a schema and data validation framework (Python / Java / SQL)
- Knowledge of DQS and MDM

Key Responsibilities:
- Work independently on ETL / DWH / big data projects
- Gather and process raw data at scale
- Design and develop data applications using selected tools and frameworks as required and requested
- Read, extract, transform, stage and load data to selected tools and frameworks as required and requested
- Perform tasks such as writing scripts, web scraping, calling APIs and writing SQL queries
- Work closely with the engineering team to integrate your work into our production systems
- Process unstructured data into a form suitable for analysis
- Analyse processed data
- Support business decisions with ad hoc analysis as needed
- Monitor data performance and modify infrastructure as needed

Responsibility: Smart resource with excellent communication skills
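The "SQL analytic functions" skill called out above refers to window functions such as RANK() and windowed SUM(). A minimal, runnable sketch using Python's built-in sqlite3 module (SQLite 3.25+); the sales table, columns and values are made-up illustrations, not part of any posting:

```python
# Demonstrates SQL analytic (window) functions: RANK() within a
# partition and a partitioned running total, via stdlib sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 300), ("west", 200)])

rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY region)                AS region_total
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
# → ('east', 300, 1, 400)
#   ('east', 100, 2, 400)
#   ('west', 200, 1, 200)
```

The same OVER (PARTITION BY ... ORDER BY ...) syntax carries over to Spark SQL and the warehouse engines named in these postings.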
1) 6-9 years of industry experience, including at least 4 years in an architect role and at least 3-5 years designing and building analytics/data solutions in Azure.
2) Demonstrated in-depth skills with Azure Data Factory (ADF), Azure SQL Server, Azure Synapse and ADLS, with the ability to configure and administer all aspects of Azure SQL Server.
3) Demonstrated experience delivering multiple data solutions as an architect.
4) Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, and related Microsoft and other ETL technologies.
5) DP-200 and DP-201 certifications preferred.
6) Hands-on experience with Power BI and Azure Databricks is good to have.
7) Should have good communication and presentation skills.
Required Skills and Experience

General
- Strong IT background, with at least 2 to 4 years of working experience
- Strong understanding of data integration and ETL methodologies
- Demonstrated ability to multi-task
- Excellent English communication skills
- A desire to be part of a growing company. You'll have two core responsibilities (Client Work and Company Building), and we expect dedication to both.
- Willingness to learn and work on new technologies
- Should be a quick self-learner

Tools:
1. Good knowledge of Power BI and Tableau
2. Good experience handling data in Excel
The Data Engineer is at the core of our customer’s success. We are looking for talented and experienced engineers to join our team. If you are a problem solver and enjoy working in an exciting and fast-paced environment, this may be the perfect opportunity for you to join us and take your career to the next level.

Responsibilities
- Communicate effectively with customers, including setting expectations for callbacks and follow-up on their issues.
- Handle technical issues of varying complexity and help other members of the team.
- Collaborate with other support teams, Engineering and other internal departments to help resolve critical issues.
- Cross-train on multiple technologies to effectively build and support the product/technology portfolio.
- Troubleshoot, diagnose and resolve customer issues independently, making use of the resources available to you.
- Keep all ongoing cases documented and up to date in the case management system.
- Promote and maintain a high-quality, professional, service-oriented DigiTop image among internal and external customers.
- Work in adherence to the processes and procedures defined in the organization.
- Maintain and continually upgrade technical understanding of products and technologies.
- Build and maintain solid working relationships with the members of the team.
- Write and/or edit knowledge articles for every issue resolved.

Required Skills and Experience

General
- Strong IT background, with at least 2 to 4 years of working experience
- Strong understanding of data integration and ETL methodologies
- Degree in Computer Science or equivalent experience
- Demonstrated ability to multi-task
- Excellent English communication skills
- A desire to be part of a growing company. You'll have two core responsibilities (Client Work and Company Building), and we expect dedication to both.
- Willingness to learn and work on new technologies
- Should be a quick self-learner

Technical
- Extensive experience using ETL methodology to support data extraction, transformation, validation and loading.
- Database skills (loading data from Excel/CSV/text files to staging, writing SQL statements/functions/stored procedures, running SSIS packages).
- Knowledge of integrating with different data sources such as SQL, web services, APIs, text files and CSV is mandatory.
- Extensive experience troubleshooting and solving complex technical problems.
- Experience with at least one ETL tool (SSIS).
- Knowledge of other ETL tools such as SnapLogic, IBM DataStage, MuleSoft and Informatica is a plus.
- Knowledge of other/diversified technologies is an added advantage and will be given additional weight.
- Experience with visualization tools (Power BI, Qlik Sense, Tableau) is preferred.
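The database skill listed above (loading data from CSV/text files into a staging table) can be sketched with nothing but Python's standard library; in the role itself this would typically be an SSIS package or bulk-load utility. The file contents, table name and columns here are hypothetical:

```python
# Sketch of a CSV-to-staging-table load using stdlib csv + sqlite3.
# io.StringIO stands in for a customers.csv arriving from a source system.
import csv
import io
import sqlite3

raw = io.StringIO("id,name,country\n1,Asha,IN\n2,Liam,IE\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, country TEXT)")

# DictReader yields one dict per row; named placeholders map header
# names to columns, so the insert survives column reordering in the file.
reader = csv.DictReader(raw)
conn.executemany(
    "INSERT INTO stg_customers (id, name, country) VALUES (:id, :name, :country)",
    reader,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM stg_customers").fetchone()[0]
print(count)  # → 2
```

A production load would add the validation steps the postings mention: row counts against the source, type checks, and rejection of malformed rows to an error table.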
- Minimum of 4 years’ experience working on DW/ETL projects and expert hands-on working knowledge of ETL tools.
- Experience with data management and data warehouse development: star schemas, data vaults, RDBMS and ODS; change data capture; slowly changing dimensions; data governance; data quality; partitioning and tuning; data stewardship; survivorship; fuzzy matching; concurrency; vertical and horizontal scaling; ELT and ETL; Spark, Hadoop, MPP, RDBMS.
- Experience with DevOps architecture, implementation and operation.
- Hands-on working knowledge of Unix/Linux.
- Building complex SQL queries; expert SQL and data analysis skills, with the ability to debug and fix data issues.
- Complex ETL program design and coding.
- Experience in shell scripting and batch scripting.
- Good communication (oral and written) and interpersonal skills.
- Work closely with business teams to understand their business needs; participate in requirements gathering while creating artifacts and seeking business approval.
- Help the business define new requirements; participate in end-user meetings to derive and define business requirements, propose cost-effective solutions for data analytics, and familiarize the team with customer needs, specifications, design targets and techniques to support task performance and delivery.
- Propose good designs and solutions, adhering to best design and standard practices.
- Review and propose industry-best tools and technologies for ever-changing business rules and data sets.
- Conduct proofs of concept (POCs) with new tools and technologies to derive convincing benchmarks.
- Prepare the plan; design and document the architecture, high-level topology design and functional design; review these with customer IT managers; and provide detailed knowledge to the development team to familiarize them with customer requirements, specifications, design standards and techniques.
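Among the warehousing concepts listed above, "slowly changing dimensions" (Type 2) means expiring the current dimension row when an attribute changes and inserting a new current row, so history is preserved. A minimal pure-Python sketch with a hypothetical customer dimension; real implementations would use SQL MERGE or ETL-tool SCD components:

```python
# Type 2 slowly-changing-dimension update: close the old version of a
# changed row (set valid_to) and append a new current version.
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """dim_rows: list of dicts with key/attributes/valid_from/valid_to.
    incoming: {business_key: {attribute: value}} from the source system."""
    for key, attrs in incoming.items():
        current = next((r for r in dim_rows
                        if r["key"] == key and r["valid_to"] is None), None)
        if current is None:
            # Brand-new business key: insert as the current version.
            dim_rows.append({"key": key, **attrs,
                             "valid_from": today, "valid_to": None})
        elif any(current[k] != v for k, v in attrs.items()):
            current["valid_to"] = today  # expire the old version
            dim_rows.append({"key": key, **attrs,
                             "valid_from": today, "valid_to": None})
    return dim_rows

dim = [{"key": 1, "city": "Pune",
        "valid_from": date(2020, 1, 1), "valid_to": None}]
dim = apply_scd2(dim, {1: {"city": "Mumbai"}}, date(2021, 6, 1))
# dim now holds two rows: the expired Pune row and a current Mumbai row.
```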
- Review code developed by other programmers; mentor, guide and monitor their work, ensuring adherence to programming and documentation policies.
- Work with functional business analysts to ensure that application programs function as defined.
- Capture user feedback/comments on the delivered systems and document them for the client’s and project manager’s review.
- Review all deliverables before final delivery to the client for quality adherence.

Technologies (select based on requirement):
- Databases: Oracle, Teradata, Postgres, SQL Server, Big Data, Snowflake, or Redshift
- Tools: Talend, Informatica, SSIS, Matillion, Glue, or Azure Data Factory
- Utilities for bulk loading and extracting
- Languages: SQL, PL/SQL, T-SQL, Python, Java, or Scala
- J/ODBC, JSON
- Data virtualization; data services development
- Service delivery: REST, web services
- Data virtualization delivery: Denodo
- ELT, ETL
- Cloud certification: Azure
- Complex SQL queries
- Data ingestion, data modeling (domain), consumption (RDBMS)