About Blackhawk Network: Blackhawk Network is building a digital platform and products that bring people and brands together. We facilitate cross-channel payments via cash-in, cash-out and mobile payments. By leveraging blockchain, smart contracts, serverless technology and real-time payment systems, we are unlocking the next million users through innovation. Our employees are our biggest assets! Come find out how we engage with the biggest brands in the world. We look for people who collaborate, who inspire, and whose passion makes a difference as part of a team striving for global excellence. You can expect a strong investment in your professional growth and a dedication to crafting a successful, sustainable career for you. Our teams are composed of highly talented and passionate 'A' players who are also invested in mentoring others and bringing out their best qualities. Our vibrant culture and high expectations will kindle your passion and bring out the best in you!

As a leader in branded payments, we are building a strong, diverse team and expanding in Asia Pacific: we are hiring in Bengaluru, India! This is an amazing opportunity for problem solvers who want to be part of an innovative and creative engineering team that values your contribution to the company. If this role has your name written all over it, please apply now with a resume so that we can explore further and get connected.

If you enjoy building world-class payment applications, are passionate about pushing the boundaries of scale and availability in the cloud, and want to leverage next-horizon technologies, rapidly deliver features to production, make data-driven decisions on product development, and collaborate and innovate with like-minded experts, then this is your ideal job. Blackhawk is seeking passionate backend engineers at all levels to build our next generation of payment systems on public cloud infrastructure. Our team enjoys working together to contribute to meaningful work seen by millions of merchants worldwide.

As a Senior SDET, you will work closely with data engineers to automate newly developed features and to cover manual testing of new data ETL jobs, data pipelines and reports. You will own the complete architecture of the automation framework, and will plan and design automation for data ingestion, transformation and reporting/visualization. You will build high-quality automation frameworks covering end-to-end testing of the data platforms, ensure test data setup, and pre-empt post-production issues through high-quality testing in the lower environments (a minimal sketch of such a check follows the qualifications list below). You will get the opportunity to contribute at all levels of the test pyramid. You will also work with customer success and product teams to replicate post-production release issues.

Key Qualifications:
- Bachelor's degree in Computer Science, Engineering or related fields
- 5+ years of experience testing data ingestion, visualization and information delivery systems
- Real passion for data quality, reconciliation and uncovering hard-to-find scenarios and bugs
- Proficiency in at least one programming language (preferably Python or Java)
- Expertise in end-to-end testing and data validation for ETL (e.g. DataStage, Matillion) and BI platforms (e.g. MicroStrategy, Power BI)
- Experience working with big data technologies such as Hadoop and MapReduce is desirable
- Excellent analytical, problem solving and communication skills
- Self-motivated, results-oriented and deadline-driven
- Experience with databases and with data visualization and dashboarding tools is desirable
- Experience working with Amazon Web Services (AWS) and Redshift is desirable
- Excellent knowledge of the software development lifecycle, testing methodologies, QA terminology, processes and tools
- Experience with automation frameworks and tools such as TestNG, JUnit and Selenium
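To make the kind of validation this role owns concrete, below is a minimal, pytest-style sketch of a source-to-warehouse reconciliation check. The connection URLs, table names and the `run_query` helper are illustrative assumptions, not part of any existing Blackhawk framework.

```python
import sqlalchemy

# Hypothetical connection URLs; real values would come from environment/config.
SOURCE_URL = "postgresql://source-db/payments"
TARGET_URL = "redshift+psycopg2://warehouse/analytics"

def run_query(url: str, sql: str):
    """Open a connection, run a query, and return all rows."""
    engine = sqlalchemy.create_engine(url)
    with engine.connect() as conn:
        return conn.execute(sqlalchemy.text(sql)).fetchall()

def test_row_counts_match():
    # Reconciliation: every source row should land in the target fact table.
    src = run_query(SOURCE_URL, "SELECT COUNT(*) FROM transactions")
    tgt = run_query(TARGET_URL, "SELECT COUNT(*) FROM fact_transactions")
    assert src[0][0] == tgt[0][0]

def test_no_null_business_keys():
    # Data-quality gate: business keys must survive the ETL intact.
    rows = run_query(TARGET_URL,
        "SELECT COUNT(*) FROM fact_transactions WHERE transaction_id IS NULL")
    assert rows[0][0] == 0
```

Run with pytest, checks like these sit at the integration layer of the test pyramid; unit tests on transformation logic and report-level checks would sit below and above them.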
Key Responsibilities:
- Leverage the batch computation frameworks and our workflow management platform (Airflow) to help build out different data pipelines
- Lower latency and bridge the gap between our production systems and our data warehouse by rethinking and optimizing our core data pipeline jobs
- Work with the client to create and optimize critical batch processing jobs in Spark
- Develop production-grade Scala/Spark and Python/Spark code on Azure Databricks (illustrated in the sketch below)

Skills and Experience:
- Strong engineering background and an interest in data
- Good understanding of data analysis using SQL queries
- Strong command of Python or Scala as a programming language on Azure Databricks
- Experience developing and maintaining distributed systems built with Azure Databricks or native Apache Spark
- Experience building libraries and tooling that provide abstractions for users accessing data
- Experience writing and debugging ETL jobs using a distributed data framework (Spark, Hadoop MapReduce, etc.) on Azure Databricks
- Experience optimizing the end-to-end performance of distributed systems
- Ability to recommend and implement ways to improve data reliability, efficiency and quality
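For context, a batch job of the kind described above could look like the following PySpark sketch; the lake paths, column names and aggregation are invented for illustration and are not the client's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is provided; getOrCreate() also works locally.
spark = SparkSession.builder.appName("orders-daily-batch").getOrCreate()

# Illustrative input path on the lake; real jobs would parameterize this.
orders = spark.read.parquet("/mnt/raw/orders/")

# Aggregate completed orders into a daily revenue summary.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"),
         F.count("*").alias("order_count"))
)

# Partition by date so downstream reads can prune efficiently.
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("/mnt/curated/daily_revenue/")
```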
Position 1: Azure Data Engineer
- Experience: 3 to 5 years
- Must-have tech stack: Azure Platform + Spark + Scala
- Nice to have: Power BI, Azure certification
- Budget: 14 LPA
- Notice period: within 30 days only

Position 2: Lead Azure Data Engineer
- Experience: 6 to 8.5 years (can go up to 9.5 if it falls within the budget)
- Must-have tech stack: Azure Platform + Spark + Scala + Power BI
- Other experience: Senior Data Engineer (1+ years) / Lead Engineer / Architect
- Nice to have: Azure certification
- Budget: 20 LPA
- Notice period: within 30 days only
Work Location: Whitefield, Bangalore
Work Days: Sunday to Friday
Shift: Day Time
Week Off: Friday & Saturday

Key Skills:
• Strong knowledge of BI and data visualization; experience in Power BI (DAX + Power Query + Power BI Service + Power BI Desktop)
• Should have experience with Power BI mobile dashboards
• Strong knowledge of SQL
• Good knowledge of DWH concepts
• Work as an independent contributor at the client location
• Implement access control and impose the required security
• Candidate must have very good communication skills
Rorko is looking for a Data Visualization Engineer with up to 1 year of experience in relevant fields. The candidate should be able to represent data in a manner that non-technical people can understand, and should be able to create dynamic data visualizations in an interactive, web-based format to help our clients make meaningful decisions.
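As one illustration of that deliverable, a short Plotly script like the sketch below turns tabular data into a self-contained, interactive HTML page that a non-technical viewer can open in any browser. The sample data and column names are invented for the example.

```python
import pandas as pd
import plotly.express as px

# Invented sample data standing in for a client dataset.
df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "region": ["North", "North", "South", "South"],
    "sales": [120, 150, 90, 140],
})

# Interactive bar chart with hover tooltips, zoom and legend toggling built in.
fig = px.bar(df, x="month", y="sales", color="region",
             title="Monthly Sales by Region")

# Export as a standalone web page that needs no Python to view.
fig.write_html("sales_dashboard.html")
```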
- Prior experience in business analytics and knowledge of related analysis or visualization tools
- Expecting a minimum of 2-4 years of relevant experience
- You will be managing a team of 3 currently
- Take ownership of developing and managing one of the largest and richest food (recipe, menu and CPG) databases
- Interact with cross-functional teams (Business, Food Science, Product and Tech) on a regular basis to plan the future of client and internal food data management
- Should have a natural flair for playing with numbers and data, and a keen eye for detail and quality
- Will spearhead the Ops team in achieving its targets while maintaining staunch attentiveness to the coverage, completeness and quality of the data
- Will plan and manage projects while identifying opportunities to optimize costs and processes
- Good business acumen in creating logic and process flows, plus quick and smart decision-making skills, are expected
- Will also be responsible for the recruitment, induction and training of new members
- Set competitive team targets; guide and support team members to go the extra mile and achieve them

Added Advantages:
- Experience in a food-sector / insights company
- A passion for exploring different cuisines
- Understands industry-related jargon and has a natural flair for learning more about anything related to food
Hi All, greetings from CareerNet Technologies! It's a pleasure talking to you. Please find the details below:

Role: Power BI Developer
Company: KOCH (https://www.kochind.com)
Type: Permanent (direct payroll)
Education: Any full-time graduate
Experience: 4+ years
Job Location: Kundalahalli, near Brookefield Hospital, Bangalore - 560037

As discussed, PFA the JDs, the company details and its principles.

Job Description: 3+ years' experience developing and implementing enterprise-scale reports and dashboards. Proficiency with MS Power BI / SSRS. Knowledge of logical and physical data modeling concepts (relational and dimensional). Understanding of structured query language (SQL).
Power BI
Experience: 2-3 years
Skills: Power BI
Start date: looking for immediate joiners
Contract position for 3 months
Location: Bangalore
Job Designation: Power BI Developer

1. Develop Power BI reports and effective dashboards after gathering and translating end-user requirements.
2. Ability to turn large amounts of raw data into actionable information.
3. Be responsible for the performance and usability of the system.
4. Perform optimization of MS SQL queries and SSIS packages.
5. Provide solutions using SSAS cubes (Tabular/Multidimensional models).
6. Must have excellent experience with MS SQL Server: writing queries, joins, stored procedures, functions, etc. (a sketch of a parameterized query follows below).

Candidate should have knowledge of:
- Power BI with DAX queries, multiple data sources and data relationships
- Power BI architecture
- Data modelling, calculations, conversions, and scheduling data refreshes in Power BI Desktop
- Hands-on, thorough knowledge of scripting, data source integration and advanced GUI development in Power BI
- Power BI custom add-on development and deployment
- Connecting Power BI to on-premise and cloud computing platforms
- Providing optimized SQL scripts for complex scenarios
- Business intelligence systems and tools (Microsoft Power BI, SQL Server Reporting Services)
- Good understanding of the processes of data quality, data cleansing and data transformation
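As a rough illustration of item 6, the sketch below runs a parameterized join against SQL Server from Python via pyodbc; the driver string, database, tables and columns are assumptions made up for the example.

```python
import pyodbc

# Hypothetical connection; a real one would take credentials from config.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameterized join: '?' placeholders avoid SQL injection and plan-cache churn.
cursor.execute(
    """
    SELECT c.customer_name, SUM(o.amount) AS total
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.customer_id = o.customer_id
    WHERE o.order_date >= ?
    GROUP BY c.customer_name
    """,
    ("2024-01-01",),
)
for name, total in cursor.fetchall():
    print(name, total)
```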
Critical Tasks and Expected Contributions/Results:
The role is primarily focused on the design, development and testing of ETL workflows (using Talend), as well as the batch management and error handling processes (a generic sketch of this pattern follows below). Build business intelligence applications using tools like Power BI. Additional responsibilities include documenting technical specifications and related project artefacts.
- Gather requirements and propose possible ETL solutions for the in-house designed data warehouse
- Analyze and translate functional specifications and change requests into technical specifications
- Design and create star-schema data models
- Design, build and implement business intelligence solutions using Power BI
- Develop, implement and test ETL program logic
- Handle deployment and support any related issues

Key Competency:
- A good understanding of the concepts and best practices of data warehouse ETL design, and the ability to apply these suitably to solve specific business needs
- Expert knowledge of an ETL tool such as Talend
- More than 8 years' experience designing and developing ETL work packages, with demonstrable expertise in the ETL tool Talend
- Knowledge of BI tools like Power BI is required
- Ability to follow functional ETL specifications and to challenge business logic and schema design where appropriate, as well as manage time effectively
- Exposure to performance tuning is essential
- Good organisational skills
- Methodical and structured approach to design and development
- Good interpersonal skills
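Talend jobs are assembled graphically, but the batch-management and error-handling pattern the role describes can be sketched in plain Python. The step functions, retry counts and logging choices below are illustrative assumptions only, not a Talend feature.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_step(name, fn, retries=3, delay=5):
    """Run one ETL step with retries; re-raise after the last attempt
    so the batch controller can mark the whole run as failed."""
    for attempt in range(1, retries + 1):
        try:
            fn()
            log.info("step %s succeeded (attempt %d)", name, attempt)
            return
        except Exception:
            log.exception("step %s failed (attempt %d/%d)", name, attempt, retries)
            if attempt == retries:
                raise
            time.sleep(delay)

# Hypothetical step functions: extract from source, conform to the star schema, load.
def extract():   ...
def transform(): ...
def load():      ...

if __name__ == "__main__":
    for step_name, step in [("extract", extract),
                            ("transform", transform),
                            ("load", load)]:
        run_step(step_name, step)
```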