Work Days: Sunday through Thursday
Week Off: Friday & Saturday
Shift: Day

Key Responsibilities:
- Create, design, and develop data models
- Prepare plans for all ETL (Extract/Transform/Load) procedures and architectures
- Validate results and create business reports
- Monitor and tune data loads and queries
- Develop and prepare a schedule for a new data warehouse
- Analyze large databases and recommend appropriate optimizations
- Administer all requirements and design functional specifications for data
- Provide support across the Software Development Life Cycle
- Prepare code designs and ensure their efficient implementation
- Evaluate all code and ensure the quality of all project deliverables
- Monitor data warehouse work and provide subject matter expertise
- Hands-on BI practices, data structures, data modeling, and SQL skills

Hard Skills for a Data Warehouse Developer:
- Hands-on experience with ETL tools, e.g., DataStage, Informatica, Pentaho, Talend
- Sound knowledge of SQL
- Experience with SQL databases such as Oracle, DB2, and SQL Server
- Experience using data warehouse platforms, e.g., SAP, Birst
- Experience designing, developing, and implementing data warehouse solutions
- Project management and system development methodology
- Ability to proactively research solutions and best practices

Soft Skills for Data Warehouse Developers:
- Excellent analytical skills
- Excellent verbal and written communication
- Strong organizational skills
- Ability to work on a team as well as independently
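The Extract/Transform/Load responsibilities above can be sketched minimally. The snippet below is an illustrative toy, not tied to any tool named in the posting: it uses Python's built-in sqlite3 in place of a real warehouse, and the table and column names are invented for the example.

```python
import sqlite3

def etl(rows):
    """Extract raw rows, transform them, and load them into a reporting table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales_fact (region TEXT, amount REAL)")
    # Transform: normalise region names and drop non-positive amounts.
    cleaned = [(region.strip().upper(), amt) for region, amt in rows if amt > 0]
    # Load: bulk-insert the cleaned rows.
    conn.executemany("INSERT INTO sales_fact VALUES (?, ?)", cleaned)
    return conn

conn = etl([(" north ", 120.0), ("South", 80.0), ("East", -5.0)])
total = conn.execute("SELECT SUM(amount) FROM sales_fact").fetchone()[0]
print(total)  # 200.0
```

Real ETL work adds scheduling, error handling, and incremental loads on top of this extract/transform/load skeleton.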
Exp: 1 to 3 yrs
Edu: BE/B.Tech/M.Tech
Work Location: ITPL, Bangalore

Job Description:
- 1-3 years of experience implementing RDBMS, data warehouse, or other data projects.
- Strong troubleshooting skills and the ability to bring problems to resolution.
- Strong Oracle PL/SQL programming (PL/SQL blocks, procedures, functions, packages, database-supplied packages). Must have.
- Ability to write complex SQL statements for data validation, troubleshooting, and implementation of custom business processes. Must have.
- Experience working in a data warehousing environment; familiar with ETL concepts. Must have.
- Excellent written and verbal communication skills; must be fluent in English. Must have.
- Understanding of DevOps and cloud (AWS/Google/Azure) is a plus.
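The "complex SQL statements for data validation" requirement above can be illustrated with a small reconciliation query. This is a sketch only: Python's built-in sqlite3 stands in for Oracle, and the table names (staging_orders, dw_orders) are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging_orders (id INTEGER, amount REAL);
CREATE TABLE dw_orders      (id INTEGER, amount REAL);
INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO dw_orders      VALUES (1, 10.0), (2, 20.0);
""")
# Validation query: rows present in staging but missing from the warehouse.
missing = conn.execute("""
    SELECT s.id
    FROM staging_orders s
    LEFT JOIN dw_orders d ON d.id = s.id
    WHERE d.id IS NULL
""").fetchall()
print(missing)  # [(3,)]
```

The same anti-join pattern, written in Oracle SQL or wrapped in a PL/SQL procedure, is a common building block for load-completeness checks.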
Required:
- 5-10 years of experience in the BI, Reporting & Data Warehousing domain
- Experience in production support for data queries: monitoring, analysis, and triage of issues
- Expertise in root-cause analysis (RCA) and collaborating with development teams on correction of errors (CoE)
- Experience using BI tools such as MicroStrategy, Qlik, Power BI, and Business Objects
- Expertise in data analysis and writing SQL queries to provide insights into production data
- Good communication and collaboration skills: liaise with product, operations, and business teams to understand requirements and provide data extracts and reports as needed
- Experience working in an enterprise environment, with good discipline and adherence to SLAs
- Good understanding of ticketing tools (e.g., JIRA, ServiceNow, Rally, ChangeGear) to track requests and manage the lifecycle of multiple requests
- Orientation toward addressing the root cause of any issue, i.e., collaborate and follow up with development teams to ensure permanent fixes and prevention are given high priority
- Ability to create SOPs (standard operating procedures) in Confluence/wiki so the support team has a good reference to utilise
- Experience with relational database (RDBMS) and data-mart technologies such as DB2, Redshift, SQL Server, MySQL, and Netezza
- Ability to monitor ETL jobs in the AWS stack with tools like Tidal and Autosys
- Self-starter and collaborator, able to independently acquire the knowledge required to succeed in the job
- Experience with big data platforms such as Amazon Redshift
- Ability to mentor and lead Data Ops team members to deliver a high-quality customer experience and timely resolution of issues
- Adherence to process orientation
Responsibilities:
- Address queries for existing reports
- Ad-hoc data requests for product and business stakeholders:
  - Transactions per day, per entity (merchant, card type, card category)
  - Custom extracts
- Production support:
  - Job failure resolution: re-runs based on SOPs
  - Report failure resolution
  - Resolution of the issue
  - Root-cause analysis and handover to the Dev team
- Track and report the health of the system:
  - Create a matrix for issue volume
  - Coordinate and set up an escalation workflow
  - Provide regular status reports for stakeholder review
About Blackhawk Network:
Blackhawk Network is building a digital platform and products that bring people and brands together. We facilitate cross-channel payments via cash-in, cash-out, and mobile payments. By leveraging blockchain, smart contracts, serverless technology, and real-time payment systems, we are unlocking the next million users through innovation. Our employees are our biggest asset! Come find out how we engage with the biggest brands in the world. We look for people who collaborate, who are inspirational, and who have the passion to make a difference by working as a team while striving for global excellence. You can expect a strong investment in your professional growth and a dedication to crafting a successful, sustainable career for you. Our teams are composed of highly talented and passionate 'A' players who are also invested in mentoring and bringing out the best qualities in others. Our vibrant culture and high expectations will kindle your passion and bring out the best in you! As a leader in branded payments, we are building a strong, diverse team and expanding in Asia Pacific: we are hiring in Bengaluru, India! This is an amazing opportunity for problem solvers who want to be part of an innovative and creative engineering team that values your contribution to the company. If this role has your name written all over it, please apply now with a resume so that we can explore further and get connected. If you enjoy building world-class payment applications, are passionate about pushing the boundaries of scale and availability in the cloud, leveraging next-horizon technologies, rapidly delivering features to production, making data-driven decisions on product development, and collaborating and innovating with like-minded experts, then this would be your ideal job. Blackhawk is seeking passionate backend engineers at all levels to build our next generation of payment systems on public cloud infrastructure.
Our team enjoys working together to contribute to meaningful work seen by millions of merchants worldwide. As a Senior SDET, you will work closely with data engineers to automate developed features and manually test new data ETL jobs, data pipelines, and reports. You will own the complete architecture of the automation framework, and plan and design automation for data ingestion, transformation, and reporting/visualization. You will build high-quality automation frameworks covering end-to-end testing of the data platforms, ensure test data setup, and pre-empt post-production issues through high-quality testing in the lower environments. You will get an opportunity to contribute at all levels of the test pyramid. You will also work with customer success and product teams to replicate post-release production issues.

Key Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5+ years of experience testing data ingestion, visualization, and info-delivery systems
- Real passion for data quality and reconciliation, and for uncovering hard-to-find scenarios and bugs
- Proficiency in at least one programming language (preferably Python or Java)
- Expertise in end-to-end testing and data validation for ETL (e.g., DataStage, Matillion) and BI platforms (e.g., MicroStrategy, Power BI)
- Experience working with big data technologies such as Hadoop and MapReduce is desirable
- Excellent analytical, problem-solving, and communication skills
- Self-motivated, results-oriented, and deadline-driven
- Experience with databases and with data visualization and dashboarding tools is desirable
- Experience working with Amazon Web Services (AWS) and Redshift is desirable
- Excellent knowledge of the software development lifecycle, testing methodologies, QA terminology, processes, and tools
- Experience with automation frameworks and tools such as TestNG, JUnit, and Selenium
Job Description:
We are looking for a Data Engineer who will be responsible for collecting, storing, processing, and analyzing huge sets of data coming from different sources.

Responsibilities:
- Work with big data tools and frameworks to provide requested capabilities
- Identify development needs in order to improve and streamline operations
- Develop and manage BI solutions
- Implement ETL processes and data warehousing
- Monitor performance and manage infrastructure

Skills:
- Proficient understanding of distributed computing principles
- Proficiency with Hadoop and Spark
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming
- Good knowledge of data querying tools: SQL and Hive
- Knowledge of various ETL techniques and frameworks
- Experience with Python, Java, or Scala (at least one)
- Experience with cloud services such as AWS or GCP
- Experience with NoSQL databases such as DynamoDB and MongoDB is an advantage
- Excellent written and verbal communication skills
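The stream-processing systems mentioned above (Kafka, Spark Streaming) are built around windowed aggregation of event streams. As a rough illustration of the pattern only (a real deployment would use those frameworks, not plain Python), here is a toy tumbling-window count; the event shape is invented for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_sec=60):
    """Group (timestamp, key) events into fixed non-overlapping windows
    and count occurrences per (window, key)."""
    counts = defaultdict(int)
    for ts, key in events:
        # Each event falls into exactly one window, keyed by its start time.
        window_start = (ts // window_sec) * window_sec
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (70, "view"), (95, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

Spark Streaming's windowed operators and Kafka Streams' windowed aggregations apply this same grouping continuously over unbounded input.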
Strong exposure to ETL / Big Data / Talend / Hadoop / Spark / Hive / Pig. To be considered for a Senior Data Engineer position, a candidate must have a proven track record of architecting data solutions on current and advanced technical platforms. They must have the leadership ability to lead a team providing data-centric solutions with best practices and modern technologies in mind. They build collaborative relationships across all levels of the business and the IT organization. They possess analytical and problem-solving skills, with the ability to research and provide appropriate guidance, synthesize complex information, and extract business value. They have the intellectual curiosity and ability to deliver solutions with creativity and quality, and work effectively with the business and customers to obtain business value for the requested work. They are able to communicate technical results to both technical and non-technical users using effective storytelling techniques and visualizations, and have demonstrated the ability to perform high-quality, innovative work both independently and collaboratively.
Exciting opportunity for a contractor to work with a start-up firm in the combined product and services industry. We are looking for someone with rich experience in the skills below to join us immediately. This role is for 1 month, during which the person will work from the client site in Paris to understand the system architecture and document it. Contract extension for this role will depend purely on individual performance. Since the requirement is immediate and critical, we need someone who can join us soon and travel to Paris in December.
- Hands-on experience handling multiple data sources/datasets
- Experience in a data/BI architect role
- Expert in SSIS, SSRS, and SSAS
- Knowledge of writing MDX queries
- Technical document preparation
- Excellent communication
- Process oriented
- Strong project management
- Able to think out of the box and suggest ideas for better solutions
- Outstanding team player with a positive attitude
Who we are?
Searce is a Cloud, Automation & Analytics led business transformation company focused on helping futurify businesses. We help our clients become successful by helping them reimagine 'what's next' and then enabling them to realize that 'now'. We processify, saasify, innovify & futurify businesses by leveraging Cloud | Analytics | Automation | BPM.

What we believe?
- Best practices are overrated: implementing best practices can only make one 'average'.
- Honesty and transparency: we believe in the naked truth. We do what we tell and tell what we do.
- Client partnership: a client-vendor relationship? No. We partner with clients instead. And our sales team comprises 100% of our clients.

How we work?
It's all about being Happier first. And the rest follows. Searce work culture is defined by HAPPIER.
- Humble: Happy people don't carry egos around. We listen to understand, not to respond.
- Adaptable: We are comfortable with uncertainty, and we accept change well. That's what life is about.
- Positive: We are super positive about work and life in general. We love to forget and forgive. We don't hold grudges; we don't have the time or space for them.
- Passionate: We are as passionate about the great vada-pao vendor across the street as about Tesla's newest model. Passion is what drives us to work and makes us deliver the quality we deliver.
- Innovative: Innovate or die. We love to challenge the status quo.
- Experimental: We encourage curiosity and making mistakes.
- Responsible: Driven. Self-motivated. Self-governing teams. We own it.
We welcome *really unconventional* creative thinkers who can work in an agile, flexible environment. We are a flat organization with unlimited growth opportunities and small team sizes, wherein flexibility is a must, mistakes are encouraged, creativity is rewarded, and excitement is required.
Introduction
When was the last time you thought about rebuilding your smartphone charger using solar panels on your backpack, OR changed the sequencing of switches in your bedroom (on your own, of course) to make it more meaningful, OR pointed out an engineering flaw in the sequencing of traffic signal lights to a fellow passenger while he gave you a blank look? If the last time this happened was more than 6 months ago, you are a dinosaur for our needs. If it was less than 6 months ago, did you act on it? If yes, then let's talk.
We are quite keen to meet you if:
- You eat, dream, sleep, and play with cloud data stores and engineering your processes on cloud architecture.
- You have an insatiable thirst for exploring improvements, optimizing processes, and motivating people.
- You like experimenting, taking risks, and thinking big.
3 things this position is NOT about:
- This is NOT just a job; this is a passionate hobby for the right kind.
- This is NOT a boxed position. You will code, clean, test, build, and recruit, and you will feel that this is not really 'work'.
- This is NOT a position for people who like to spend more time talking than doing.
3 things this position IS about:
- Attention to detail matters.
- Roles, titles, and ego do not matter; getting things done matters; getting things done quicker and better matters the most.
- Are you passionate about learning new domains and architecting solutions that could save a company millions of dollars?
Roles and Responsibilities:
- Drive and define database design and development of real-time, complex products.
- Strive for excellence in customer experience, technology, methodology, and execution.
- Define and own end-to-end architecture from the definition phase to the go-live phase.
- Define reusable components/frameworks, common schemas, standards, and tools to be used, and help bootstrap the engineering team.
- Performance tuning of application and database, and code optimizations.
- Define database strategy, database design and development standards and SDLC, database customization and extension patterns, database deployment and upgrade methods, database integration patterns, and data governance policies.
- Architect and develop database schemas, indexing strategies, views, and stored procedures for cloud applications.
- Assist in defining the scope and sizing of work; analyze and derive NFRs; participate in proof-of-concept development.
- Contribute to innovation and continuous enhancement of the platform.
- Define and implement a strategy for data services to be used by cloud and web-based applications.
- Improve the performance, availability, and scalability of the physical database, including the database access layer, database calls, and SQL statements.
- Design robust cloud management implementations, including orchestration and catalog capabilities.
- Architect and design distributed data processing solutions using big data technologies (an added advantage).
- Demonstrate thought leadership in cloud computing across multiple channels and become a trusted advisor to decision-makers.

Desired Skills:
- Experience with data warehouse design, ETL (Extraction, Transformation & Load), and architecting efficient software designs for DW platforms.
- Hands-on experience in the big data space (the Hadoop stack: M/R, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.); knowledge of NoSQL stores is a plus.
- Knowledge of other transactional database management systems, open database systems, and NoSQL databases (MongoDB, Cassandra, HBase, etc.) is a plus.
- Good knowledge of data management principles such as data architecture, data governance, very large database (VLDB) design, distributed database design, data replication, and high availability.
- Must have experience designing large-scale, highly available, fault-tolerant OLTP data management systems.
- Solid knowledge of at least one industry-leading RDBMS such as Oracle, SQL Server, DB2, or MySQL.
- Expertise in providing data architecture solutions and recommendations that are technology-neutral.
- Experience in architecture consulting engagements is a plus.
- Deep understanding of technical and functional designs for databases, data warehousing, reporting, and data mining.

Education & Experience:
- Bachelors in Engineering or Computer Science (preferably from a premier school); advanced degree in Engineering, Mathematics, Computer Science, or Information Technology. A highly analytical aptitude and a strong 'desire to deliver' outlive those fancy degrees! More so if you have been a techie since age 12.
- 2-5 years of experience in database design & development.
- 0- years of experience with AWS, Google Cloud Platform, or Hadoop.
- Experience working in a hands-on, fast-paced, creative, entrepreneurial environment in a cross-functional capacity.
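As a small illustration of the "indexing strategies" responsibility above: an equality predicate on an indexed column should be served by an index search rather than a full table scan, which is easy to verify with the database's query plan. A sketch using Python's built-in sqlite3 standing in for a production RDBMS; the table and index names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (id INTEGER PRIMARY KEY, merchant TEXT, amount REAL)"
)
# Index chosen to support the common lookup pattern: filter by merchant.
conn.execute("CREATE INDEX idx_payments_merchant ON payments(merchant)")

# Ask the planner how it would execute a merchant-filtered aggregate.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM payments WHERE merchant = ?",
    ("acme",),
).fetchall()
print(plan)
```

On Oracle, SQL Server, or MySQL the equivalent check is `EXPLAIN` / `EXPLAIN PLAN`; the principle of designing indexes around actual query predicates is the same.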
Critical Tasks and Expected Contributions/Results:
The role will primarily focus on the design, development, and testing of ETL workflows (using Talend), as well as the batch management and error-handling processes. Build business intelligence applications using tools like Power BI. Additional responsibilities include documenting technical specifications and related project artefacts.
- Gather requirements and propose possible ETL solutions for the in-house designed data warehouse
- Analyze and translate functional specifications and change requests into technical specifications
- Design and create star schema data models
- Design, build, and implement business intelligence solutions using Power BI
- Develop, implement, and test ETL program logic
- Deployment and support of any related issues
Key Competency:
- A good understanding of the concepts and best practices of data warehouse ETL design, and the ability to apply these suitably to solve specific business needs
- Expert knowledge of an ETL tool such as Talend
- More than 8 years of experience designing and developing ETL work packages, with demonstrable expertise in the Talend ETL tool
- Knowledge of BI tools like Power BI is required
- Ability to follow functional ETL specifications and challenge business logic and schema design where appropriate, as well as manage time effectively
- Exposure to performance tuning is essential
- Good organisational skills
- A methodical and structured approach to design and development
- Good interpersonal skills
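The "star schema data models" item above can be sketched as one fact table joined to surrounding dimension tables. A minimal illustration using Python's built-in sqlite3; the table names and data are invented, and a production model would carry many more dimensions and attributes.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables: descriptive attributes.
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, cal_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
-- Fact table: measures, keyed to the dimensions.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Gift Card');
INSERT INTO fact_sales  VALUES (20240101, 1, 50.0), (20240101, 1, 25.0);
""")
# A typical star-schema query: aggregate facts, describe via a dimension.
row = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
""").fetchone()
print(row)  # ('Gift Card', 75.0)
```

An ETL tool like Talend would populate the dimensions first and then load the fact table with the resolved surrogate keys.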
The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include developing big data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's data lake.
Key Responsibilities:
- Create a GRAND data lake and warehouse that pools the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment, and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure
Skills Needed:
- Very strong in SQL, with demonstrated RDBMS experience (e.g., Postgres, MongoDB); Unix shell scripting preferred
- Experience with UNIX and comfortable working with the shell (bash or Korn preferred)
- Good understanding of data warehousing concepts; big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop and to expand existing environments
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig, and MapReduce access for the new users
- Cluster maintenance, including creation and removal of nodes, using tools like Ganglia, Nagios, and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and perform capacity planning
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring
- HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
- Define, develop, document, and maintain Hive-based ETL mappings and scripts
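The "source data quality measurement" responsibility above typically starts with simple completeness checks before enrichment. A sketch in plain Python; the field names and report shape are invented for illustration, and a real pipeline would run such checks per source per load.

```python
def data_quality_report(rows, required_fields):
    """Measure completeness: the fraction of rows in which every
    required field is present and non-empty."""
    total = len(rows)
    complete = sum(
        1 for r in rows
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return {
        "rows": total,
        "complete": complete,
        "completeness": complete / total if total else 0.0,
    }

rows = [
    {"store": "S1", "region": "GCC", "amount": 10},
    {"store": "S2", "region": None,  "amount": 5},   # missing region
]
print(data_quality_report(rows, ["store", "region"]))
# {'rows': 2, 'complete': 1, 'completeness': 0.5}
```

Reporting these metrics alongside each load gives the warehouse team an early signal when an upstream source degrades.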