
ETL Jobs in Bangalore (Bengaluru)

Explore top ETL job opportunities in Bangalore (Bengaluru) at top companies and startups. All jobs are added by verified employees, who can be contacted directly below.

Data Architect

Founded 2004
Products and services
Mumbai, Bengaluru (Bangalore)
5 - 10 years
₹11L - ₹20L per annum

Who we are: Searce is a niche cloud consulting business with a futuristic tech DNA. We use new-age technology to realise the "next" in the "now" for our clients. We specialise in cloud data engineering, AI/machine learning, and advanced cloud infrastructure such as Anthos and Kubernetes. We are one of the top and fastest-growing partners for Google Cloud and AWS globally, with over 2,500 clients successfully moved to the cloud.

What we believe:
- Best practices are overrated: implementing best practices can only make one average.
- Honesty and transparency: we believe in the naked truth. We do what we say and say what we do.
- Client partnership: a client-vendor relationship? No. We partner with clients instead, and our sales team comprises 100% of our clients.

How we work: it's all about being happier first, and the rest follows. Searce work culture is defined by HAPPIER:
- Humble: happy people don't carry ego around. We listen to understand, not to respond.
- Adaptable: we are comfortable with uncertainty, and we accept change well; that's what life is about.
- Positive: we are super positive about work and life in general. We love to forgive and forget. We don't hold grudges; we don't have the time or space for them.
- Passionate: we are as passionate about the great street-food vendor across the street as about Tesla's newest model. Passion is what drives us to work and makes us deliver the quality we deliver.
- Innovative: innovate or die. We love to challenge the status quo.
- Experimental: we encourage curiosity and making mistakes.
- Responsible: driven, self-motivated, self-governing teams. We own it.

Responsibilities: As a Data Architect, you will work with business leads, analysts and data scientists to understand the business domain, and manage data engineers to build data products that empower better decision making. You are passionate about the quality of our business metrics and about solutions flexible enough to scale to broader business questions. If you love solving problems with your skills, come join Team Searce. We have a casual, fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What you'll do:
- Understand the business problem and translate it into data services and engineering outcomes
- Explore new technologies and learn new techniques to solve business problems creatively
- Collaborate with engineering and business teams to build better data products
- Manage a team and handle the delivery of 2-3 projects

What we're looking for:
- 4-7 years of experience, with hands-on experience in at least one programming language (Python, Java, Scala)
- Understanding of SQL is a must
- Big data: Hadoop, Hive, YARN, Sqoop
- MPP platforms: Spark, Presto
- Data-pipeline and scheduler tools: Oozie, Airflow, NiFi (see the sketch below)
- Streaming engines: Kafka, Storm, Spark Streaming
- Experience with any relational database or data warehouse
- Experience with any ETL tool
- Hands-on experience in pipeline design, ETL and application development
- Hands-on experience with cloud platforms such as AWS and GCP
- Good communication skills and strong analytical skills
- Experience in team handling and project delivery
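
A minimal sketch of the extract-transform-load shape this listing describes, using Airflow (one of the scheduler tools it names). This is an illustration, not Searce's actual stack: the DAG id, task logic and data are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older versions use `schedule_interval`).

```python
# A minimal, illustrative Airflow DAG: extract -> transform -> load.
# All names and data here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # Stand-in for pulling raw rows from a source system.
    return [{"order_id": 1, "amount": 250.0}]


def transform(ti, **_):
    rows = ti.xcom_pull(task_ids="extract")
    # Example transformation: add a derived column.
    return [{**r, "amount_with_tax": r["amount"] * 1.18} for r in rows]


def load(ti, **_):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows into the warehouse")


with DAG(
    dag_id="orders_daily_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older: schedule_interval
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t
```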

Job posted by Nikita Rathi

Data Engineer

Founded 2002
Products and services
via PayU
Remote, Bengaluru (Bangalore)
2 - 5 years
₹5L - ₹20L per annum

Role: Data Engineer
Company: PayU
Location: Bangalore / Mumbai
Experience: 2-5 yrs

About the company: PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities. The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks to merchants, allowing consumers to use credit in ways that suit them, and enabling a greater number of global citizens to access credit services. Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high-growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.

India is PayU's biggest market globally, and the company has already invested $400 million in the region over the last 4 years. In its next phase of growth, PayU is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience, through three mechanisms: build; co-build/partner; and select strategic investments. PayU supports over 350,000 merchants and millions of consumers making payments online, with over 250 payment methods and 1,800+ payment specialists. The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and huge growth potential for merchants.

Job responsibilities:
- Design infrastructure for data, especially for (but not limited to) consumption in machine learning applications
- Define the database architecture needed to combine and link data, and ensure integrity across different sources
- Ensure performance of data systems for machine learning, from customer-facing web and mobile applications using cutting-edge open-source frameworks, to highly available RESTful services, to back-end Java-based systems
- Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques where needed
- Build data pipelines, including implementing, testing and maintaining infrastructural components of the data engineering stack
- Work closely with data engineers, ML engineers and SREs to gather data engineering requirements, and to prototype, develop, validate and deploy data science and machine learning solutions

Requirements to be successful in this role:
- Strong knowledge of and experience with Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling and Informatica (a short sketch follows below)
- Strong experience with scalable compute solutions such as Kafka and Snowflake
- Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions, etc.
- Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL)
- A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks
- (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics or equivalent (preference: DS/AI)
- Experience designing and implementing tools that support sharing of data, code and practices across organizations at scale
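
Since the listing leads with Python, Pandas and data wrangling, here is a small, hedged sketch of a typical clean-and-aggregate step in a payments pipeline. The column names and rows are invented for illustration and are not PayU's schema.

```python
# Hypothetical pandas wrangling step: coerce types, drop malformed rows,
# and aggregate successful payments per merchant.
import pandas as pd

# Stand-in for a raw extract (in practice: a database, S3 or Kafka sink).
raw = pd.DataFrame(
    {
        "merchant_id": ["m1", "m1", "m2", None],
        "amount": ["100.50", "200", "not-a-number", "50"],
        "status": ["SUCCESS", "FAILED", "SUCCESS", "SUCCESS"],
    }
)

clean = (
    raw.dropna(subset=["merchant_id"])                       # drop orphan rows
       .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"))
       .dropna(subset=["amount"])                            # drop bad amounts
       .query("status == 'SUCCESS'")                         # keep settled txns
)

# Load-step stand-in: per-merchant totals, ready to write to a warehouse.
print(clean.groupby("merchant_id", as_index=False)["amount"].sum())
```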

Job posted by Vishakha Sonde

Project Manager/Technical Delivery Manager - Azure BI

Founded 2009
Products and services
Bengaluru (Bangalore)
8 - 17 years
₹10L - ₹15L per annum

Business Intelligence Technical Delivery Project Manager, Azure Business Intelligence

We are looking for an energetic and motivated Technical Business Intelligence Project Manager to join our team to meet immediate demand. You will be a technical project delivery manager of software solutions, with Business Intelligence and Data Warehousing experience. You will be comfortable working client-side, ensuring all aspects of the technical delivery of the project are in hand, from requirements gathering, definition, build and unit test through to cutover, go-live and transfer to operational support.

You will have the following experience and skills:
- Knowledge of and exposure to the Microsoft business intelligence and data warehousing stack in the Azure space
- Multi-national technical project delivery, working with onshore and offshore teams, preferably with over 8 years' experience in technical project delivery
- Delivery in a matrix-structure environment, with multiple stakeholders and delivery teams from multiple suppliers
- Demonstrable experience delivering global Business Intelligence solutions using enterprise cloud-based platforms
- Excellent presentation and communication skills, with the ability to provide senior management updates in clear and concise communication packs
- Ability to engage with the team at a technical level, with strong knowledge of ETL, data modelling, data storage and reporting disciplines
- Strong understanding of Agile project delivery methodology, plus PRINCE2 rigour and discipline using RAID logs and other standard IT project management tools
- The ability to define team structure and balance the various roles needed for optimum delivery output

You will be a strong leader, able to take the technical teams forward and give them the confidence to succeed. This role is perfect for you if you naturally build relationships with internal customers and lead them through the processes necessary to deliver robust Business Intelligence solutions. We are looking for an individual who is customer-focused and willing to roll up their sleeves and get the job done. As already mentioned, excellent communication and documentation skills are essential. Preference will be given to candidates who provide a strong covering letter describing why they are an excellent choice for the role.

Job posted by Ankit Chaurasia

SQL- DWH Developer

Founded 2015
Products and services
Bengaluru (Bangalore)
5 - 9 years
₹15L - ₹25L per annum

Work days: Sunday through Thursday. Week off: Friday & Saturday. Day shift.

Key responsibilities:
- Create, design and develop data models (see the toy star-schema sketch below)
- Prepare plans for all ETL (extract/transform/load) procedures and architectures
- Validate results and create business reports
- Monitor and tune data loads and queries
- Develop and prepare the schedule for a new data warehouse
- Analyze large databases and recommend appropriate optimizations
- Administer all requirements and design various functional specifications for data
- Provide support to the software development life cycle
- Prepare various code designs and ensure their efficient implementation
- Evaluate all code and ensure the quality of all project deliverables
- Monitor data warehouse work and provide subject matter expertise
- Hands-on BI practices, data structures, data modeling and SQL skills

Hard skills for a data warehouse developer:
- Hands-on experience with ETL tools (e.g., DataStage, Informatica, Pentaho, Talend)
- Sound knowledge of SQL
- Experience with SQL databases such as Oracle, DB2 and SQL Server
- Experience using data warehouse platforms (e.g., SAP, Birst)
- Experience designing, developing and implementing data warehouse solutions
- Project management and system development methodology
- Ability to proactively research solutions and best practices

Soft skills for data warehouse developers:
- Excellent analytical skills
- Excellent verbal and written communication
- Strong organization skills
- Ability to work on a team as well as independently
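
The modeling and ETL-planning duties above revolve around classic star-schema warehouse design. Below is a toy sketch of that idea: a fact table keyed to two dimensions, plus the join-and-aggregate query a report would run. All table names are hypothetical, and sqlite3 is used only so the example is self-contained and runnable.

```python
# Toy star schema: fact_sales references dim_date and dim_product.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
""")

# The shape of a typical DW report query: fact joined to its dimensions.
rows = conn.execute("""
    SELECT d.full_date, p.name, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.full_date, p.name
""").fetchall()
print(rows)  # empty until an ETL job loads the tables
```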

Job posted by Priyanka U

Senior Support Engineer - App Ops / Data Ops

Founded 2001
Products and services
Bengaluru (Bangalore)
5 - 12 years
Best in industry

Required:
- 5-10 years of experience in the Application and/or Data Operations Support domain
- Expertise in RCA (root-cause analysis) and in collaborating with development teams on CoE (correction of errors)
- Good communication and collaboration skills: liaise with product, operations and business teams to understand requirements and provide data extracts and reports on a need basis
- Experience working in an enterprise environment, with good discipline and adherence to SLAs
- Good understanding of ticketing tools (e.g., JIRA, ServiceNow, Rally, ChangeGear) to track requests and manage the lifecycle of multiple requests
- Orientation towards addressing the root cause of any issue, i.e., collaborating and following up with development teams so that permanent fixes and prevention are given high priority
- Ability to create SOPs (standard operating procedures) in Confluence/wiki as a good reference for the support team
- A self-starter and collaborator, able to independently acquire the knowledge required to succeed in the job
- Ability to mentor and lead Data Ops team members towards high-quality customer experience and timely resolution of issues
- Adherence to a well-defined workflow process with partner teams

Specifically for the Data Ops Engineer role, the following experience is required:
- BI, reporting and data warehousing domain
- Production support for data queries: monitoring, analysis and triage of issues
- BI tools such as MicroStrategy, Qlik, Power BI and Business Objects
- Expertise in data analysis and writing SQL queries to provide insights into production data
- Relational database (RDBMS) and data-mart technologies such as DB2, Redshift, SQL Server, MySQL and Netezza
- Ability to monitor ETL jobs in an AWS stack with tools like Tidal and AutoSys
- Big data platforms such as Amazon Redshift

Responsibilities (Level 2 production support):
- Resolve job failures, with re-runs based on SOPs
- Root-cause analysis and resolution of report failures
- Address queries about existing reports and APIs
- Ad-hoc data requests for product and business stakeholders, e.g., transactions per day per entity (merchant, card type, card category) and custom extracts (see the sketch below)
- Track and report the health of the system
- Create metrics for issue volume
- Coordinate and set up an escalation workflow
- Provide regular status reports for stakeholder review
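
As a hedged illustration of the ad-hoc extract flagged in the list above (transactions per day, per entity), here is what such a query might look like. The table and column names are invented, and the date arithmetic shown is Redshift/SQL Server style; other warehouses spell it differently.

```python
# Hypothetical ad-hoc extract: transactions per day, per merchant and
# card type, over the trailing week. Schema names are illustrative only.
TXN_PER_DAY_SQL = """
SELECT
    CAST(txn_ts AS DATE) AS txn_date,
    merchant_id,
    card_type,
    COUNT(*)             AS txn_count,
    SUM(amount)          AS total_amount
FROM transactions
WHERE txn_ts >= DATEADD(day, -7, CURRENT_DATE)
GROUP BY CAST(txn_ts AS DATE), merchant_id, card_type
ORDER BY txn_date, txn_count DESC
"""
```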

Job posted by Srinivas Avanthkar

Sr. SDET (Data engineering)

Founded 2001
Products and services
Bengaluru (Bangalore)
5 - 9 years
Best in industry

About Blackhawk Network: Blackhawk Network is building a digital platform and products that bring people and brands together. We facilitate cross-channel payments via cash-in, cash-out and mobile payments. By leveraging blockchain, smart contracts, serverless technology and real-time payment systems, we are unlocking the next million users through innovation.

Our employees are our biggest assets! Come find out how we engage with the biggest brands in the world. We look for people who collaborate, who are inspirational, and who have passion that can make a difference by working as a team while striving for global excellence. You can expect a strong investment in your professional growth and a dedication to crafting a successful, sustainable career for you. Our teams are composed of highly talented and passionate 'A' players, who are also invested in mentoring and enabling the best qualities. Our vibrant culture and high expectations will kindle your passion and bring out the best in you!

As a leader in branded payments, we are building a strong, diverse team and expanding in Asia Pacific: we are hiring in Bengaluru, India! This is an amazing opportunity for problem solvers who want to be part of an innovative and creative engineering team that values your contribution to the company. If this role has your name written all over it, please apply now with a resume so that we can explore further and get connected.

If you enjoy building world-class payment applications, are highly passionate about pushing the boundaries of scale and availability on the cloud, leveraging next-horizon technologies, rapidly delivering features to production, making data-driven decisions on product development, and collaborating and innovating with like-minded experts, then this would be your ideal job. Blackhawk is seeking passionate backend engineers at all levels to build our next generation of payment systems on public cloud infrastructure. Our team enjoys working together to contribute to meaningful work seen by millions of merchants worldwide.

As a Senior SDET, you will work closely with data engineers to automate developed features and manually test new data ETL jobs, data pipelines and reports. You will own the complete architecture of the automation framework, and plan and design automation for data ingestion, transformation and reporting/visualization. You will build high-quality automation frameworks covering end-to-end testing of the data platforms, ensure test data setup, and pre-empt post-production issues through high-quality testing in the lower environments. You will get the opportunity to contribute at all levels of the test pyramid. You will also work with customer success and product teams to replicate post-release production issues.

Key qualifications:
- Bachelor's degree in Computer Science, Engineering or related fields
- 5+ years of experience testing data ingestion, visualization and info delivery systems
- Real passion for data quality and reconciliation, and for uncovering hard-to-find scenarios and bugs (a minimal test sketch follows below)
- Proficiency in at least one programming language (preferably Python/Java)
- Expertise in end-to-end testing and data validation for ETL (e.g., DataStage, Matillion) and BI platforms (e.g., MicroStrategy, Power BI)
- Experience working with big data technologies such as Hadoop and MapReduce is desirable
- Excellent analytical, problem-solving and communication skills
- Self-motivated, results-oriented and deadline-driven
- Experience with databases, data visualization and dashboarding tools would be desirable
- Experience working with Amazon Web Services (AWS) and Redshift is desirable
- Excellent knowledge of the software development lifecycle, testing methodologies, QA terminology, processes and tools
- Experience with automation frameworks and tools such as TestNG, JUnit and Selenium
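
Given the role's emphasis on data quality and reconciliation, below is a minimal pytest sketch of an automated ETL reconciliation check: compare row counts and an aggregate checksum between source and target. The fetch helpers are hypothetical stand-ins for real staging and warehouse queries.

```python
# Minimal ETL reconciliation test; the two fetchers are stubs standing in
# for queries against the staging area and the loaded reporting table.
import pytest


def fetch_source_totals():
    return {"row_count": 1000, "amount_sum": 250000.0}   # stubbed source


def fetch_target_totals():
    return {"row_count": 1000, "amount_sum": 250000.0}   # stubbed target


def test_etl_reconciliation():
    src, tgt = fetch_source_totals(), fetch_target_totals()
    assert tgt["row_count"] == src["row_count"], "rows dropped or duplicated"
    assert tgt["amount_sum"] == pytest.approx(src["amount_sum"]), \
        "aggregate drift between source and target"
```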

Job posted by Sandeep Madhavan

Fullstack Developer

Founded 2016
Products and services
Remote, Bengaluru (Bangalore)
2 - 4 years
₹8L - ₹10L per annum

At HyperWorks Imaging, we are removing crucial business bottlenecks via our automated AI products and solutions. Our engineers and scientists use the latest advances in deep learning and multi-modal machine learning techniques to solve diverse problems in industries ranging from materials science to marketing. We are looking to hire a Full Stack Developer to join our team. He/she will work collaboratively with data science members to write the code that connects our ML algorithms with cloud-hosted applications and services, enabling and automating data transport, storage, analysis and visualization. Remote work options available!

Responsibilities:
- Deliver high-quality, maintainable code through concise, single-concern merge requests. Your code always includes tests for long-term maintainability and logging for operational monitoring.
- Participate in rigorous, learning-focused review of your own and others' code.

Requirements:
- Bachelor's degree or higher in Computer Science or a related field
- 3-4 years of work experience
- Expertise in AWS, Git, Docker, building ETL data pipelines, RESTful web services, and/or integrating on-premise and cloud-hosted software
- Expertise in Angular, JavaScript, Django and Nginx; experience with Highcharts and Fusioncharts
- Expertise in creating WebSockets and developing ASGI applications (a bare-bones sketch follows below); experience with deploying CI/CD pipelines

Please note that we will reach out ONLY to those applicants who satisfy the criteria listed above.
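
The requirements call out WebSockets and ASGI application development. As a hedged illustration of that interface, here is a bare-bones, framework-free ASGI WebSocket echo app; run it with any ASGI server (e.g. `uvicorn app:application`, where the module name is a placeholder).

```python
# Minimal ASGI WebSocket echo application (text frames only).
async def application(scope, receive, send):
    if scope["type"] != "websocket":      # ignore http/lifespan in this sketch
        return
    await receive()                       # consume 'websocket.connect'
    await send({"type": "websocket.accept"})
    while True:
        event = await receive()
        if event["type"] == "websocket.disconnect":
            break
        await send({
            "type": "websocket.send",
            "text": event.get("text", ""),  # echo the message back
        })
```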

Job posted by Naveen Margankunte

Data Engineer

Founded 2017
Products and services
Bengaluru (Bangalore)
2 - 5 years
₹15L - ₹28L per annum

Job description: We are looking for a Data Engineer who will be responsible for collecting, storing, processing and analyzing huge sets of data coming from different sources.

Responsibilities:
- Work with big data tools and frameworks to provide requested capabilities
- Identify development needs in order to improve and streamline operations
- Develop and manage BI solutions
- Implement ETL processes and data warehousing
- Monitor performance and manage infrastructure

Skills:
- Proficient understanding of distributed computing principles
- Proficiency with Hadoop and Spark
- Experience building stream-processing systems using solutions such as Kafka and Spark Streaming (see the sketch below)
- Good knowledge of the data querying tools SQL and Hive
- Knowledge of various ETL techniques and frameworks
- Experience with Python/Java/Scala (at least one)
- Experience with cloud services such as AWS or GCP
- Experience with NoSQL databases such as DynamoDB and MongoDB is an advantage
- Excellent written and verbal communication skills
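
For the Kafka and Spark Streaming requirement above, here is a compact sketch using Spark Structured Streaming to count events per minute from a Kafka topic. The broker address and topic are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Count Kafka events per 1-minute window with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("events-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; decode, then window by event time.
counts = (
    events.selectExpr("CAST(value AS STRING) AS body", "timestamp")
    .groupBy(window(col("timestamp"), "1 minute"))
    .count()
)

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```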

Job posted by Keerthana k

ETL Talend developer

Founded 2011
Products and services
Bengaluru (Bangalore)
5 - 19 years
₹10L - ₹30L per annum

Strong exposure to ETL / Big Data / Talend / Hadoop / Spark / Hive / Pig.

To be considered for a Senior Data Engineer position, a candidate must have a proven track record of architecting data solutions on current and advanced technical platforms. They must have the leadership ability to lead a team providing data-centric solutions with best practices and modern technologies in mind. They build collaborative relationships across all levels of the business and the IT organization. They possess analytical and problem-solving skills, and the ability to research complex information, synthesize it, and provide appropriate guidance for extracting business value. They have the intellectual curiosity and ability to deliver solutions with creativity and quality, work effectively with business stakeholders and customers to obtain business value for the requested work, and can communicate technical results to both technical and non-technical users using effective storytelling techniques and visualizations. They have a demonstrated ability to perform high-quality work with innovation, both independently and collaboratively.

Job posted by Shobha B K

Technical Architect/CTO

Founded 2019
Products and services
Bengaluru (Bangalore), Paris
6 - 10 years
₹16L - ₹22L per annum

Exciting opportunity for a contractor to work with a start-up that operates in both the product and services space. We are looking for someone with rich experience in the skills below to join us immediately. This role is initially for 1 month, working from the client site in Paris to understand the system architecture and document it. Contract extension will be based purely on individual performance. Since the requirement is immediate and critical, we need someone who can join us soon and travel to Paris in December.

- Hands-on experience handling multiple data sources/datasets
- Experience in a data/BI architect role
- Expert in SSIS, SSRS and SSAS
- Knowledge of writing MDX queries (a small illustrative query follows below)
- Technical document preparation
- Excellent communication
- Process-oriented
- Strong project management
- Able to think out of the box and propose better solutions
- Outstanding team player with a positive attitude
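
For reference on the MDX requirement above, here is a hypothetical query kept as a Python constant. It is written against Microsoft's well-known Adventure Works sample cube; cube, measure and hierarchy names would differ on a real SSAS project.

```python
# Sample MDX: one measure on columns, calendar years on rows.
# [Adventure Works] is Microsoft's demo cube; all names are illustrative.
SALES_BY_YEAR_MDX = """
SELECT
    [Measures].[Internet Sales Amount] ON COLUMNS,
    [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Adventure Works]
"""
```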

Job posted by Ajith Gopi

Senior BI & ETL Developer

Founded 1999
Products and services
via Wibmo
Bengaluru (Bangalore)
5 - 10 years
₹10L - ₹15L per annum

Critical tasks and expected contributions/results: The role will primarily focus on the design, development and testing of ETL workflows (using Talend), as well as batch management and error-handling processes, and on building business intelligence applications using tools like Power BI. Additional responsibilities include documenting technical specifications and related project artefacts.

- Gather requirements and propose possible ETL solutions for the in-house designed data warehouse
- Analyze and translate functional specifications and change requests into technical specifications
- Design and create star-schema data models
- Design, build and implement business intelligence solutions using Power BI
- Develop, implement and test ETL program logic
- Handle deployment and support any related issues

Key competencies:
- A good understanding of the concepts and best practices of data warehouse ETL design, and the ability to apply these suitably to specific business needs
- Expert knowledge of an ETL tool like Talend: more than 8 years' experience designing and developing ETL work packages, with demonstrable expertise in Talend
- Knowledge of BI tools like Power BI
- Ability to follow functional ETL specifications, challenge business logic and schema design where appropriate, and manage time effectively
- Exposure to performance tuning is essential
- Good organisational skills
- A methodical and structured approach to design and development
- Good interpersonal skills

Job posted by Shirin AM

Database Architect

Founded 2017
Products and services
Bengaluru (Bangalore)
5 - 10 years
₹10L - ₹20L per annum

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization, to drive high-visibility, cross-division outcomes. Expected deliverables include developing big data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's data lake.

Key responsibilities:
- Create a GRAND data lake and warehouse that pools the data from GRAND's different regions and stores in the GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills needed:
- Very strong SQL, with demonstrated RDBMS experience (e.g., SQL, Postgres, MongoDB, etc.); Unix shell scripting preferred
- Experience with Unix and comfort working with the shell (bash or Korn shell preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy the new hardware and software environments required for Hadoop, and to expand existing environments
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for them
- Cluster maintenance, including creation and removal of nodes, using tools like Ganglia, Nagios and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and do capacity planning
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring; HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Define, develop, document and maintain Hive-based ETL mappings and scripts (a small sketch follows below)
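
The last responsibility above, Hive-based ETL mappings, might look like the following sketch: a daily aggregate loaded into a partitioned Hive table, submitted through a Hive-enabled SparkSession. Database, table and column names are hypothetical, and dynamic-partition settings may need adjusting on a real cluster.

```python
# Illustrative Hive ETL mapping: aggregate yesterday's POS transactions
# into a date-partitioned sales table. All names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("store-sales-etl")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("""
    INSERT OVERWRITE TABLE lake.store_sales PARTITION (sale_date)
    SELECT
        store_id,
        SUM(amount) AS daily_revenue,
        sale_date                      -- partition column goes last
    FROM staging.pos_transactions
    WHERE sale_date = date_sub(current_date(), 1)
    GROUP BY store_id, sale_date
""")
```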

Job posted by Rahul Malani