
Hadoop Jobs in Mumbai

Explore top Hadoop job opportunities in Mumbai at leading companies and startups. All jobs are posted by verified employees who can be contacted directly.

Data Engineer

Founded 1997 · Products and services
Location: Mumbai
Experience: 5 – 10 years
Salary: ₹25,00,000 – ₹30,00,000

What is the role's objective?
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

What skills do you need to possess?
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured data sets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large, disconnected data sets.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/object-function scripting languages: Python, Java, C++, Scala, etc.
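For candidates unfamiliar with the term, the extract-transform-load (ETL) pattern this role centres on can be sketched in a few lines. The following is a minimal, hypothetical illustration using Python's built-in sqlite3 (the tables, columns and figures are invented for the example, not taken from the posting):

```python
import sqlite3

# Toy in-memory "source" and "warehouse" databases (hypothetical schema,
# purely to illustrate the extract-transform-load flow).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
source.executemany("INSERT INTO events VALUES (?, ?)",
                   [(1, 10.0), (1, 5.5), (2, 20.0)])

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE user_totals (user_id INTEGER, total REAL)")

# Extract + transform: aggregate per user with plain SQL.
rows = source.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id"
).fetchall()

# Load: write the aggregated rows into the warehouse table.
warehouse.executemany("INSERT INTO user_totals VALUES (?, ?)", rows)

totals = dict(warehouse.execute("SELECT user_id, total FROM user_totals"))
print(totals)  # {1: 15.5, 2: 20.0}
```

In production this same shape would typically span Postgres/Redshift and a scheduler such as Airflow rather than a single in-memory database.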

Job posted by Harsha Thota

Data Architect

Founded 2004 · Products and services
Location: Mumbai
Experience: 5 – 9 years
Salary: ₹15,00,000 – ₹22,00,000

JD of Data Architect

As a Data Architect, you work with business leads, analysts and data scientists to understand the business domain, and manage data engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and the flexibility of your solutions to scale in response to broader business questions.

If you love to solve problems using your skills, then come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What You'll Do
● Understand the business problem and translate it into data services and engineering outcomes
● Explore new technologies and learn new techniques to solve business problems creatively
● Collaborate with many teams - engineering and business - to build better data products
● Manage the team and handle delivery of 2-3 projects

What We're Looking For
● 4-6 years of experience with:
○ Hands-on experience in at least one programming language (Python, Java, Scala)
○ Understanding of SQL is a must
○ Big data (Hadoop, Hive, Yarn, Sqoop)
○ MPP platforms (Spark, Presto)
○ Data-pipeline & scheduler tools (Oozie, Airflow, Nifi)
○ Streaming engines (Kafka, Storm, Spark Streaming)
○ Any relational database or DW experience
○ Any ETL tool experience
● Hands-on experience in pipeline design, ETL and application development
● Hands-on experience with cloud platforms like AWS, GCP, etc.
● Good communication skills and strong analytical skills
● Experience in team handling and project delivery
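The scheduler tools named above (Oozie, Airflow, Nifi) all model a pipeline as a directed acyclic graph of tasks and run each task only after its dependencies finish. As a rough, stdlib-only sketch of that idea (the task names here are hypothetical, not from any real pipeline), Python's graphlib can compute a valid execution order:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# the way a scheduler such as Airflow or Oozie models a DAG.
deps = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load": {"aggregate"},
    "report": {"load", "aggregate"},
}

# static_order() yields the tasks in an order that respects every dependency.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

A real scheduler adds retries, backfills and parallel execution on top of exactly this topological ordering.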

Job posted by Reena Bandekar

Data Engineer

Founded 2004 · Products and services
Location: Mumbai
Experience: 2 – 9 years
Salary: ₹10,00,000 – ₹20,00,000

JD of Data Engineer

As a Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and the flexibility of your solutions to scale in response to broader business questions.

If you love to solve problems using your skills, then come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What You'll Do
● Understand the business problem and translate it into data services and engineering outcomes
● Explore new technologies and learn new techniques to solve business problems creatively
● Think big! Drive the strategy for better data quality for the customers
● Collaborate with many teams - engineering and business - to build better data products

What We're Looking For
● 1-3 years of experience with:
○ Hands-on experience in at least one programming language (Python, Java, Scala)
○ Understanding of SQL is a must
○ Big data (Hadoop, Hive, Yarn, Sqoop)
○ MPP platforms (Spark, Pig, Presto)
○ Data-pipeline & scheduler tools (Oozie, Airflow, Nifi)
○ Streaming engines (Kafka, Storm, Spark Streaming)
○ Any relational database or DW experience
○ Any ETL tool experience
● Hands-on experience in pipeline design, ETL and application development

Job posted by Reena Bandekar

DevOps Engineer

Founded 2015 · Products and services
via Karza
Location: Mumbai
Experience: 1 – 5 years
Salary: ₹8,00,000 – ₹15,00,000

At Karza Technologies, we take pride in building one of the most comprehensive digital onboarding & due-diligence platforms by profiling millions of entities and trillions of associations amongst them, using data collated from more than 700 publicly available government sources. Operating primarily in the B2B Fintech Enterprise space, we are headquartered in Lower Parel, Mumbai, with a 100+ strong workforce. We are truly furthering the cause of Digital India by providing the entire BFSI ecosystem with tech products and services that aid in onboarding customers, automating processes and mitigating risks seamlessly, in real time and at a fraction of the current cost.

A few recognitions:
- Recognized among the Top 25 startups in India to work with in 2019 by LinkedIn
- Winner of HDFC Bank's Digital Innovation Summit 2020
- Super Winners (won every category) at Tecnoviti 2020 by Banking Frontiers
- Winner of the Amazon AI Award 2019 for Fintech
- Winner of FinTech Spot Pitches at Fintegrate Zone 2018 held at BSE
- Winner of the FinShare 2018 challenge held by ShareKhan
- Only startup in the Yes Bank Global Fintech Accelerator to win the account during the cohort
- 2nd place in the Citi India FinTech Challenge 2018 by Citibank
- Top 3 in Viacom18's startup engagement programme VStEP

What your average day would look like:
- Deploy and maintain mission-critical information extraction, analysis, and management systems
- Manage low-cost, scalable streaming data pipelines
- Provide direct and responsive support for urgent production issues
- Contribute ideas towards secure and reliable cloud architecture
- Use open-source technologies and tools to accomplish specific use cases encountered within the project
- Use coding languages or scripting methodologies to solve automation problems
- Collaborate with others on the project to brainstorm the best way to tackle a complex infrastructure, security, or deployment problem
- Identify processes and practices to streamline development & deployment, to minimize downtime and maximize turnaround time

What you need to work with us:
- Proficiency in at least one general-purpose programming language like Python, Java, etc.
- Experience in managing IaaS and PaaS components on popular public cloud service providers like AWS, Azure, GCP, etc.
- Proficiency in Unix operating systems and comfort with networking concepts
- Experience with developing/deploying a scalable system
- Experience with distributed databases & message queues (like Cassandra, Elasticsearch, MongoDB, Kafka, etc.)
- Experience in managing Hadoop clusters
- Understanding of containers, having managed them in production using container orchestration services
- Solid understanding of data structures and algorithms
- Applied exposure to continuous delivery pipelines (CI/CD)
- Keen interest and proven track record in automation and cost optimization

Experience:
- 1-4 years of relevant experience
- BE in Computer Science / Information Technology
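The message-queue requirement above (Kafka and the like) boils down to a producer publishing messages that a decoupled consumer processes asynchronously. A minimal single-process sketch of that pattern, using only Python's stdlib queue and threading modules (the messages and "processing" step are invented for illustration):

```python
import queue
import threading

# A FIFO queue stands in for the broker; one producer, one consumer.
q = queue.Queue()
results = []

def producer():
    for i in range(5):
        q.put(i)      # publish a message
    q.put(None)       # sentinel: no more messages

def consumer():
    while True:
        msg = q.get()
        if msg is None:           # sentinel seen -> stop consuming
            break
        results.append(msg * 2)   # stand-in for real processing

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```

Real brokers such as Kafka add durability, partitioning and consumer groups, but the produce/consume decoupling is the same.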

Job posted by Priyanka Asher

Data Engineer

Founded 2016 · Products and services
Location: Mumbai
Experience: 3 – 9 years
Salary: ₹5,00,000 – ₹12,00,000

Job Overview:
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties:
- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level:
- Bachelor's degree in Computer Science or equivalent

Experience:
- Minimum 3+ years of relevant experience working on production-grade projects; experience in hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes

Proficiency in:
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices
- Experience generating physical data models and the associated DDL from logical data models
- Experience developing data models for operational, transactional, and operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts
- Experience enforcing data modeling standards and procedures
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills (must know):
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, Nifi, SSIS, etc. (at least 1)
- Handling of various file formats
- A cloud platform - AWS/Azure/GCP
- An orchestration tool - Airflow
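"Generating physical data models and the associated DDL from logical data models", as the JD puts it, can be illustrated with a tiny sketch: a logical model held as a plain dict is turned into CREATE TABLE statements. The tables, columns and types below are hypothetical examples, not a real schema:

```python
# Hypothetical logical model: table name -> {column name: column type}.
logical_model = {
    "customer": {"id": "INTEGER PRIMARY KEY", "name": "TEXT", "city": "TEXT"},
    "orders": {"id": "INTEGER PRIMARY KEY", "customer_id": "INTEGER", "amount": "REAL"},
}

def to_ddl(model):
    """Render each table of the logical model as a CREATE TABLE statement."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns.items())
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return statements

for stmt in to_ddl(logical_model):
    print(stmt)
```

Real modeling tools layer naming standards, constraints and dialect-specific types on top of this basic mapping.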

Job posted by Aishwarya Hire

Data Engineer

Founded 2016 · Products and services
Location: Mumbai
Experience: 3 – 7 years
Salary: ₹7,00,000 – ₹20,00,000

Job Overview:
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties:
- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured & unstructured. Big data knowledge, especially in Spark & Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level:
- Bachelor's degree in Computer Science or equivalent

Experience:
- Minimum 5+ years of relevant experience working on production-grade projects; experience in hands-on, end-to-end software development
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes

Proficiency in:
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing decently optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (optional)
- Design and development of integration solutions with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices
- Experience generating physical data models and the associated DDL from logical data models
- Experience developing data models for operational, transactional, and operational reporting, including the development of, or interfacing with, data analysis, data mapping, and data rationalization artifacts
- Experience enforcing data modeling standards and procedures
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills (must know):
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool like Pentaho, Nifi, SSIS, etc. (at least 1)
- Handling of various file formats
- A cloud platform - AWS/Azure/GCP
- An orchestration tool - Airflow

Job posted by Aishwarya Hire

Tech Lead
at Mintifi

Founded 2017 · Products and services
Location: Mumbai
Experience: 5 – 10 years
Salary: ₹8,00,000 – ₹20,00,000

Job Title: Technology Lead

Responsibilities
We are looking for a Technology Lead who can drive innovation, take ownership and deliver results.
• Own one or more modules of the project under development
• Conduct system-wide requirement analysis
• Ensure quality, on-time delivery of agreed deliverables
• Mentor junior team members
• Be flexible in working under changing and different work settings
• Contribute to the company knowledge base and process improvements
• Participate in the SDLC
• Design and implement an automated unit testing framework as required
• Use best practices and coding standards
• Conduct peer reviews, lead reviews and provide feedback
• Develop, maintain, troubleshoot, enhance and document components developed by self and others as per the requirements and detailed design

Qualifications
• Excellent programming experience of 5 to 10 years in Ruby and Ruby on Rails
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Exposure to front-end technologies like HTML, CSS, JavaScript as well as JS libraries/frameworks like jQuery, Angular, React, etc. is a strong plus
• Exposure to DevOps on AWS is a strong plus

Compensation: Best in the industry
Job Location: Mumbai

Job posted by Suchita Upadhyay

Big Data Developer

Founded 2017 · Products and services
Location: Mumbai
Experience: 2 – 4 years
Salary: ₹6,00,000 – ₹15,00,000

Job Title: Software Developer - Big Data

Responsibilities
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build & own Mintifi Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit & integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications, and use the feedback to improve the business logic
• Take a proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and Zookeeper would be a strong plus
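The Spark/Hadoop skills asked for above rest on the MapReduce model: map each record to key-value pairs, then reduce by key. As a toy stdlib-only sketch of that model (the input lines are invented; a real job would run distributed over HDFS or a Spark cluster):

```python
from collections import Counter
from itertools import chain

# Classic word count, the "hello world" of MapReduce.
lines = ["big data big plans", "data pipelines", "big wins"]

# Map phase: each line becomes a stream of (word, 1) pairs.
mapped = chain.from_iterable(((w, 1) for w in line.split()) for line in lines)

# Reduce phase: sum the 1s per key.
counts = Counter()
for word, one in mapped:
    counts[word] += one

print(counts["big"], counts["data"])  # 3 2
```

In PySpark the same job is roughly `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`; the shape of the computation is identical.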

Job posted by Suchita Upadhyay

Data Engineer

Founded 2015 · Products and services
Location: NCR (Delhi | Gurgaon | Noida), Mumbai
Experience: 2 – 7 years
Salary: ₹10,00,000 – ₹30,00,000

About the job:
- You will work with data scientists to architect, code and deploy ML models
- You will solve problems of storing and analyzing large-scale data in milliseconds, and architect and develop data processing and warehouse systems
- You will code, drink, breathe and live Python, sklearn and pandas. It's good to have experience in these, but not a necessity - as long as you're super comfortable in a language of your choice
- You will develop tools and products that provide analysts ready access to the data

About you:
- Strong CS fundamentals
- You have strong experience in working with production environments
- You write code that is clean, readable and tested
- Instead of doing it a second time, you automate it
- You have worked with some of the commonly used databases and computing frameworks (PostgreSQL, S3, Hadoop, Hive, Presto, Spark, etc.)
- It will be great if you have a Kaggle or a GitHub profile to share
- You are an expert in one or more programming languages (Python preferred); experience with Python-based application development and data science libraries is also good to have
- Ideally, you have 2+ years of experience in tech and/or data
- Degree in CS/Maths from Tier-1 institutes

Job posted by Pragya Singh

Hadoop Developer

Founded 2016 · Products and services
Location: Mumbai
Experience: 3 – 100 years
Salary: ₹4,00,000 – ₹15,00,000

Looking for Big Data developers in Mumbai.

Job posted by Sheela P

Technical Architect/CTO

Founded 2016 · Products and services
Location: Mumbai
Experience: 5 – 11 years
Salary: ₹15,00,000 – ₹30,00,000

ABOUT US:
Arque Capital is a FinTech startup working with AI in finance, in domains like asset management (hedge funds, ETFs and structured products), robo-advisory, bespoke research, alternate brokerage, and other applications of technology & quantitative methods in big finance.

PROFILE DESCRIPTION:
1. Get the "Tech" in order for the hedge fund - help answer the fundamentals of which technology blocks to use and the choice of one platform/tech over another, and help the team visualize the product with the available resources and assets
2. Build, manage, and validate a tech roadmap for our products
3. Architecture practices - at startups, the dynamics change very fast. Making sure that best practices are defined and followed by the team is very important. The CTO may have to be the "garbage guy" and clean the code from time to time; reviewing code quality is an important activity the CTO should follow
4. Build a progressive learning culture and establish a predictable model of envisioning, designing and developing products
5. Product innovation through research and continuous improvement
6. Build out the technological infrastructure for the hedge fund
7. Hire and build out the technology team
8. Set up and manage the entire IT infrastructure - hardware as well as cloud
9. Ensure company-wide security and IP protection

REQUIREMENTS:
- Computer Science Engineer from Tier-I colleges only (IIT, IIIT, NIT, BITS, DHU, Anna University, MU)
- 5-10 years of relevant technology experience (no infra or database persons)
- Expertise in Python and C++ (3+ years minimum)
- 2+ years of experience building and managing Big Data projects
- Experience with technical design & architecture (1+ years minimum)
- Experience with high-performance computing - OPTIONAL
- Experience as a Tech Lead, IT Manager, Director, VP, or CTO
- 1+ years of experience managing cloud computing infrastructure (Amazon AWS preferred) - OPTIONAL
- Ability to work in an unstructured environment
- Looking to work in a small, startup-type environment based out of Mumbai

COMPENSATION: Co-Founder status and equity partnership

Job posted by Hrishabh Sanghvi

Data Scientist

Founded 2014 · Products and services
Location: Navi Mumbai
Experience: 4 – 8 years
Salary: ₹5,00,000 – ₹15,00,000

Nextalytics is an offshore research, development and consulting company based in India that focuses on high-quality, cost-effective software development and data science solutions. At Nextalytics, we have developed a culture that encourages employees to be creative, innovative, and playful. We reward intelligence, dedication and out-of-the-box thinking; if you have these, Nextalytics will be the perfect launch pad for your dreams. Nextalytics is looking for smart, driven and energetic new team members.

Job posted by Harshal Patni