
Hadoop Jobs in Mumbai

Explore top Hadoop job opportunities in Mumbai at leading companies and startups. All jobs are posted by verified employees, who can be contacted directly below.

Big Data Developer

Founded 2016
Products and services
Mumbai
2 - 7 years
8 - 10 lacs/annum

Spark, Storm, Flink, Kafka, Redis, Java. Database technologies such as Cassandra, MySQL, PostgreSQL and Elasticsearch. Knowledge of Python and machine learning/artificial intelligence technologies would be advantageous. Working knowledge of Microsoft Azure or Amazon AWS. Build tools: Maven, Jenkins, Ant. Exposure to TFS, Git or SVN.

Job posted by
Srushti Sonawane

Data Engineer

Founded 2009
Products and services
Mumbai
6 - 13 years
15 - 60 lacs/annum

Responsibilities:
• Build a real-time and batch analytics platform for analytics and machine learning.
• Design, propose and develop solutions keeping the growing scale and business requirements in mind.
• As an integral part of the Data Engineering team, be involved in the entire development lifecycle, from conceptualisation to architecture to coding to unit testing.
• Help design the data model for our data warehouse and other data engineering solutions.

Requirements:
• Deep understanding of real-time as well as batch-processing big data solutions (Spark, Storm, Kafka, KSQL, Flink, MapReduce, YARN, Hive, HDFS, Pig, etc.).
• Extensive experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB).
• Understands data very well and has fair data-modelling experience.
• Proven programming experience in Java or Scala.
• Experience in gathering and processing raw data at scale, including writing scripts, web scraping, calling APIs and writing SQL queries.
• Experience with cloud-based data stores like Redshift and BigQuery is an advantage.
• Previous experience in a high-growth tech startup would be an advantage.

Job posted by
Jitendra Chhunchha

Data warehouse Architect

Founded 2006
Products and services
Mumbai
7 - 15 years
35 - 40 lacs/annum

1. KEY OBJECTIVE OF THE JOB
Work closely with various users, the product management team and the tech team to design, develop and strategize the data architecture and multidimensional databases.

2. MAJOR DELIVERABLES
• Design an end-to-end BI and analytics platform and present it to tech and business stakeholders
• Evaluate multiple tools and conduct proofs-of-concept based on requirements and budgets
• Perform dimensional modelling of multiple data marts and an enterprise data warehouse from scratch
• Understand complex OLTP (Online Transaction Processing) systems such as order booking, CRM, finance, web etc., and map schemas and data dictionaries from them
• Understand business rules around data entities and document them
• Map the business rules and OLTP entities to a dimensional model spread across multiple data marts and warehouses
• Design a robust and failsafe ETL (Extract, Transform & Load) process without relying on any tool
• Operationalise the ETL using shell and SQL scripts, without the need for any tool
• Operationalise the dimensional model and the warehousing architecture using simple standalone databases like MySQL and Postgres on Linux, or on cloud-based systems like Redshift
• Model data lakes for lightly structured but highly voluminous clickstream data using Hadoop and similar technologies
• Be an extremely hands-on person who loves to create a blueprint as well as write scripts, make presentations and even set up end-to-end PoCs (Proofs of Concept) on his/her own
• Coordinate among data scientists, technology partners, business users, analysts etc., and make sure they are able to use the OLAP (Online Analytical Processing) platform in the intended way
• Understand the pain points of the above stakeholders and continuously iterate the existing platform with a completely open mind to meet their needs
• Track and continuously tune the data infrastructure for performance and scale

3. RIGHT PERSON
Essential attributes:
• Dimensional modelling and schema design for OLAP/BI
• Command over multiple ETL, DW/data mart and BI tools
• Experience with HANA and Talend will be an added advantage
• Solution design and documentation
• Big data architecture design (Hadoop and related ecosystem)
• Propensity towards a hands-on/start-up working environment
Desirable attributes:
• Big data and machine learning
• Data science and statistics
• Ecommerce or retail domain experience
Profile: An engineer and tech enthusiast with at least 10 years of total experience, including 5 to 6 years in data warehouse architecture, who can think logically, address issues related to data migration, understands the importance of data dictionaries and has a strong desire to establish best practices will fit the bill.
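The posting asks for an ETL that is operationalised with plain SQL scripts rather than a dedicated tool. A minimal, hypothetical sketch of that idea follows; the `orders`, `dim_customer` and `fact_sales` tables and the "count only paid orders" business rule are illustrative assumptions, not details from the posting:

```python
import sqlite3

# Stand-in databases: in a real setup these would be the OLTP source
# and the warehouse (e.g. MySQL/Postgres); sqlite3 keeps the sketch
# self-contained.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw OLTP-style orders table, plus the warehouse targets
# (a simple dimension and fact table in a dimensional model).
cur.executescript("""
CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL, status TEXT);
INSERT INTO orders VALUES
  (1, 'acme',   120.0, 'PAID'),
  (2, 'acme',    80.0, 'PAID'),
  (3, 'globex',  50.0, 'CANCELLED');

CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer TEXT UNIQUE);
CREATE TABLE fact_sales (customer_key INTEGER, total_amount REAL, order_count INTEGER);
""")

# Transform + Load: apply the business rule (only PAID orders count),
# resolve the surrogate key via the dimension, and aggregate into the
# fact table -- all as plain SQL, no ETL tool.
cur.executescript("""
INSERT INTO dim_customer (customer)
  SELECT DISTINCT customer FROM orders WHERE status = 'PAID';
INSERT INTO fact_sales
  SELECT d.customer_key, SUM(o.amount), COUNT(*)
  FROM orders o JOIN dim_customer d ON o.customer = d.customer
  WHERE o.status = 'PAID'
  GROUP BY d.customer_key;
""")

rows = cur.execute(
    "SELECT c.customer, f.total_amount, f.order_count "
    "FROM fact_sales f JOIN dim_customer c ON f.customer_key = c.customer_key "
    "ORDER BY c.customer").fetchall()
print(rows)  # [('acme', 200.0, 2)] -- globex is excluded by the rule
```

In production, the same statements would live in `.sql` files driven by a shell script and cron, which is how the "without any tool" requirement is typically met.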

Job posted by
Mitali Jain

Tech Lead

Founded 2017
Products and services
Mumbai
5 - 10 years
8 - 20 lacs/annum

Job Title: Technology Lead

Responsibilities
We are looking for a Technology Lead who can drive innovation, take ownership and deliver results.
• Own one or more modules of the project under development
• Conduct system-wide requirement analysis
• Ensure quality, on-time delivery of agreed deliverables
• Mentor junior team members
• Be flexible in working under changing and different work settings
• Contribute to the company knowledge base and process improvements
• Participate in the SDLC
• Design and implement an automated unit-testing framework as required
• Use best practices and coding standards
• Conduct peer reviews, lead reviews and provide feedback
• Develop, maintain, troubleshoot, enhance and document components developed by self and others as per the requirements and detailed design

Qualifications
• Excellent programming experience of 5 to 10 years in Ruby and Ruby on Rails
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Exposure to front-end technologies like HTML, CSS and JavaScript, as well as JS libraries/frameworks like jQuery, Angular and React, is a strong plus
• Exposure to DevOps on AWS is a strong plus

Compensation: Best in the industry
Job Location: Mumbai

Job posted by
Suchita Upadhyay

Big Data Engineer

Founded 2015
Products and services
Navi Mumbai, Noida, NCR (Delhi | Gurgaon | Noida)
1 - 2 years
4 - 10 lacs/annum

Job Requirements
• Installation, configuration and administration of big data components (including Hadoop/Spark) for batch and real-time analytics and data hubs
• Capable of processing large sets of structured, semi-structured and unstructured data
• Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review
• Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modelling, data mining, machine learning and advanced data processing
• Optional: a visual communicator able to convert and present data in an easily comprehensible visualization using tools like D3.js and Tableau
• Enjoys being challenged and solving complex problems on a daily basis
• Proficient in executing efficient and robust ETL workflows
• Able to work in teams and collaborate with others to clarify requirements
• Able to tune Hadoop solutions to improve performance and the end-user experience
• Strong coordination and project management skills to handle complex projects
• Engineering background

Job posted by
Sneha Pandey

Big Data Developer

Founded 2017
Products and services
Mumbai
2 - 4 years
6 - 15 lacs/annum

Job Title: Software Developer – Big Data

Responsibilities
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build and own Mintifi's big data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit and integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications, and use the feedback to improve the business logic
• Take a proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and ZooKeeper would be a strong plus

Job posted by
Suchita Upadhyay