
Hadoop Jobs in Mumbai

Explore top Hadoop job opportunities in Mumbai at top companies and startups. All jobs are added by verified employees who can be contacted directly below.

Cloud Engineer

via UpGrad
Founded 2015
Location: Mumbai, Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 8 - 14 lacs/annum (best in industry)

About Us
UpGrad is a company founded by IIT Delhi alumni and Ronnie Screwvala that focuses on enabling universities to take their programs online. Given the team's background in the education and media sectors, we understand what it takes to offer quality online programs, and at UpGrad we invest alongside universities to build and deliver quality online programs (content, platform, technology, industry collaboration, delivery, and grading infrastructure). Some of our press highlights:
• UpGrad was selected as one of the top ten most innovative companies in India by FastCompany.
• We were covered by the Financial Times along with other disruptors in Ed-Tech.
• UpGrad is the official education partner for the Government of India's Startup India program.
• We were ranked as one of the top 25 startups in India in 2018.
• Our program with IIIT Bangalore has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning.
At UpGrad we have partnered with leading universities such as IIIT Bangalore, BITS Pilani, MICA Ahmedabad, IMT Ghaziabad and Cambridge University's Judge Business School to offer programs in the domains of Data, Technology and Management.

Role and Responsibilities
1. Administration of virtual learning lab: Handle the setup and administration of the virtual labs used by students enrolled in courses such as Big Data and Data Analytics. Students use these labs for practice and to run their assignments.
2. Student experience (post-program launch): Assist students with academic doubts related to the virtual labs and ensure they have a great learning experience on the UpGrad platform.
3. Academic quality assurance: Help create learning material with an in-house team of instructional designers and review its technical quality.

What we are looking for:
1. 3-4 years of project experience deploying cloud solutions (experience on Amazon Web Services (AWS) is mandatory).
2. Hands-on experience in setting up and day-to-day administration of Hadoop ecosystem tools (Hadoop, Spark, Storm, HBase), NoSQL, visualisations, etc.
3. A problem solver with demonstrated experience in solving difficult technology challenges and a can-do attitude.
4. Hands-on work with private or public cloud services in a highly available and scalable production environment.
5. Experience building tools and automation that eliminate repetitive tasks (see the sketch below).
6. Hands-on experience with Service Cloud, including User Permissions, Roles, Objects, Validation Rules, Process Builder, Workflow Rules, Communities, Visual Workflow, Email to Case, and Case Management.
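As an illustration of the lab-provisioning automation this role describes, here is a minimal boto3 sketch that launches one EC2 instance per student. It is a sketch only, not UpGrad's actual tooling: the region, AMI ID, instance type and tag values are hypothetical placeholders.

"""Illustrative only: minimal lab-provisioning automation with boto3.
The AMI ID, instance type and tags are hypothetical, not UpGrad's configuration."""
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

def provision_lab_instance(student_id: str) -> str:
    """Launch one EC2 instance for a student's Hadoop/Spark practice lab."""
    response = ec2.run_instances(
        ImageId="ami-xxxxxxxxxxxxxxxxx",  # placeholder AMI with the lab stack preinstalled
        InstanceType="t3.large",          # assumed size; tune to the lab workload
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "student_id", "Value": student_id},
                     {"Key": "purpose", "Value": "virtual-lab"}],
        }],
    )
    return response["Instances"][0]["InstanceId"]

In practice a script like this would be wrapped in a scheduler or self-service portal so that labs are created and torn down without manual steps.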

Job posted by Omkar Pradhan

Software Engineer - Database

Founded 2006
Location: Mumbai
Experience: 3 - 10 years
Salary: 6 - 20 lacs/annum (best in industry)

This role will be responsible for developing and deploying a game-changing and highly disruptive advertising technology platform. This person would also take on the following responsibilities:
• Gather and process raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
• Work closely with our engineering team to integrate your amazing innovations and algorithms into our production systems
• Support business decisions with ad hoc analysis as needed
• Propose and investigate new techniques
• Troubleshoot production issues and identify practical solutions
• Routine check-up, back-up and monitoring of the entire MySQL and Hadoop ecosystem
• Take end-to-end responsibility for the traditional databases (MySQL) and the Big Data ETL, analysis and processing life cycle in the organization
• Build, deploy and maintain real-time streaming pipelines and real-time analytics (see the sketch below)
• Manage deployments of big-data clusters across private and public cloud platforms
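To make the real-time pipeline responsibility concrete, here is a minimal PySpark Structured Streaming sketch that lands a stream of ad events from Kafka onto HDFS. It is a hedged example, not the company's actual pipeline: the broker addresses, topic name and paths are hypothetical, and the spark-sql-kafka package is assumed to be available to the job.

"""Illustrative only: a minimal Kafka-to-HDFS streaming pipeline in PySpark.
Brokers, topic and paths are hypothetical placeholders."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("realtime_ad_events").getOrCreate()

# Read a stream of ad events from Kafka (hypothetical brokers and topic).
events = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
    .option("subscribe", "ad-events")
    .load()
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp")))

# Continuously append the raw payloads to HDFS for downstream analytics.
query = (events.writeStream
    .format("parquet")
    .option("path", "hdfs:///streams/ad_events")
    .option("checkpointLocation", "hdfs:///checkpoints/ad_events")
    .outputMode("append")
    .start())

query.awaitTermination()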

Job posted by Rupal Nargolia

Tech Lead

Founded 2017
Location: Mumbai
Experience: 5 - 10 years
Salary: 8 - 20 lacs/annum (best in industry)

Job Title: Technology Lead

Responsibilities
We are looking for a Technology Lead who can drive innovation, take ownership and deliver results.
• Own one or more modules of the project under development
• Conduct system-wide requirement analysis
• Ensure quality, on-time delivery of agreed deliverables
• Mentor junior team members
• Be flexible in working under changing and different work settings
• Contribute to the company knowledge base and process improvements
• Participate in the SDLC
• Design and implement an automated unit testing framework as required
• Use best practices and coding standards
• Conduct peer reviews, lead reviews and provide feedback
• Develop, maintain, troubleshoot, enhance and document components developed by self and others as per the requirements and detailed design

Qualifications
• Excellent programming experience of 5 to 10 years in Ruby and Ruby on Rails
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Exposure to front-end technologies like HTML, CSS and JavaScript, as well as JS libraries/frameworks like jQuery, Angular and React, is a strong plus
• Exposure to DevOps on AWS is a strong plus

Compensation: Best in the industry
Job Location: Mumbai

Job posted by Suchita Upadhyay

Big Data Engineer

Founded 2015
Location: Navi Mumbai, Noida, NCR (Delhi | Gurgaon | Noida)
Experience: 1 - 2 years
Salary: 4 - 10 lacs/annum (best in industry)

Job Requirements
• Installation, configuration and administration of Big Data components (including Hadoop/Spark) for batch and real-time analytics and data hubs
• Capable of processing large sets of structured, semi-structured and unstructured data
• Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design and review
• Familiar with data architecture for data ingestion pipeline design, Hadoop information architecture, data modeling and data mining, machine learning and advanced data processing
• Optional: a visual communicator able to convert and present data in easily comprehensible visualizations using tools like D3.js and Tableau
• Enjoys being challenged and solving complex problems on a daily basis
• Proficient in executing efficient and robust ETL workflows (see the sketch below)
• Able to work in teams and collaborate with others to clarify requirements
• Able to tune Hadoop solutions to improve performance and the end-user experience
• Strong coordination and project management skills to handle complex projects
• Engineering background
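As a rough illustration of the ingestion and source-to-target mapping work listed above, here is a minimal PySpark batch ETL sketch. It is an assumption-laden example, not the employer's pipeline: the input path, column names and the mapping itself are hypothetical.

"""Illustrative only: a minimal JSON-to-Parquet ingestion job with a simple
source-to-target mapping. Paths and columns are hypothetical placeholders."""
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

# Semi-structured source: newline-delimited JSON landed by an upstream system.
raw = spark.read.json("hdfs:///landing/events/*.json")

# Simple source-to-target mapping: rename, cast and derive target columns.
curated = (raw
    .withColumnRenamed("usr_id", "user_id")
    .withColumn("event_ts", F.to_timestamp("event_time"))
    .withColumn("event_date", F.to_date("event_ts"))
    .select("user_id", "event_type", "event_ts", "event_date"))

# Write the curated layer as date-partitioned Parquet for downstream analytics.
curated.write.mode("append").partitionBy("event_date").parquet("hdfs:///curated/events")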

Job posted by Sneha Pandey

Big Data Developer

Founded 2017
Location: Mumbai
Experience: 2 - 4 years
Salary: 6 - 15 lacs/annum (best in industry)

Job Title: Software Developer – Big Data

Responsibilities
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build and own Mintifi's Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit and integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the Big Data applications, and use the feedback to improve the business logic
• Take a proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of data structures and algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and ZooKeeper would be a strong plus

Job posted by Suchita Upadhyay

Data Engineer

Founded 2015
Location: NCR (Delhi | Gurgaon | Noida), Mumbai
Experience: 2 - 7 years
Salary: 10 - 30 lacs/annum (best in industry)

About the job:
- You will work with data scientists to architect, code and deploy ML models (see the sketch below)
- You will solve problems of storing and analyzing large-scale data in milliseconds
- You will architect and develop data processing and warehouse systems
- You will code, drink, breathe and live Python, sklearn and pandas. It's good to have experience in these, but it's not a necessity as long as you're super comfortable in a language of your choice
- You will develop tools and products that give analysts ready access to the data

About you:
- Strong CS fundamentals
- You have strong experience working with production environments
- You write code that is clean, readable and tested
- Instead of doing it a second time, you automate it
- You have worked with some of the commonly used databases and computing frameworks (PostgreSQL, S3, Hadoop, Hive, Presto, Spark, etc.)
- It will be great if you have a Kaggle or GitHub profile to share
- You are an expert in one or more programming languages (Python preferred). Experience with Python-based application development and data science libraries is also good to have
- Ideally, you have 2+ years of experience in tech and/or data
- Degree in CS/Maths from Tier-1 institutes
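As a flavour of the pandas/scikit-learn workflow this posting alludes to, here is a minimal model-training sketch. It is illustrative only: the dataset path, feature columns and target are hypothetical placeholders, and deployment of the trained model is out of scope here.

"""Illustrative only: a minimal pandas + scikit-learn training step.
File path, features and target are hypothetical placeholders."""
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("loans.csv")                        # hypothetical dataset
X = df[["amount", "tenure_months", "credit_score"]]  # hypothetical features
y = df["defaulted"]                                  # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))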

Job posted by Pragya Singh

Big Data Engineer

Founded 2012
Location: Mumbai
Experience: 2 - 7 years
Salary: 4 - 20 lacs/annum (best in industry)

As a Big Data Engineer, you will build utilities that help orchestrate the migration of massive Hadoop/Big Data systems onto public cloud platforms. You will build data processing scripts and pipelines that serve many jobs and queries per day. The services you build will integrate directly with cloud services, opening the door to new and cutting-edge reusable solutions. You will work with engineering teams, co-workers and customers to gain new insights and dream up new possibilities.

The Big Data Engineering team is hiring in the following areas:
• Distributed storage and compute solutions
• Data ingestion, consolidation and warehousing
• Cloud migrations and replication pipelines (see the sketch below)
• Hybrid on-premise and in-cloud Big Data solutions
• Big Data, Hadoop and Spark processing

Basic Requirements:
• 2+ years of hands-on experience in data structures, distributed systems, Hadoop and Spark, and SQL and NoSQL databases
• Strong software development skills in at least one of Java, C/C++, Python or Scala
• Experience building and deploying cloud-based solutions at scale
• Experience developing Big Data solutions (migration, storage, processing)
• BS, MS or PhD degree in Computer Science or Engineering, and 5+ years of relevant work experience in Big Data and cloud systems
• Experience building and supporting large-scale systems in a production environment

Technology Stack:
• Cloud platforms: AWS, GCP or Azure
• Big Data distributions: any of Apache Hadoop, CDH, HDP, EMR, Google Dataproc, HDInsight
• Distributed processing frameworks: one or more of MapReduce, Apache Spark, Apache Storm, Apache Flink
• Database/warehouse: Hive, HBase, and at least one cloud-native service
• Orchestration frameworks: any of Airflow, Oozie, Apache NiFi, Google Dataflow
• Message/event solutions: any of Kafka, Kinesis, Cloud Pub/Sub
• Container orchestration (good to have): Kubernetes or Swarm
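To illustrate the migration/replication work described above, here is a minimal PySpark sketch of the copy step that moves one dataset from on-premise HDFS to cloud object storage. It is a hedged example only: the paths and bucket name are hypothetical, and it assumes the S3A (or equivalent GCS/Azure) connector is configured on the cluster.

"""Illustrative only: copy one Parquet dataset from HDFS to cloud object storage.
Paths and bucket names are hypothetical; the cloud storage connector is assumed."""
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hdfs_to_cloud_migration").getOrCreate()

def migrate_table(hdfs_path: str, cloud_path: str) -> None:
    """Read a Parquet dataset from HDFS and rewrite it to cloud storage."""
    df = spark.read.parquet(hdfs_path)
    df.write.mode("overwrite").parquet(cloud_path)

migrate_table("hdfs:///warehouse/sales_orders",
              "s3a://example-datalake/warehouse/sales_orders")

A real migration utility would loop over a table inventory, validate row counts after each copy, and record progress so interrupted runs can resume.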

Job posted by Anwar Shaikh

Hadoop Developer

Founded 2016
Location: Mumbai
Experience: 3 - 20+ years
Salary: 4 - 15 lacs/annum (best in industry)

Looking for Big Data developers in Mumbai.

Job posted by Sheela P

Technical Architect/CTO

Founded 2016
Location: Mumbai
Experience: 5 - 11 years
Salary: 15 - 30 lacs/annum (best in industry)

ABOUT US:
Arque Capital is a FinTech startup working with AI in Finance in domains like Asset Management (Hedge Funds, ETFs and Structured Products), Robo Advisory, Bespoke Research, Alternate Brokerage, and other applications of Technology & Quantitative methods in Big Finance.

PROFILE DESCRIPTION:
1. Get the "Tech" in order for the Hedge Fund: help answer the fundamentals of which technology blocks to use and which platform or technology to choose over another, and help the team visualize the product with the available resources and assets
2. Build, manage and validate a tech roadmap for our products
3. Architecture practices: at startups, the dynamics change very fast, so making sure that best practices are defined and followed by the team is very important. The CTO may have to play the garbage collector and clean up code from time to time; reviewing code quality is an important activity the CTO should own
4. Build a progressive learning culture and establish a predictable model of envisioning, designing and developing products
5. Drive product innovation through research and continuous improvement
6. Build out the technological infrastructure for the Hedge Fund
7. Hire and build out the technology team
8. Set up and manage the entire IT infrastructure, hardware as well as cloud
9. Ensure company-wide security and IP protection

REQUIREMENTS:
• Computer Science Engineer from Tier-I colleges only (IIT, IIIT, NIT, BITS, DHU, Anna University, MU)
• 5-10 years of relevant technology experience (no infra or database persons)
• Expertise in Python and C++ (3+ years minimum)
• 2+ years of experience building and managing Big Data projects
• Experience with technical design and architecture (1+ year minimum)
• Experience with high-performance computing (optional)
• Experience as a Tech Lead, IT Manager, Director, VP, or CTO
• 1+ year of experience managing cloud computing infrastructure (Amazon AWS preferred) (optional)
• Ability to work in an unstructured environment
• Looking to work in a small, startup-type environment based out of Mumbai

COMPENSATION: Co-Founder status and equity partnership

Job posted by Hrishabh Sanghvi

Freelance Faculty

Founded 2009
Location: Anywhere, United States, Canada
Experience: 3 - 10 years
Salary: 2 - 10 lacs/annum (best in industry)

To introduce myself, I head Global Faculty Acquisition for Simplilearn.

About my company: SIMPLILEARN has transformed 500,000+ careers across 150+ countries with 400+ courses, and yes, we are a Registered Professional Education Provider offering PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com

If you're interested in teaching, interacting, sharing real-life experiences and have a passion for transforming careers, please join hands with us.

Onboarding process:
• Send your updated CV to my email ID, along with copies of the relevant certificates.
• Sample e-learning access will be shared as a 15-day trial after you register on our website.
• My Subject Matter Expert will evaluate you on your areas of expertise over a telephonic conversation (duration 15 to 20 minutes).
• Commercial discussion.
• We will register you for one of our ongoing online sessions to introduce you to our course content and the Simplilearn style of teaching.
• A demo will be conducted to check your training style and internet connectivity.
• Freelancer Master Service Agreement.

Payment process:
• Once the workshop, or the last day of training for the batch, is completed, you share your invoice.
• An automated tracking ID will be shared from our automated ticketing system.
• Our faculty group will verify the details provided and share the invoice with our internal finance team to process your payment; if any additional information is required, we will coordinate with you.
• Payment will be processed within 15 working days as per policy, counted from the date the invoice is received.

Please share your updated CV to move to the next step of the onboarding process.

Job posted by STEVEN JOHN

Data Scientist

Founded 2014
Location: Navi Mumbai
Experience: 4 - 8 years
Salary: 5 - 15 lacs/annum (best in industry)

Nextalytics is an offshore research, development and consulting company based in India that focuses on high-quality and cost-effective software development and data science solutions. At Nextalytics, we have developed a culture that encourages employees to be creative, innovative and playful. We reward intelligence, dedication and out-of-the-box thinking; if you have these, Nextalytics will be the perfect launch pad for your dreams. Nextalytics is looking for smart, driven and energetic new team members.

Job posted by Harshal Patni
Why apply via CutShort?
Connect with actual hiring teams and get a fast response. No third-party recruiters. No spam.