

Hadoop Jobs in Mumbai

Explore top Hadoop job opportunities in Mumbai at top companies and startups. All jobs are added by verified employees, who can be contacted directly below.

Data Engineer

Founded 1993
Products and services
Location: Anywhere
Experience: 3 - 10 years
Salary: Best in industry, 10 - 32 lacs/annum

Data Engineering role at ThoughtWorks

ThoughtWorks India is looking for talented data engineers passionate about building large-scale data processing systems to help manage the ever-growing information needs of our clients. Our developers have been contributing code to major organizations and open source projects for over 25 years now. They’ve also been writing books, speaking at conferences, and helping push software development forward -- changing companies and even industries along the way. As Consultants, we work with our clients to ensure we’re delivering the best possible solution. Our Lead Dev plays an important role in leading these projects to success.

You will be responsible for:
• Creating complex data processing pipelines as part of diverse, high-energy teams
• Designing scalable implementations of the models developed by our Data Scientists
• Hands-on programming based on TDD, usually in a pair programming environment
• Deploying data pipelines in production based on Continuous Delivery practices

Ideally, you should have:
• 2-6 years of overall industry experience
• A minimum of 2 years of experience building and deploying large-scale data processing pipelines in a production environment
• Strong domain modelling and coding experience in Java/Scala/Python
• Experience building data pipelines and data-centric applications using distributed storage platforms like HDFS, S3 and NoSQL databases (HBase, Cassandra, etc.), and distributed processing platforms like Hadoop, Spark, Hive, Oozie, Airflow, Kafka, etc., in a production setting
• Hands-on experience with at least one of MapR, Cloudera, Hortonworks and/or Cloud (AWS EMR, Azure HDInsight, Qubole, etc.)
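The TDD-for-data-pipelines practice mentioned above can be sketched with a small, hypothetical example in plain Python (the record shape and function name are invented for illustration): the transformation is written as a pure function, so the test that drives its development runs without any cluster, and the same function can later be handed to a Spark `map()` or an Airflow task.

```python
# Minimal TDD-style sketch of one data pipeline step (hypothetical example).
# Writing the transformation as a pure function keeps it unit-testable
# in isolation from the distributed runtime that eventually executes it.

def normalize_record(raw: dict) -> dict:
    """Clean one raw event: trim strings, coerce the amount to float,
    and default a missing country to 'unknown'."""
    return {
        "user_id": raw["user_id"].strip(),
        "amount": float(raw["amount"]),
        "country": raw.get("country", "unknown").strip().lower(),
    }

def test_normalize_record():
    # In TDD this test is written first and drives the implementation above.
    raw = {"user_id": " u42 ", "amount": "19.99"}
    out = normalize_record(raw)
    assert out == {"user_id": "u42", "amount": 19.99, "country": "unknown"}

if __name__ == "__main__":
    test_normalize_record()
    print("ok")
```

In a pairing session the test typically comes first and fails, and the function body is then filled in until it passes.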
• Knowledge of software best practices like Test-Driven Development (TDD), Continuous Integration (CI) and Agile development
• Strong communication skills, with the ability to work in a consulting environment, are essential

And here are some of the perks of being part of a unique organization like ThoughtWorks:
• A real commitment to “changing the face of IT” -- our way of thinking about diversity and inclusion. Over the past ten years, we’ve implemented a lot of initiatives to make ThoughtWorks a place that reflects the world around us, and to make this a welcoming home to technologists of all stripes. We’re not perfect, but we’re actively working towards true gender balance for our business and our industry, and you’ll see that diversity reflected on our project teams and in our offices.
• Continuous learning. You’ll be constantly exposed to new languages, frameworks and ideas from your peers and as you work on different projects -- challenging you to stay at the top of your game.
• Support to grow as a technologist outside of your role at ThoughtWorks. This is why ThoughtWorkers have written over 100 books and can be found speaking at (and, ahem, keynoting) tech conferences all over the world. We love to learn and share knowledge, and you’ll find a community of passionate technologists eager to back your endeavors, whatever they may be. You’ll also receive financial support to attend conferences every year.
• An organizational commitment to social responsibility. ThoughtWorkers challenge each other to be just a little more thoughtful about the world around us, and we believe in using our profits for good. All around the world, you’ll find ThoughtWorks supporting great causes and organizations in both official and unofficial capacities.

If you relish the idea of being part of ThoughtWorks’ Data Practice that extends beyond the work we do for our customers, you may find ThoughtWorks is the right place for you.
If you share our passion for technology and want to help change the world with software, we want to hear from you!

Job posted by Suresh Teegireddy
Apply for job

Sr. Data Analyst

Founded 2010
Products and services
Location: Mumbai
Experience: 2 - 5 years
Salary: Best in industry, 7 - 15 lacs/annum

This requirement is to service a leading big data technology company that measures what matters to make cross-platform audiences and advertising more valuable. We are seeking a Senior Data Analyst to mine our large-scale data sets (2 trillion+ records/month) and provide insights to help inform our clients’ strategy for launching and analyzing marketing campaigns around the world. This analyst will be responsible for data analysis, design, and development of cross-platform audience measurement products. Their work will have a direct impact on driving business strategies for prominent industry leaders.

Self-motivation and strong communication skills are both must-haves. You will need to be comfortable working in a fast-paced work environment with shifting priorities, vague requirements, and rapid iterations. You will need to be comfortable taking risks.

Problems we are solving:
• What, where, and when do people watch video content?
• Which sites and platforms are changing the behavior of consumer video consumption?
• How and where is traditional TV viewing shifting towards internet-connected devices?
• What is the total audience for publishers with fragmented content across TV, computer, and mobile?
• How do we build flexible processes to handle the various data sources and reporting needs?

About our team: We’re a small but powerful team of data junkies, analyzing over 20 TB of data each day to deliver product and research solutions to our clients. We use Scala, Spark, SQL, Hadoop, Python, and many other tools. We work with Product Management on what problems to tackle and collaborate with the Data Science and Core Processing Engineering teams to create solutions.
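The "total audience across fragmented platforms" question above is, at its core, a deduplication problem: summing per-platform audiences double-counts people seen on more than one device. A rough sketch with invented sample data shows the idea in plain Python; at the scale this team works at, the same logic would run as a `COUNT(DISTINCT viewer_id)` in Spark or Hive rather than with in-memory sets.

```python
# Cross-platform total-audience sketch (hypothetical viewer IDs).
# Naively summing per-platform counts double-counts multi-device viewers;
# deduplicating by viewer ID gives the true unduplicated reach.

platform_viewers = {
    "tv":       {"v1", "v2", "v3"},
    "computer": {"v2", "v4"},
    "mobile":   {"v3", "v4", "v5"},
}

# Naive sum: counts v2, v3 and v4 more than once.
naive_total = sum(len(viewers) for viewers in platform_viewers.values())

# Deduplicated reach: union of all platform audiences.
total_audience = len(set().union(*platform_viewers.values()))

print(naive_total)      # 8
print(total_audience)   # 5
```

The gap between the two numbers (8 vs. 5 here) is exactly the cross-platform overlap that audience measurement products have to account for.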
Duties and responsibilities:
• Design architecture and prototype solutions for data processing and methodology
• Drive analytical projects that span multiple teams and functions
• Build and test new features and concepts and integrate them into the production process
• Participate in ongoing research and evaluation of new technologies
• Exercise your experience in the development lifecycle through analysis, design, development, testing and deployment of this system
• Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely, quality data. You will be the knowledge expert, delivering quality data to our clients

Qualifications:
• Bachelor’s degree in Data Science, Engineering, Computer Science or Information Systems
• 5 years of relevant work experience in a related field
• Experience in data analysis and problem solving with big data
• Experience with Scala, AWS, SQL, Hive, Spark, R or Python, and deep knowledge of relational databases and methods for efficiently retrieving data
• Ability to think creatively and solve complex problems
• Ability to autonomously manage simultaneous projects in a fast-paced business environment
• Excellent verbal, written and computer communication skills with strong analytical and troubleshooting skills
• Ability to consistently meet data expectations; holds team and self accountable
• Ability to manage change, course-correct, and respond decisively
• Ability to engage with senior leaders across all functional departments
• Ability to take on new responsibilities; adapts to change and executes

Job posted by Aditya Roongta
Apply for job

Data Warehouse Architect

Founded 2006
Products and services
Location: Mumbai
Experience: 7 - 15 years
Salary: Best in industry, 35 - 40 lacs/annum

1. KEY OBJECTIVE OF THE JOB
To work closely with various users, the product management team and the tech team to design, develop and strategize the Data Architecture and multidimensional databases.

2. MAJOR DELIVERABLES:
• Design an end-to-end BI and Analytics platform and present it to tech and business stakeholders
• Evaluate multiple tools and conduct proofs of concept based on requirements and budgets
• Perform dimensional modelling of multiple data marts and an enterprise data warehouse from scratch
• Understand complex OLTP (Online Transaction Processing) systems such as Order Booking, CRM, Finance, Web etc. and map schemas and data dictionaries from them
• Understand business rules around data entities and document them
• Map the business rules and OLTP entities to a dimensional model spread across multiple data marts and warehouses
• Design a robust and failsafe ETL (Extract, Transform & Load) process without relying on any tool
• Operationalise the ETL using shell and SQL scripts, without the need for any tool
• Operationalise the dimensional model and the warehousing architecture using simple standalone databases like MySQL and Postgres on Linux, or on cloud-based systems like Redshift etc.
• Model data lakes for lightly structured but highly voluminous clickstream data using Hadoop and similar technologies
• Be an extremely hands-on person who loves to create a blueprint as well as write scripts, make presentations and even set up end-to-end PoCs (Proofs of Concept) on his/her own
• Coordinate among Data Scientists, Technology Partners, Business Users, Analysts etc., and make sure they are able to use the OLAP (Online Analytical Processing) platform in the intended way
• Understand the pain points of the above stakeholders and continuously iterate on the existing platform with a completely open mind to meet their needs
• Track and continuously tune the data infrastructure for performance and scale

3. RIGHT PERSON:
Essential Attributes
• Dimensional modelling and schema design for OLAP/BI
• Command over multiple ETL, DW/data mart and BI tools
• Experience with HANA and Talend will be an added advantage
• Solution design and documentation
• Big data architecture design (Hadoop and related ecosystem)
• Propensity towards a hands-on/start-up working environment

Desirable Attributes
• Big Data and Machine Learning
• Data Science and Statistics
• E-commerce or retail domain experience

Profile
An engineer and a tech enthusiast, with total experience of at least 10 years, including 5 to 6 years of experience in data warehouse architecture, the ability to think logically, the ability to address issues related to data migration, an understanding of the importance of data dictionaries and a strong desire to establish best practices, will fit the bill.
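The tool-free ETL approach described in the deliverables above - plain SQL scripts loading a dimensional model on a standalone database - can be sketched as follows. This is a hypothetical illustration using Python's built-in sqlite3 in place of MySQL/Postgres; the table and column names are invented, but the pattern (load the dimension, resolve surrogate keys, then load the fact table, then query via a star join) is the standard one.

```python
# Minimal star-schema ETL sketch (hypothetical tables), standing in for
# the "shell + SQL scripts on a standalone database" approach.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,   -- surrogate key
        product_name TEXT UNIQUE            -- natural key from the OLTP system
    );
    CREATE TABLE fact_orders (
        order_id    INTEGER,
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")

# Extract: rows as they might arrive from a hypothetical OLTP order system.
staged = [(101, "widget", 9.5), (102, "gadget", 20.0), (103, "widget", 9.5)]

# Transform + Load: upsert the dimension, then insert facts by surrogate key.
for order_id, product, amount in staged:
    con.execute("INSERT OR IGNORE INTO dim_product (product_name) VALUES (?)",
                (product,))
    (key,) = con.execute("SELECT product_key FROM dim_product "
                         "WHERE product_name = ?", (product,)).fetchone()
    con.execute("INSERT INTO fact_orders VALUES (?, ?, ?)",
                (order_id, key, amount))

# A typical OLAP query: revenue by product via a star join.
revenue = dict(con.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_orders f JOIN dim_product d USING (product_key)
    GROUP BY d.product_name
"""))
print(revenue)  # revenue per product: widget 19.0, gadget 20.0
```

In practice each step would be a separate SQL script driven by a shell scheduler (cron or similar), which is exactly what makes the process operable without a dedicated ETL tool.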

Job posted by Mitali Jain
Apply for job

Tech Lead

Founded 2017
Products and services
Location: Mumbai
Experience: 5 - 10 years
Salary: Best in industry, 8 - 20 lacs/annum

Job Title: Technology Lead

Responsibilities
We are looking for a Technology Lead who can drive innovation, take ownership and deliver results.
• Own one or more modules of the project under development
• Conduct system-wide requirement analysis
• Ensure quality, on-time delivery of agreed deliverables
• Mentor junior team members
• Be flexible in working under changing and different work settings
• Contribute to the company knowledge base and process improvements
• Participate in the SDLC
• Design and implement an automated unit testing framework as required
• Use best practices and coding standards
• Conduct peer reviews, lead reviews and provide feedback
• Develop, maintain, troubleshoot, enhance and document components developed by self and others as per the requirements and detailed design

Qualifications
• Excellent programming experience of 5 to 10 years in Ruby and Ruby on Rails
• Good understanding of Data Structures and Algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Exposure to front-end technologies like HTML, CSS and JavaScript, as well as JS libraries/frameworks like jQuery, Angular, React etc., is a strong plus
• Exposure to DevOps on AWS is a strong plus

Compensation: Best in the industry
Job Location: Mumbai

Job posted by Suchita Upadhyay
Apply for job

Big Data Developer

Founded 2017
Products and services
Location: Mumbai
Experience: 2 - 4 years
Salary: Best in industry, 6 - 15 lacs/annum

Job Title: Software Developer – Big Data

Responsibilities
We are looking for a Big Data Developer who can drive innovation, take ownership and deliver results.
• Understand business requirements from stakeholders
• Build and own Mintifi’s Big Data applications
• Be heavily involved in every step of the product development process, from ideation to implementation to release
• Design and build systems with automated instrumentation and monitoring
• Write unit and integration tests
• Collaborate with cross-functional teams to validate and get feedback on the efficacy of results created by the big data applications, and use the feedback to improve the business logic
• Take a proactive approach to turning ambiguous problem spaces into clear design solutions

Qualifications
• Hands-on programming skills in Apache Spark using Java or Scala
• Good understanding of Data Structures and Algorithms
• Good understanding of relational and non-relational database concepts (MySQL, Hadoop, MongoDB)
• Experience with Hadoop ecosystem components like YARN and ZooKeeper would be a strong plus

Job posted by Suchita Upadhyay
Apply for job

Data Engineer

Founded 2015
Products and services
Location: NCR (Delhi | Gurgaon | Noida), Mumbai
Experience: 2 - 7 years
Salary: Best in industry, 10 - 30 lacs/annum

About the job:
- You will work with data scientists to architect, code and deploy ML models
- You will solve problems of storing and analyzing large-scale data in milliseconds
- You will architect and develop data processing and warehouse systems
- You will code, drink, breathe and live Python, sklearn and pandas. It’s good to have experience in these, but it’s not a necessity - as long as you’re super comfortable in a language of your choice
- You will develop tools and products that give analysts ready access to the data

About you:
- Strong CS fundamentals
- You have strong experience in working with production environments
- You write code that is clean, readable and tested
- Instead of doing it a second time, you automate it
- You have worked with some of the commonly used databases and computing frameworks (Psql, S3, Hadoop, Hive, Presto, Spark, etc.)
- It will be great if you have one of the following to share: a Kaggle or a GitHub profile
- You are an expert in one or more programming languages (Python preferred). It is also good to have experience with Python-based application development and data science libraries
- Ideally, you have 2+ years of experience in tech and/or data
- Degree in CS/Maths from Tier-1 institutes
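As a rough illustration of the "analyzing large-scale data in milliseconds" point, one common pattern is to precompute aggregates offline and serve lookups from an index. The sketch below (with invented event data) shows the idea in plain Python; at real scale the offline step would be a Spark or Hive batch job and the index a key-value store, but the split between batch aggregation and constant-time serving is the same.

```python
# Precomputed-aggregate sketch (hypothetical data): heavy aggregation runs
# offline once, and each online query is a dict lookup whose cost does not
# grow with the size of the raw data.
from collections import defaultdict

# Offline batch step: aggregate raw (city, amount) events once.
events = [("delhi", 120), ("mumbai", 80), ("delhi", 40), ("mumbai", 10)]
spend_by_city = defaultdict(float)
for city, amount in events:
    spend_by_city[city] += amount

# Online serving step: millisecond-class lookup against the precomputed index.
def query_spend(city: str) -> float:
    return spend_by_city.get(city, 0.0)

print(query_spend("delhi"))   # 160.0
print(query_spend("mumbai"))  # 90.0
```

The design trade-off is freshness: precomputed aggregates answer instantly but only reflect the last batch run, which is why such systems often pair a batch layer with an incremental or streaming update path.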

Job posted by Pragya Singh
Apply for job

Hadoop Developer

Founded 2016
Products and services
Location: Mumbai
Experience: 3 - 20+ years
Salary: Best in industry, 4 - 15 lacs/annum

Looking for Big Data Developers in Mumbai.

Job posted by Sheela P
Apply for job

Technical Architect/CTO

Founded 2016
Products and services
Location: Mumbai
Experience: 5 - 11 years
Salary: Best in industry, 15 - 30 lacs/annum

ABOUT US: Arque Capital is a FinTech startup working with AI in Finance, in domains like Asset Management (Hedge Funds, ETFs and Structured Products), Robo Advisory, Bespoke Research, Alternate Brokerage, and other applications of Technology & Quantitative methods in Big Finance.

PROFILE DESCRIPTION:
1. Get the "Tech" in order for the Hedge Fund - help answer the fundamentals of which technology blocks to use and the choice of one platform/tech over another, and help the team visualize the product with the available resources and assets
2. Build, manage, and validate a tech roadmap for our products
3. Architecture practices - at startups, the dynamics change very fast. Making sure that best practices are defined and followed by the team is very important. The CTO may have to be the garbage guy and clean up the code from time to time. Reviewing code quality is an important activity the CTO should own
4. Build a progressive learning culture and establish a predictable model for envisioning, designing and developing products
5. Drive product innovation through research and continuous improvement
6. Build out the technological infrastructure for the Hedge Fund
7. Hire and build out the technology team
8. Set up and manage the entire IT infrastructure - hardware as well as cloud
9. Ensure company-wide security and IP protection

REQUIREMENTS:
• Computer Science engineer from Tier-I colleges only (IIT, IIIT, NIT, BITS, DHU, Anna University, MU)
• 5-10 years of relevant technology experience (no infra or database persons)
• Expertise in Python and C++ (3+ years minimum)
• 2+ years of experience building and managing Big Data projects
• Experience with technical design & architecture (1+ years minimum)
• Experience with high-performance computing - OPTIONAL
• Experience as a Tech Lead, IT Manager, Director, VP, or CTO
• 1+ years of experience managing cloud computing infrastructure (Amazon AWS preferred) - OPTIONAL
• Ability to work in an unstructured environment
• Looking to work in a small, start-up type environment based out of Mumbai

COMPENSATION: Co-Founder status and Equity partnership

Job posted by Hrishabh Sanghvi
Apply for job

Freelance Faculty

Founded 2009
Products and services
Location: Anywhere, United States, Canada
Experience: 3 - 10 years
Salary: Best in industry, 2 - 10 lacs/annum

To introduce myself, I head Global Faculty Acquisition for Simplilearn.

About my company: SIMPLILEARN is a company which has transformed 500,000+ careers across 150+ countries with 400+ courses, and yes, we are a Registered Professional Education Provider providing PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com

If you're interested in teaching, interacting, sharing real-life experiences and have a passion to transform careers, please join hands with us.

Onboarding process
• An updated CV needs to be sent to my email id, with copies of relevant certificates
• Sample e-learning access will be shared, with a 15-day trial after your registration on our website
• My Subject Matter Expert will evaluate you on your areas of expertise over a telephonic conversation - duration 15 to 20 minutes
• Commercial discussion
• We will register you to our ongoing online session to introduce you to our course content and the Simplilearn style of teaching
• A demo will be conducted to check your training style and internet connectivity
• Freelancer Master Service Agreement

Payment process
• Once a workshop, or the last day of training for the batch, is completed, you have to share your invoice
• An automated tracking ID will be shared from our automated ticketing system
• Our faculty group will verify the details provided and share the invoice with our internal finance team to process your payment; if any additional information is required, we will coordinate with you
• Payment will be processed within 15 working days as per policy; the 15 days are counted from the date the invoice is received

Please share your updated CV to proceed to the next step of the onboarding process.

Job posted by STEVEN JOHN
Apply for job

Data Scientist

Founded 2014
Products and services
Location: Navi Mumbai
Experience: 4 - 8 years
Salary: Best in industry, 5 - 15 lacs/annum

Nextalytics is an offshore research, development and consulting company based in India that focuses on high-quality and cost-effective software development and data science solutions. At Nextalytics, we have developed a culture that encourages employees to be creative, innovative, and playful. We reward intelligence, dedication and out-of-the-box thinking; if you have these, Nextalytics will be the perfect launch pad for your dreams. Nextalytics is looking for smart, driven and energetic new team members.

Job posted by Harshal Patni
Apply for job