
4+ HDFS Jobs in Mumbai | HDFS Job openings in Mumbai

Apply to 4+ HDFS Jobs in Mumbai on CutShort.io. Explore the latest HDFS Job opportunities across top companies like Google, Amazon & Adobe.

Smartavya Analytica

Agency job
via Pluginlive by Joslyn Gomes
Mumbai
12 - 15 yrs
₹30L - ₹35L / yr
Hadoop
Cloudera
HDFS
Apache Hive
Apache Impala
+3 more

Experience: 12-15 Years

Key Responsibilities: 

  • Client Engagement & Requirements Gathering: Independently engage with client stakeholders to understand data landscapes and requirements, translating them into functional and technical specifications.
  • Data Architecture & Solution Design: Architect and implement Hadoop-based Cloudera CDP solutions, including data integration, data warehousing, and data lakes.
  • Data Processes & Governance: Develop data ingestion and ETL/ELT frameworks, ensuring robust data governance and quality practices.
  • Performance Optimization: Provide SQL expertise and optimize Hadoop ecosystems (HDFS, Ozone, Kudu, Spark Streaming, etc.) for maximum performance.
  • Coding & Development: Hands-on coding in relevant technologies and frameworks, ensuring project deliverables meet stringent quality and performance standards.
  • API & Database Management: Integrate APIs and manage databases (e.g., PostgreSQL, Oracle) to support seamless data flows.
  • Leadership & Mentoring: Guide and mentor a team of data engineers and analysts, fostering collaboration and technical excellence.
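The ingestion and ETL/ELT responsibility above can be sketched, framework-agnostically, as an extract–transform–load pipeline with a basic data-quality gate. This is an illustrative sketch only; the function names and in-memory source are assumptions, not any specific stack named in the listing (a real job would read from and write to HDFS, Hive, or similar).

```python
# Minimal, illustrative ETL step with a data-quality gate.
# All names here are hypothetical; a production pipeline would use
# Spark, NiFi, or another framework named in the listing.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(record):
    """Transform: normalise fields and validate; return None for bad rows."""
    if not record.get("id"):
        return None  # quality gate: drop records missing a primary key
    return {"id": int(record["id"]), "name": record.get("name", "").strip()}

def load(records):
    """Load: collect cleaned records (a real job would write to HDFS/Hive)."""
    return list(records)

def run_etl(source_rows):
    cleaned = (transform(r) for r in extract(source_rows))
    return load(r for r in cleaned if r is not None)
```

The quality gate in `transform` is where governance rules (null checks, schema validation, deduplication) would live, keeping bad records out of the warehouse rather than patching them downstream.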

Skills Required:

  • a. Technical Proficiency:
    • Extensive experience with Hadoop ecosystem tools and services (HDFS, YARN, Cloudera Manager, Impala, Kudu, Hive, Spark Streaming, etc.).
    • Proficiency in Spark and in programming languages such as Python and Scala, with a strong grasp of SQL performance tuning.
    • ETL tool expertise (e.g., Informatica, Talend, Apache NiFi) and data modelling knowledge.
    • API integration skills for effective data flow management.
  • b. Project Management & Communication:
    • Proven ability to lead large-scale data projects and manage project timelines.
    • Excellent communication, presentation, and critical thinking skills.
  • c. Client & Team Leadership:
    • Engage effectively with clients and partners, leading onsite and offshore teams.


Smartavya

Agency job
via Pluginlive by Harsha Saggi
Mumbai
10 - 18 yrs
₹35L - ₹40L / yr
Hadoop
Architecture
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
PySpark
+13 more
  • Architectural Leadership:
    • Design and architect robust, scalable, and high-performance Hadoop solutions.
    • Define and implement data architecture strategies, standards, and processes.
    • Collaborate with senior leadership to align data strategies with business goals.
  • Technical Expertise:
    • Develop and maintain complex data processing systems using Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, HBase, Pig, etc.).
    • Ensure optimal performance and scalability of Hadoop clusters.
    • Oversee the integration of Hadoop solutions with existing data systems and third-party applications.
  • Strategic Planning:
    • Develop long-term plans for data architecture, considering emerging technologies and future trends.
    • Evaluate and recommend new technologies and tools to enhance the Hadoop ecosystem.
    • Lead the adoption of big data best practices and methodologies.
  • Team Leadership and Collaboration:
    • Mentor and guide data engineers and developers, fostering a culture of continuous improvement.
    • Work closely with data scientists, analysts, and other stakeholders to understand requirements and deliver high-quality solutions.
    • Ensure effective communication and collaboration across all teams involved in data projects.
  • Project Management:
    • Lead large-scale data projects from inception to completion, ensuring timely delivery and high quality.
    • Manage project resources, budgets, and timelines effectively.
    • Monitor project progress and address any issues or risks promptly.
  • Data Governance and Security:
    • Implement robust data governance policies and procedures to ensure data quality and compliance.
    • Ensure data security and privacy by implementing appropriate measures and controls.
    • Conduct regular audits and reviews of data systems to ensure compliance with industry standards and regulations.
Pion Global Solutions LTD
Posted by Sheela P
Mumbai
3 - 100 yrs
₹4L - ₹15L / yr
Spark
Big Data
Hadoop
HDFS
Apache Sqoop
+2 more
Looking for Big Data developers in Mumbai.
Accion Labs
Posted by Neha Mayekar
Mumbai
5 - 14 yrs
₹8L - ₹18L / yr
HDFS
HBase
Spark
Flume
Hive
+2 more
US-based multinational company; hands-on Hadoop experience required.
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.