
8+ Cloudera Jobs in India

Apply to 8+ Cloudera Jobs on CutShort.io. Find your next job, effortlessly. Browse Cloudera Jobs and apply today!

Smartavya Analytica

Agency job
via Pluginlive by Joslyn Gomes
Mumbai
12 - 15 yrs
₹30L - ₹35L / yr
Hadoop
Cloudera
HDFS
Apache Hive
Apache Impala
+3 more

Experience: 12-15 Years

Key Responsibilities: 

  • Client Engagement & Requirements Gathering: Independently engage with client stakeholders to understand data landscapes and requirements, translating them into functional and technical specifications.
  • Data Architecture & Solution Design: Architect and implement Hadoop-based Cloudera CDP solutions, including data integration, data warehousing, and data lakes.
  • Data Processes & Governance: Develop data ingestion and ETL/ELT frameworks, ensuring robust data governance and quality practices (a minimal ingestion sketch follows this list).
  • Performance Optimization: Provide SQL expertise and optimize Hadoop ecosystems (HDFS, Ozone, Kudu, Spark Streaming, etc.) for maximum performance.
  • Coding & Development: Hands-on coding in relevant technologies and frameworks, ensuring project deliverables meet stringent quality and performance standards.
  • API & Database Management: Integrate APIs and manage databases (e.g., PostgreSQL, Oracle) to support seamless data flows.
  • Leadership & Mentoring: Guide and mentor a team of data engineers and analysts, fostering collaboration and technical excellence.
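To make the ingestion and ETL/ELT responsibility concrete, here is a minimal, illustrative PySpark sketch of a JDBC-to-Hive load on a CDP-style cluster. The connection string, credentials, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal PySpark ingestion sketch (illustrative only): pull a table from an
# RDBMS over JDBC and land it in a partitioned Hive table.
# Connection details, table names, and the partition column are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-ingest")          # hypothetical job name
    .enableHiveSupport()               # requires Hive configured on the cluster
    .getOrCreate()
)

# Read the source table over JDBC (the driver jar must be on the classpath).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/sales")  # hypothetical source
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "***")         # placeholder; use a credential store in practice
    .option("fetchsize", "10000")
    .load()
)

# Light transformation: derive a partition column and drop obvious bad rows.
cleaned = (
    orders
    .where(F.col("order_id").isNotNull())
    .withColumn("order_date", F.to_date("created_at"))
)

# Write to a partitioned Hive table; overwrite only the partitions touched.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .format("parquet")
    .saveAsTable("analytics.orders")   # hypothetical target table
)
```

Dynamic partition overwrite keeps reruns idempotent for the partitions being reloaded, which is one common way such frameworks handle reprocessing.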

Skills Required:

  a. Technical Proficiency:
     • Extensive experience with Hadoop ecosystem tools and services (HDFS, YARN, Cloudera Manager, Impala, Kudu, Hive, Spark Streaming, etc.).
     • Proficiency in programming languages such as Python and Scala, hands-on Spark development, and a strong grasp of SQL performance tuning (see the tuning sketch after this list).
     • ETL tool expertise (e.g., Informatica, Talend, Apache NiFi) and data modelling knowledge.
     • API integration skills for effective data flow management.
  b. Project Management & Communication:
     • Proven ability to lead large-scale data projects and manage project timelines.
     • Excellent communication, presentation, and critical thinking skills.
  c. Client & Team Leadership:
     • Engage effectively with clients and partners, leading onsite and offshore teams.
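Since the list above calls out SQL performance tuning on Spark, the following is a small, hedged sketch (Spark 3.x assumed; table and column names are hypothetical) of two common levers: partition pruning and broadcasting a small dimension, with the physical plan checked via explain().

```python
# Illustrative Spark SQL tuning sketch (not from the posting): prune partitions,
# broadcast a small dimension table, and inspect the physical plan.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sql-tuning-demo").enableHiveSupport().getOrCreate()

# Filtering on the partition column lets Spark read only the matching partitions.
facts = spark.table("analytics.orders").where(F.col("order_date") == "2024-01-15")
dims = spark.table("analytics.customers")

# Hint the optimizer to broadcast the small side instead of shuffling both sides.
joined = facts.join(F.broadcast(dims), "customer_id")

joined.explain(mode="formatted")   # look for BroadcastHashJoin and partition filters

daily = joined.groupBy("customer_segment").agg(F.sum("amount").alias("revenue"))
daily.show()
```

Confirming a broadcast join and pushed-down partition filters in the plan is usually the first tuning check before touching cluster-level settings.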


Smartavya

Agency job
via Pluginlive by Harsha Saggi
Mumbai
8 - 15 yrs
₹20L - ₹28L / yr
Hadoop
Apache Hive
HDFS
Spark
Data cluster
+5 more

Key Responsibilities:


• Install, configure, and maintain Hadoop clusters.

• Monitor cluster performance and ensure high availability.

• Manage Hadoop ecosystem components (HDFS, YARN, Ozone, Spark, Kudu, Hive).

• Perform routine cluster maintenance and troubleshooting.

• Implement and manage security and data governance.

• Monitor system health and optimize performance (see the health-check sketch after this list).

• Collaborate with cross-functional teams to support big data applications.

• Perform Linux administration tasks and manage system configurations.

• Ensure data integrity and backup procedures.
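As a concrete, heavily hedged illustration of the monitoring items above, the sketch below shells out to `hdfs dfsadmin -report` and flags dead DataNodes or low remaining capacity. The report format varies across Hadoop versions, so the regular expressions and thresholds are assumptions to adapt.

```python
# Illustrative cluster health-check sketch: run `hdfs dfsadmin -report` and flag
# dead DataNodes or low remaining capacity. Patterns and thresholds are assumptions.
import re
import subprocess

def hdfs_report() -> str:
    # Requires the hdfs client on PATH and a valid Kerberos ticket where applicable.
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True
    ).stdout

def check_cluster(report: str, min_remaining_pct: float = 20.0):
    warnings = []

    dead = re.search(r"Dead datanodes\s*\((\d+)\)", report)
    if dead and int(dead.group(1)) > 0:
        warnings.append("%s dead DataNode(s) reported" % dead.group(1))

    remaining = re.search(r"DFS Remaining%:\s*([\d.]+)%", report)
    if remaining and float(remaining.group(1)) < min_remaining_pct:
        warnings.append("DFS remaining only %s%%" % remaining.group(1))

    return warnings

if __name__ == "__main__":
    for w in check_cluster(hdfs_report()):
        print("WARNING:", w)
```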

US Based Product Organization

Agency job
via e-Hireo by Biswajit Banik
Bengaluru (Bangalore)
10 - 15 yrs
₹25L - ₹45L / yr
Hadoop
HDFS
Apache Hive
Zookeeper
Cloudera
+8 more

Responsibilities:

  • Provide Support Services to our Gold & Enterprise customers using our flagship product suites. This may include assistance provided during the engineering and operations of distributed systems as well as responses for mission-critical systems and production customers.
  • Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
  • Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in multithreaded and distributed environments
  • Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products


Required Skills:

  • 10+ years of experience with highly scalable, distributed, multi-node environments (100+ nodes)
  • Hadoop operations, including ZooKeeper, HDFS, YARN, Hive, and related components such as the Hive metastore, Cloudera Manager/Ambari, etc.
  • Authentication and security configuration and tuning (Knox, LDAP, Kerberos, SSL/TLS; second priority: SSO/OAuth/OIDC, Ranger/Sentry)
  • Java troubleshooting, e.g., collection and evaluation of jstacks and heap dumps (see the sketch after this list)
  • Linux, NFS, Windows, including application installation, scripting, and basic command line
  • Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
  • Experience working with scripting languages (Bash, PowerShell, Python)
  • Working knowledge of application, server, and network security management concepts
  • Familiarity with virtual machine technologies
  • Knowledge of databases like MySQL and PostgreSQL
  • Certification with any of the leading cloud providers (AWS, Azure, GCP) and/or Kubernetes is a big plus
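For the Java troubleshooting item, here is a minimal sketch of the kind of helper a support engineer might use: it captures a series of `jstack` thread dumps from a target JVM for offline analysis. The PID, output directory, and intervals are illustrative assumptions, not details from the posting.

```python
# Illustrative support sketch: capture a burst of jstack thread dumps from a
# running JVM (e.g., a misbehaving NameNode or HiveServer2 process).
import subprocess
import time
from datetime import datetime
from pathlib import Path

def collect_jstacks(pid: int, count: int = 5, interval_s: int = 10,
                    out_dir: str = "/tmp/jstacks") -> None:
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i in range(count):
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        dump = subprocess.run(
            ["jstack", "-l", str(pid)],          # jstack ships with the JDK
            capture_output=True, text=True, check=True
        ).stdout
        (out / f"jstack_{pid}_{stamp}_{i}.txt").write_text(dump)
        if i < count - 1:
            time.sleep(interval_s)               # spacing between dumps helps reveal stuck threads

# Example: collect_jstacks(12345)  # PID of the target JVM (hypothetical)
```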
one of the leading payments bank

Agency job
via Mavin RPO Solutions Pvt. Ltd. by kshiteej jagtap
Navi Mumbai
3 - 5 yrs
₹7L - ₹18L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more

Requirements:

  • Proficiency in shell scripting.
  • Proficiency in automation of tasks.
  • Proficiency in Pyspark/Python.
  • Proficiency in writing and understanding Sqoop jobs (see the sketch after this list).
  • Understanding of Cloudera Manager.
  • Good understanding of RDBMS.
  • Good understanding of Excel.
  • Familiarity with Hadoop ecosystem and its components.
  • Understanding of data loading tools such as Flume, Sqoop etc.
  • Ability to write reliable, manageable, and high-performance code.
  • Good knowledge of database principles, practices, structures, and theories.
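As a hedged illustration of the Sqoop and task-automation items above, the sketch below wraps a `sqoop import` invocation in a small Python helper so it can be parameterised and scheduled. The JDBC URL, credentials path, and target directory are hypothetical placeholders.

```python
# Illustrative automation sketch: wrap a Sqoop import in a Python helper.
import subprocess

def sqoop_import(table: str, target_dir: str, mappers: int = 4) -> None:
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://db-host:3306/sales",       # hypothetical source
        "--username", "etl_user",
        "--password-file", "hdfs:///user/etl/.db_password",   # avoid plain-text passwords
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(mappers),
        "--as-parquetfile",
    ]
    subprocess.run(cmd, check=True)

# Example: sqoop_import("orders", "/data/raw/orders")
```

Using --password-file (or a credential provider) rather than an inline --password keeps database credentials out of scripts and process listings.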
Indium Software

Posted by Ivarajneasan S K
Chennai
9 - 14 yrs
₹12L - ₹18L / yr
Apache Hadoop
Hadoop
Cloudera
HDFS
MapReduce
+2 more
Deploying and maintaining a Hadoop cluster, adding and removing nodes using cluster monitoring tools such as Ganglia, Nagios, or Cloudera Manager, configuring NameNode high availability, and keeping track of all running Hadoop jobs.

Good understanding of, or hands-on experience with, Kafka administration / Apache Kafka streaming.

Implementing, managing, and administering the overall Hadoop infrastructure.

Taking care of the day-to-day running of Hadoop clusters.

A Hadoop administrator works closely with the database, network, BI, and application teams to make sure that all big data applications are highly available and performing as expected.

If working with the open-source Apache distribution, Hadoop admins have to manually set up all the configuration files: core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml. However, when working with a popular Hadoop distribution such as Hortonworks, Cloudera, or MapR, the configuration files are set up automatically and the Hadoop admin need not configure them manually.
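For the manual-configuration case, here is a small illustrative helper (standard library only) that sets a property in one of those XML files. The path and the fs.defaultFS value are examples; on a managed distribution (Cloudera Manager, Ambari) these files are generated by the management tool and should not be hand-edited.

```python
# Illustrative sketch: set a property in a Hadoop *-site.xml configuration file.
import xml.etree.ElementTree as ET

def set_hadoop_property(conf_path: str, name: str, value: str) -> None:
    tree = ET.parse(conf_path)
    root = tree.getroot()                      # the <configuration> element
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            prop.find("value").text = value    # update the existing property
            break
    else:                                      # property not present: append it
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    tree.write(conf_path, encoding="utf-8", xml_declaration=True)

# Example: point clients at the NameNode (hypothetical host).
set_hadoop_property("/etc/hadoop/conf/core-site.xml",
                    "fs.defaultFS", "hdfs://namenode-host:8020")
```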

The Hadoop admin is responsible for capacity planning and for estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.

The Hadoop admin is also responsible for deciding the size of the Hadoop cluster based on the data to be stored in HDFS.
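A back-of-the-envelope sizing sketch for that decision is shown below; the replication factor, temporary-data overhead, and usable disk per node are common defaults and assumptions rather than figures from the posting.

```python
# Illustrative sizing sketch: estimate raw HDFS capacity and worker-node count
# from the expected data volume. All parameters are assumptions to adjust.
def estimate_cluster(data_tb: float,
                     replication: int = 3,
                     temp_overhead: float = 0.25,     # scratch/intermediate data
                     usable_tb_per_node: float = 48.0) -> dict:
    raw_tb = data_tb * replication * (1 + temp_overhead)
    nodes = max(3, -(-raw_tb // usable_tb_per_node))  # ceiling division, min 3 workers
    return {"raw_storage_tb": round(raw_tb, 1), "worker_nodes": int(nodes)}

# Example: ~200 TB of data to be stored in HDFS.
print(estimate_cluster(200.0))   # {'raw_storage_tb': 750.0, 'worker_nodes': 16}
```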

Ensuring that the Hadoop cluster is up and running at all times.

Monitoring the cluster connectivity and performance.

Manage and review Hadoop log files.

Backup and recovery tasks

Resource and security management

Troubleshooting application errors and ensuring that they do not occur again.
FarmGuide

Posted by Anupam Arya
NCR (Delhi | Gurgaon | Noida)
0 - 8 yrs
₹7L - ₹14L / yr
Computer Security
Image processing
OpenCV
Python
Rational ClearCase
+8 more
FarmGuide is a data-driven tech startup aiming to digitize the periodic processes in place and bring information symmetry to the agriculture supply chain through transparent, dynamic, and interactive software solutions. We at FarmGuide (https://angel.co/farmguide) help the Government in relevant and efficient policy making by ensuring a seamless flow of information between stakeholders.

Job Description:

We are looking for individuals who want to help us design cutting-edge scalable products to meet our rapidly growing business. We are building out the data science team and looking to hire across levels.

  • Solving complex problems in the agri-tech sector, which are long-standing open problems at the national level.
  • Applying computer vision techniques to satellite imagery to deduce artefacts of interest.
  • Applying various machine learning techniques to digitize the existing physical corpus of knowledge in the sector.

Key Responsibilities:

  • Develop computer vision algorithms for production use on satellite and aerial imagery (a minimal sketch follows below).
  • Implement models and data pipelines to analyse terabytes of data.
  • Deploy built models in a production environment.
  • Develop tools to assess algorithm accuracy.
  • Implement algorithms at scale in the commercial cloud.

Skills Required:

  • B.Tech/M.Tech in CS or other related fields such as EE or MCA from IIT/NIT/BITS, but not compulsory.
  • Demonstrable interest in Machine Learning and Computer Vision, such as coursework, open-source contribution, etc.
  • Experience with digital image processing techniques.
  • Familiarity/experience with geospatial, planetary, or astronomical datasets is valuable.
  • Experience in writing algorithms to manipulate geospatial data.
  • Hands-on knowledge of GDAL or open-source GIS tools is a plus.
  • Familiarity with cloud systems (AWS/Google Cloud) and cloud infrastructure is a plus.
  • Experience with high-performance or large-scale computing infrastructure might be helpful.
  • Coding ability in R or Python.
  • Self-directed team player who thrives in a continually changing environment.

What is on offer:

  • High-impact role in a young startup with colleagues from IITs and other Tier 1 colleges.
  • Chance to work on the cutting edge of ML (yes, we do train neural nets on GPUs).
  • Lots of freedom in terms of the work you do and how you do it.
  • Flexible timings.
  • Best startup salary in the industry with additional tax benefits.
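To make the computer-vision responsibility tangible, here is a minimal, hedged sketch using the GDAL Python bindings mentioned above to compute NDVI from a GeoTIFF. The file path and band ordering are assumptions that depend on the sensor and product being used.

```python
# Illustrative satellite-imagery sketch: read red and near-infrared bands from
# a GeoTIFF with GDAL and compute NDVI = (NIR - Red) / (NIR + Red).
import numpy as np
from osgeo import gdal

gdal.UseExceptions()

def ndvi(geotiff_path: str, red_band: int = 3, nir_band: int = 4) -> np.ndarray:
    ds = gdal.Open(geotiff_path)
    red = ds.GetRasterBand(red_band).ReadAsArray().astype("float32")
    nir = ds.GetRasterBand(nir_band).ReadAsArray().astype("float32")
    denom = nir + red
    denom[denom == 0] = np.nan          # guard against division by zero
    return (nir - red) / denom

# Example (hypothetical file): a first-pass vegetation mask.
# mask = ndvi("scene.tif") > 0.3
```

A simple NDVI threshold like this is only a first-pass heuristic; heavier ML models would typically follow in a production pipeline.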
LeadSquared
Bengaluru (Bangalore)
6 - 12 yrs
₹1L - ₹24L / yr
Cloudera
TCP/IP
Customer Relationship Management (CRM)
Genesys
.NET
+7 more
LeadSquared is looking for a Technical Manager to own the cloud telephony and contact center integration product as part of the engineering team at Bangalore.

What we do:

LeadSquared is a leading sales execution cloud platform used by over 25,000 users worldwide to accelerate revenue generation. Being a fast-growing cloud company, there are tons of exciting software engineering problems we solve, including integration, scalability, performance, automation, big-data analytics, and machine learning. It is a great opportunity for engineers who love to solve complex problems and would like to be part of a journey to build world-class software.

The Role:

Being a sales execution platform, telephony plays a key role. We are seeking an individual who can completely own the integration of LeadSquared with major cloud telephony and contact center platforms. The role involves working closely with customers, partners, and senior leadership of the company directly to define and refine a standard way to integrate with contact center platforms, and subsequently working with a small team to build a high-quality, maintainable, and secure integration system. The ideal candidate would understand how to design and build applications involving integration with cloud services and will have some experience in contact center application development, CTI, or cloud telephony. It is critical to have a strong understanding of software engineering practices such as work item management, source control, test-driven development, and continuous integration.

Key Requirements:

  • Passion for building and delivering great software with a strong sense of ownership.
  • Minimum of six years of full-stack experience in building cloud-based web applications using .NET or Java.
  • Experience in releasing software applications and supporting customers.
  • Experience in leading small teams.
  • Strong experience with build and release, Agile processes, and estimation/planning.

Quick Facts about LeadSquared:

  • Top-rated product on the crowd review site G2 Crowd.
  • Listed as the best-rated marketing automation vendor at TrustRadius.
  • Listed among the top vendors globally at GetApp.
  • Listed as one of the top 10 Marketing Automation solutions on GetApp.
  • Featured as the No. 1 Marketing Automation Software in India by NASSCOM.
  • Featured in Deloitte's Technology Fast 50 India (2014, 2016, and 2017).
  • Ranked as one of the Top 20 Most Popular Marketing Automation Software Solutions by Capterra.
  • Featured in India's Most Promising Startups in the NextBigWhat List 2013 (100+ Indian startups).
  • Among the 14 fantastic startups out of 19,400 in India.
  • In Red Herring Asia's Top 100 Finalists, 2014.

For more details about the product, please visit http://www.leadsquared.com

Here are some other online resources that you can look at:

  • https://www.g2crowd.com/products/leadsquared/reviews
  • www.facebook.com/leadsquared
  • https://www.linkedin.com/company/leadsquared

Profiles of founders: http://www.leadsquared.com/team