Hortonworks Jobs in Bangalore (Bengaluru)


US Based Product Organization
Bengaluru (Bangalore)
10 - 15 yrs
₹25L - ₹45L / yr
Hadoop
HDFS
Apache Hive
Zookeeper
Cloudera

Responsibilities:

  • Provide support services to our Gold & Enterprise customers using our flagship product suites. This may include assistance during the engineering and operation of distributed systems, as well as responses for mission-critical systems and production customers.
  • Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product.
  • Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment.
  • Actively listen to customers and show empathy for the customer's business impact when they experience issues with our products.


Required Skills:

  • 10+ years of experience with highly scalable, distributed, multi-node environments (100+ nodes)
  • Hadoop operations, including Zookeeper, HDFS, YARN, Hive, and related components such as the Hive metastore and Cloudera Manager/Ambari
  • Authentication and security configuration and tuning (Knox, LDAP, Kerberos, SSL/TLS; second priority: SSO/OAuth/OIDC, Ranger/Sentry)
  • Java troubleshooting, e.g., collection and evaluation of jstacks and heap dumps
  • Linux, NFS, and Windows, including application installation, scripting, and basic command-line use
  • Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
  • Experience working with scripting languages (Bash, PowerShell, Python)
  • Working knowledge of application, server, and network security management concepts
  • Familiarity with virtual machine technologies
  • Knowledge of databases like MySQL and PostgreSQL
  • Certification with any of the leading cloud providers (AWS, Azure, GCP) and/or Kubernetes is a big plus
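The Java troubleshooting item above mentions collecting and evaluating jstacks. As a minimal sketch of what "evaluation" often starts with (the `summarize_jstack` helper and the sample dump text are illustrative, not from any listed product), thread-dump triage commonly begins by tallying thread states:

```python
import re
from collections import Counter

def summarize_jstack(dump_text):
    """Tally thread states in a jstack-style thread dump.

    State lines in a dump look like:
       java.lang.Thread.State: BLOCKED (on object monitor)
    """
    states = re.findall(r'java\.lang\.Thread\.State: (\w+)', dump_text)
    return Counter(states)

# Abbreviated sample dump for illustration only.
sample = """\
"main" #1 prio=5 tid=0x... nid=0x1 waiting on condition
   java.lang.Thread.State: WAITING (parking)

"worker-1" #12 prio=5 tid=0x... nid=0x2 runnable
   java.lang.Thread.State: RUNNABLE

"worker-2" #13 prio=5 tid=0x... nid=0x3 blocked
   java.lang.Thread.State: BLOCKED (on object monitor)
"""

print(summarize_jstack(sample))
```

A spike of BLOCKED threads on the same monitor is the usual first signal of lock contention in a multithreaded service.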
MNC


Agency job
via Fragma Data Systems by Harpreet Kour
Bengaluru (Bangalore)
5 - 9 yrs
₹16L - ₹20L / yr
Apache Hadoop
Hadoop
Apache Hive
HDFS
SSL
Responsibilities:
  - Responsible for the implementation and ongoing administration of Hadoop infrastructure.
  - Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
  - Working with data delivery teams to set up new Hadoop users, including setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
  - Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage.
  - Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  - Screening Hadoop cluster job performance and capacity planning.
  - Monitoring Hadoop cluster connectivity and security.
  - Managing and reviewing Hadoop log files.
  - File system management and monitoring.
  - Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
  - Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
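The capacity-planning duty above usually reduces to simple arithmetic over the HDFS replication factor (3 is the common default). A rough sketch follows; the 25% overhead headroom for temp data and growth is an assumed planning figure, not from the posting:

```python
def raw_hdfs_capacity_tb(logical_data_tb, replication=3, overhead=0.25):
    """Estimate raw disk needed for a given logical dataset size.

    replication: HDFS block replication factor (default 3).
    overhead: assumed headroom fraction for intermediate data and growth.
    """
    return logical_data_tb * replication * (1 + overhead)

print(raw_hdfs_capacity_tb(100))  # 100 TB of data -> 375.0 TB raw
```

The same arithmetic runs in reverse when deciding how many nodes to add: divide the required raw capacity by the usable disk per node.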
        
Qualifications:
  - Bachelor's degree in Information Technology, Computer Science, or another relevant field.
  - General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
  - Hadoop skills such as HBase, Hive, Pig, and Mahout.
  - Ability to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, and schedule, configure, and take backups.
  - Good knowledge of Linux, as Hadoop runs on Linux.
  - Familiarity with open-source configuration management and deployment tools such as Puppet or Chef, and with Linux scripting.

Nice to Have:
  - Knowledge of troubleshooting core Java applications is a plus.
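The NameNode high-availability item above is typically an hdfs-site.xml exercise. A minimal sketch of the core properties, following the standard HDFS HA-with-Quorum-Journal-Manager layout (the `mycluster` nameservice name and all hostnames are placeholders):

```xml
<!-- hdfs-site.xml: minimal HA nameservice sketch; names/hosts are placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <!-- JournalNodes that store the shared edit log -->
  <name>dfs.namenode.shared.edits.dir</name>
  <value>qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```

Automatic failover additionally requires a ZooKeeper quorum configured via `ha.zookeeper.quorum` in core-site.xml.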
