4+ Zookeeper Jobs in India
As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, to power highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill's diverse and complex business environments. You will work in a small team that shares your passion for building, configuring, and supporting platforms while sharing, learning, and growing together.
- Develop and recommend improvements to standard and moderately complex application support processes and procedures.
- Review, analyze and prioritize incoming incident tickets and user requests.
- Perform programming, configuration, testing and deployment of fixes or updates for application version releases.
- Implement security processes to protect data integrity and ensure regulatory compliance.
- Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs.
MINIMUM QUALIFICATIONS
- Minimum of 2-4 years of experience
- Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning
- Full-ecosystem Kafka administration (Kafka, ZooKeeper, Kafka REST Proxy, Kafka Connect)
- Experience implementing Kerberos security (see the configuration sketch after this list)
- Preferred:
- Experience in Linux system administration
- Experience with authentication plugins such as Basic, SSL, and Kerberos
- Production incident support including root cause analysis
- AWS EC2
- Terraform
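For the Kerberos requirement above, here is a minimal sketch of connecting Kafka's Java AdminClient to a SASL/GSSAPI-secured cluster and running a basic health check. The broker address and service name are illustrative assumptions, and a JAAS configuration with a Kerberos keytab would also be needed at runtime.

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;
import org.apache.kafka.common.config.SaslConfigs;

public class KerberosClusterCheck {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // Hypothetical broker address; replace with the real bootstrap servers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1.example.com:9093");
        // Kerberos (GSSAPI) over TLS, as commonly used for a secured Kafka cluster.
        // A JAAS login configuration (principal + keytab) must be supplied separately.
        props.put(AdminClientConfig.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "GSSAPI");
        props.put(SaslConfigs.SASL_KERBEROS_SERVICE_NAME, "kafka");

        try (AdminClient admin = AdminClient.create(props)) {
            DescribeClusterResult cluster = admin.describeCluster();
            // Basic health check: how many brokers are reachable and which one is the controller.
            System.out.println("Brokers: " + cluster.nodes().get().size());
            System.out.println("Controller: " + cluster.controller().get());
        }
    }
}
```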
- IMMEDIATE JOINER
- 5+ years of professional experience as a Confluent Kafka Administrator
- Demonstrated design and development experience.
- Must have proven knowledge and practical application of Confluent Kafka (Producers / Consumers / Kafka Connectors / Kafka Streams / ksqlDB / Schema Registry).
- Experience in performance optimization of producers and consumers.
- Good experience debugging issues related to offsets, consumer lag, and partitions.
- Experience with administrative tasks on Confluent Kafka.
- Kafka admin experience including, but not limited to: setting up new Kafka clusters, creating topics, granting permissions, resetting offsets, purging data, setting up connectors and replicator tasks, troubleshooting issues, monitoring cluster health and performance, and backup and recovery (see the AdminClient sketch after this list).
- Experience implementing security measures for Kafka clusters, including access controls and encryption, to protect sensitive data.
- Experience installing and upgrading Kafka clusters.
- Good experience writing unit tests using JUnit and Mockito.
- Experience working on client-facing projects.
- Exposure to any cloud environment such as Azure is an added advantage.
- Experience in developing or working on REST microservices.
- Experience in Java and Spring Boot is a plus.
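As referenced above, a rough sketch of two of these administrative tasks (creating a topic and inspecting a consumer group's committed offsets, the starting point for diagnosing consumer lag) using Kafka's Java AdminClient. The bootstrap address, topic name, and group id are made-up placeholders.

```java
import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class KafkaAdminTasks {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // Hypothetical bootstrap address.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 6 partitions and replication factor 3.
            admin.createTopics(Collections.singleton(new NewTopic("orders", 6, (short) 3))).all().get();

            // Inspect the committed offsets of a (hypothetical) consumer group; comparing these
            // against the partitions' end offsets is how consumer lag is measured.
            Map<TopicPartition, OffsetAndMetadata> offsets =
                    admin.listConsumerGroupOffsets("orders-service").partitionsToOffsetAndMetadata().get();
            offsets.forEach((tp, om) -> System.out.println(tp + " -> " + om.offset()));
        }
    }
}
```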
Responsibilities:
- Provide Support Services to our Gold & Enterprise customers using our flagship product suites. This may include assistance during the engineering and operation of distributed systems, as well as responses for mission-critical systems and production customers.
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment
- Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products
Required Skills:
- 10+ years of experience with highly scalable, distributed, multi-node environments (100+ nodes)
- Hadoop operations including ZooKeeper, HDFS, YARN, Hive, and related components such as the Hive metastore, Cloudera Manager/Ambari, etc. (a ZooKeeper connectivity sketch follows this list)
- Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
- Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps
- Linux, NFS, Windows, including application installation, scripting, basic command line
- Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
- Experience working with scripting languages (Bash, PowerShell, Python)
- Working knowledge of application, server, and network security management concepts
- Familiarity with virtual machine technologies
- Knowledge of databases like MySQL and PostgreSQL
- Certification in any of the leading cloud providers (AWS, Azure, GCP) and/or Kubernetes is a big plus
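As mentioned in the Hadoop/ZooKeeper item above, here is a small sketch of a ZooKeeper connectivity check from Java, of the kind used when triaging ensemble health. The ensemble address is an illustrative placeholder; a real cluster would usually list three or five servers.

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;

import org.apache.zookeeper.WatchedEvent;
import org.apache.zookeeper.Watcher;
import org.apache.zookeeper.ZooKeeper;

public class ZookeeperHealthCheck {
    public static void main(String[] args) throws Exception {
        CountDownLatch connected = new CountDownLatch(1);

        // Hypothetical ensemble address; the watcher releases the latch once the session is live.
        ZooKeeper zk = new ZooKeeper("zk1.example.com:2181", 30_000, (WatchedEvent event) -> {
            if (event.getState() == Watcher.Event.KeeperState.SyncConnected) {
                connected.countDown();
            }
        });

        connected.await();

        // List the top-level znodes as a quick sanity check that the ensemble is serving requests.
        List<String> children = zk.getChildren("/", false);
        System.out.println("Root znodes: " + children);

        zk.close();
    }
}
```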
Top Level 5 Services Company
- Design and develop innovative, company impacting products and services to support infrastructure operations
- Design, develop and implement object-oriented PHP applications from prototype through implementation
- Integrate open source and commercial enterprise applications into an exposed API and web-based portal
- Create highly scalable and performant REST/SOAP web services (a minimal REST sketch follows this list)
- Keep the focus on end users and their goals throughout the development process
- Work closely with product management and stakeholders to ensure applications meet needs and expectations
- Adhere to the highest levels of technical discipline and excellence to set a standard for the larger development organization
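A minimal sketch of exposing a small REST endpoint of the kind described above, shown here in Java using only the JDK's built-in com.sun.net.httpserver.HttpServer. The route, port, and payload are made-up placeholders; a production service behind the API portal would add authentication, logging, and proper JSON handling.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

import com.sun.net.httpserver.HttpServer;

public class StatusService {
    public static void main(String[] args) throws IOException {
        // Serve a single REST-style endpoint on a hypothetical port.
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/api/status", exchange -> {
            // Fixed JSON body for illustration only.
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }
}
```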
Requirements
- Bachelor's Degree in Computer Science, related field, or comparable extra work experience
- Solid experience with SQL and relational databases
- Solid experience with Object Oriented Design and Development
- Experience Developing API Interfaces
- Experience with version control systems, preferably Git
Beneficial Skills
- Experience with non-relational data stores such as ZooKeeper or Memcache