
Hadoop Developer

at Persistent System Ltd

4 - 6 yrs
₹6L - ₹22L / yr
Bengaluru (Bangalore), Pune, Hyderabad
Skills
Apache HBase
Apache Hive
Apache Spark
Go (Golang)
Ruby on Rails (RoR)
Ruby
Python
Java
Hadoop
Urgent requirement for a Hadoop Developer at a reputed MNC.

Location: Bangalore/Pune/Hyderabad/Nagpur

4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL stores: HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in configuration management/ticketing systems such as Maven/Ant/JIRA
- Knowledge of any Data Integration and/or EDW tools is a plus
- Good to have knowledge of Python/Perl/Shell scripting

 

Please note: HBase, Hive, and Spark are mandatory.


Similar jobs

OnActive
Posted by Mansi Gupta
Gurugram, Pune, Bengaluru (Bangalore), Chennai, Bhopal, Hyderabad, Jaipur
5 - 8 yrs
₹6L - ₹12L / yr
Python
Spark
SQL
AWS CloudFormation
Machine Learning (ML)
+3 more

Level of skills and experience:

5 years of hands-on experience using Python, Spark, and SQL.

Experience with AWS Cloud usage and management.

Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).

Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.

Experience with orchestrators such as Airflow and Kubeflow.

Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).

Fundamental understanding of Parquet, Delta Lake, and other data file formats.

Proficiency with an IaC tool such as Terraform, CDK, or CloudFormation.

Strong written and verbal English communication skills, and proficiency in communicating with non-technical stakeholders.

Helps with software development
Agency job
via Qrata by Rayal Rajan
Pune
3 - 6 yrs
₹15L - ₹25L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Design patterns
+8 more

Requirements

• Extensive and expert programming experience in at least one general-purpose programming language (e.g., Java, C, C++) and tech stack, to write maintainable, scalable, unit-tested code.

• Experience with multi-threading and concurrency programming.

• Extensive experience in object-oriented design, knowledge of design patterns, and a huge passion and ability to design intuitive modules and class-level interfaces.

• Excellent coding skills - should be able to convert design into code fluently.

• Knowledge of Test-Driven Development.

• Good understanding of relational databases (e.g., MySQL) and NoSQL stores (e.g., HBase, Elasticsearch, Aerospike).

• Strong desire to solve complex and interesting real-world problems.

• Experience with full life-cycle development in any programming language on a Linux platform.

• Go-getter attitude that reflects in the energy and intent behind assigned tasks.

• Has worked in a startup-like environment with high levels of ownership and commitment.

• BTech, MTech, or Ph.D. in Computer Science or a related technical discipline (or equivalent).

• Experience in building highly scalable business applications, which involve implementing large, complex business flows and dealing with huge amounts of data.

• 3+ years of experience in the art of writing code and solving problems on a large scale.

• Open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.

Building the world's largest search intelligence products.
Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr
Java
Python
Machine Learning (ML)
XSD
XML
+10 more

About the Role-

Thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning, and infrastructure at a massive scale.

Reason to Join-

An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with an amazing office and competitive pay along with an excellent benefits package.

Requirements and Responsibilities- (please read carefully before applying)

  • Overall experience of 3-6 years in Java/Python frameworks and Machine Learning.
  • Develop web services using REST, XSD, XML technologies, Java, Python, AWS, and APIs.
  • Experience with Elasticsearch, Solr, or Lucene (search engines, text mining, indexing).
  • Experience with highly scalable tools like Kafka, Spark, Aerospike, etc.
  • Hands-on experience in design, architecture, implementation, performance and scalability, and distributed systems.
  • Design, implement, and deploy highly scalable and reliable systems.
  • Troubleshoot the Solr indexing process and querying engine.
  • Bachelor's or Master's in Computer Science from Tier 1 institutions.
Cloudera
at Cloudera
2 recruiters
Posted by Rahamath Mallick
Remote only
6 - 10 yrs
₹26L - ₹40L / yr
Java
Kubernetes
Relational Database (RDBMS)
Data Structures
Spark
+1 more

The Cloudera Data Warehouse Hive team is looking for a passionate senior developer to join our growing engineering team. This group targets the biggest enterprises that want to use Cloudera's services in private and public cloud environments. Our product is built on open-source technologies like Hive, Impala, Hadoop, Kudu, Spark, and many more, providing unlimited learning opportunities.

 

A Day in the Life

 

Over the past 10+ years, Cloudera has experienced tremendous growth making us the leading contributor to Big Data platforms and ecosystems and a leading provider for enterprise solutions based on Apache Hadoop. You will work with some of the best engineers in the industry who are tackling challenges that will continue to shape the Big Data revolution.  We foster an engaging, supportive, and productive work environment where you can do your best work. The team culture values engineering excellence, technical depth, grassroots innovation, teamwork, and collaboration.

You will manage product development for our CDP components, develop engineering tools and scalable services to enable efficient development, testing, and release operations.  You will be immersed in many exciting, cutting-edge technologies and projects, including collaboration with developers, testers, product, field engineers, and our external partners, both software and hardware vendors.

 

Opportunity:

 

Cloudera is a leader in the fast-growing big data platforms market. This is a rare chance to make a name for yourself in the industry and in the open-source world. The candidate will be responsible for Apache Hive and CDW projects, working on them both upstream and downstream. If you are curious about the project and its code quality, you can check them at the link below; you can even start development before you join. This is one of the beauties of the OSS world.

 

Apache Hive: https://hive.apache.org/

 

Responsibilities:

  • Build robust and scalable data infrastructure software.

  • Design and create services and system architecture for your projects.

  • Improve code quality through unit tests, automation, and code reviews.

  • Write Java code and/or build services in the Cloudera Data Warehouse.

  • Work with a team of engineers who review each other's code and designs and hold each other to an extremely high bar for quality.

  • Understand the basics of Kubernetes.

  • Build out the production and test infrastructure.

  • Develop automation frameworks to reproduce issues and prevent regressions.

  • Work closely with other developers providing services to our system.

  • Help analyze and understand how customers use the product, and improve it where necessary.

 

Qualifications:

  • Deep familiarity with Java programming language.

  • Hands-on experience with distributed systems. 

  • Knowledge of database concepts, RDBMS internals.

  • Knowledge of the Hadoop stack, containers, or Kubernetes is a strong plus.  

  • Has experience working in a distributed team. 

  • Has 3+ years of experience in software development.

Technology service company
Agency job
via Jobdost by Riya Roy
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Ansible
+11 more
  • Bachelor's or Master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.

  • Well-versed in, with 3+ years of hands-on demonstrable experience in:
    ▪ Stream and batch Big Data pipeline processing using Apache Spark and/or Apache Flink
    ▪ Distributed cloud-native computing, including serverless functions
    ▪ Relational, object store, document, graph, etc. database design and implementation
    ▪ Microservices architecture, API modeling, design, and programming

  • 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.

  • Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework.

  • In-depth knowledge of standard programming languages such as Scala and/or Java.

  • 3+ years of hands-on development experience in one or more libraries and frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, or the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie, and Hive); extra points if you can demonstrate your knowledge with working examples.

  • 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.

  • Practical knowledge of distributed systems involving partitioning, bucketing, the CAP theorem, replication, horizontal scaling, etc.

  • Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in Big Data platforms.

  • Ability to clearly distinguish system and Spark job performance, and to perform Spark performance tuning and resource optimization.

  • Perform benchmarking/stress tests and document best practices for different applications.

  • Proactively work with tenants on improving overall performance, and ensure the system is resilient and scalable.

  • Good understanding of virtualization and containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.

  • Demonstrable working experience with API management, API gateways, service mesh, identity and access management, and data protection and encryption.

  • Hands-on working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality and security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.

  • Well-versed in AWS, Azure, and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or equivalent demonstrable cloud platform experience).

  • Good understanding of storage, networks, and storage networking basics, which will enable you to work in a cloud environment.

  • Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications / API services environment.

Verismo Solutions LLC
Agency job
Remote only
4 - 15 yrs
₹8L - ₹20L / yr
Python
Perl
PL/SQL
Apache Hive
Hadoop
+1 more

Looking for a part-time candidate for job support, with good skills in Python, Hadoop, Oracle, and Perl.

Working hours: 2-3 hrs of daily work for 1 year; payment from 200-700. WhatsApp +1 mad C00 Vwxe with your details.

Securonix
at Securonix
1 recruiter
Posted by Ramakrishna Murthy
Pune
3 - 7 yrs
₹10L - ₹15L / yr
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
+3 more
Securonix is a Big Data security analytics product company. Ours is the only product that delivers real-time behavior analytics (UEBA) on Big Data.
Healofy
at Healofy
3 recruiters
Posted by Shubham Maheshwari
Bengaluru (Bangalore)
1 - 7 yrs
₹15L - ₹40L / yr
Java
Google App Engine (GAE)
Apache Kafka
NOSQL Databases
Firebase
+3 more
RESPONSIBILITIES:
1. Full ownership of tech, right from driving product decisions to architecture to deployment.
2. Develop cutting-edge user experiences and build cutting-edge technology solutions like instant messaging on poor networks, live discussions, live videos, and optimal matching.
3. Use billions of data points to build a user personalisation engine.
4. Build a data network effects engine to increase engagement and virality.
5. Scale the systems to billions of daily hits.
6. Deep dive into performance, power management, memory optimisation, and network connectivity optimisation for the next billion Indians.
7. Orchestrate complicated workflows, asynchronous actions, and higher-order components.
8. Work directly with Product and Design teams.

REQUIREMENTS:
1. Should have hacked some (computer or non-computer) system to your advantage.
2. Built and managed systems with a scale of 10Mn+ daily hits.
3. Strong architectural experience.
4. Strong experience in memory management, performance tuning, and resource optimisation.
5. PREFERENCE: if you are a woman, an ex-entrepreneur, or hold a CS bachelor's degree from IIT/BITS/NIT.

P.S. If you don't fulfil one of the requirements, you need to be exceptional in the others to be considered.
auzmor
at auzmor
5 recruiters
Posted by Loga B
Chennai
3 - 10 yrs
₹10L - ₹30L / yr
Java
React.js
AngularJS (1.x)
Selenium Web driver
Hadoop
+3 more
Description

Auzmor is a US-headquartered, funded SaaS startup focused on disrupting the HR space. We combine passion and domain expertise, and we build products with a focus on great end-user experiences. We are looking for a Technical Architect to envision, build, launch, and scale multiple SaaS products.

What You Will Do:
• Understand the broader strategy, business goals, and engineering priorities of the company, and how to incorporate them into your designs of systems, components, or features.
• Design applications and architectures for multi-tenant SaaS software.
• Be responsible for the selection and use of frameworks, platforms, and design patterns for cloud-based multi-tenant SaaS applications.
• Collaborate with engineers, QA, product managers, UX designers, partners/vendors, and other architects to build scalable systems, services, and products for our diverse ecosystem of users across apps.

What you will need:
• Minimum of 5+ years of hands-on engineering experience in SaaS and cloud services environments, with architecture design and definition experience using Java/JEE, Struts, Spring, JMS, and ORM (Hibernate, JPA) or other server-side technologies and frameworks.
• Strong understanding of architecture patterns such as multi-tenancy, scalability, federation, and microservices (design, decomposition, and maintenance) to build cloud-ready systems.
• Experience with server-side technologies (preferably Java or Go), frontend technologies (HTML/CSS, native JS, React, Angular, etc.), and testing frameworks and automation (PHPUnit, Codeception, Behat, Selenium, WebDriver, etc.).
• Passion for quality and engineering excellence at scale.

What we would love to see:
• Exposure to Big Data-related technologies such as Hadoop, Spark, Cassandra, MapReduce, or NoSQL, and data management, data retrieval, data quality, ETL, and data analysis.
• Familiarity with containerized deployments and cloud computing platforms (AWS, Azure, GCP).
WNS Global Services
at WNS Global Services
7 recruiters
Posted by Jiten Chanana
Bengaluru (Bangalore)
4 - 8 yrs
₹12L - ₹25L / yr
Python
Apache Hive
Big Data
• Good experience in Python and SQL.
• Experience in Hive / Presto is a plus.
• Strong skills in using Python / R for building data pipelines and analysis.
• Good programming background:
  o Writing efficient and re-usable code
  o Comfort with working on the CLI and with tools like GitHub

Other softer aspects that are important:
• Fast learner - no matter how much programming a person has done in the past, willingness to learn new tools is key.
• An eye for standardization and scalability of processes - the person will not need to do this alone, but it will help for everyone on the team to have this orientation.
• A generalist mindset - everyone on the team will also need to work on front-end tools (Tableau and Unidash), so openness to playing a little outside the comfort zone.