Hadoop Developer

at Persistent System Ltd

Bengaluru (Bangalore), Pune, Hyderabad
4 - 6 yrs
₹6L - ₹22L / yr
Full time
Skills
Apache HBase
Apache Hive
Apache Spark
Java
Python
Ruby
Ruby on Rails (ROR)
Go Programming (Golang)
Hadoop
Spark
Urgent requirement for a Hadoop Developer at a reputed MNC.

Location: Bangalore/Pune/Hyderabad/Nagpur

4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL stores: HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, and Scala
- Good knowledge of Java
- Good background in configuration management and ticketing systems such as Maven, Ant, JIRA, etc.
- Knowledge of any data integration and/or EDW tools is a plus
- Good to have: working knowledge of Python/Perl/shell scripting

 

Please note: HBase, Hive, and Spark are a must.
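
As an illustrative sketch of the kind of Hive-on-Spark work this role calls for, here is a minimal PySpark example (the database, table, and column names are hypothetical, and it assumes a reachable Hive metastore):

```python
# Minimal PySpark sketch: read a Hive table and aggregate it with Spark SQL.
# Assumes Spark was built with Hive support and a Hive metastore is reachable;
# the database, table, and column names below are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-orders-summary")
    .enableHiveSupport()
    .getOrCreate()
)

# Query an existing Hive table and aggregate with the DataFrame API.
orders = spark.sql("SELECT customer_id, amount FROM sales_db.orders")
summary = orders.groupBy("customer_id").sum("amount")

summary.show(10)
spark.stop()
```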


Similar jobs

Search Engineer

at Building the world's largest search intelligence products.

Agency job
via Qrata
Java
Python
Machine Learning (ML)
XSD
XML
Amazon Web Services (AWS)
API
Solr
Lucene
Elastic Search
Apache Kafka
Spark
Aerospike
Text mining
REST
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr

About the Role:

This role is about thinking big and executing beyond what is expected. The challenges cut across algorithmic problem solving, systems engineering, machine learning, and infrastructure at a massive scale.

Reasons to Join:

An opportunity for innovators, problem solvers, and learners. The work is innovative, empowering, rewarding, and fun, with an amazing office and competitive pay along with an excellent benefits package.

 

Requirements and Responsibilities (please read carefully before applying):

  • Overall experience of 3-6 years with Java/Python frameworks and machine learning.
  • Develop web services using REST, XSD, XML, Java, Python, AWS, and APIs.
  • Experience with Elasticsearch, Solr, or Lucene: search engines, text mining, and indexing.
  • Experience with highly scalable tools like Kafka, Spark, Aerospike, etc.
  • Hands-on experience in design, architecture, implementation, performance & scalability, and distributed systems.
  • Design, implement, and deploy highly scalable and reliable systems.
  • Troubleshoot the Solr indexing process and query engine (a minimal query sketch follows this list).
  • Bachelor's or Master's in Computer Science from a Tier 1 institution.
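
As an illustrative sketch of the Solr querying mentioned above, here is a minimal example against Solr's standard /select handler (the host, core name, and field names are hypothetical):

```python
# Minimal sketch of querying a Solr core over its REST API. The host, core
# name ("products"), and field names are hypothetical.
import requests

SOLR_URL = "http://localhost:8983/solr/products/select"

params = {
    "q": "title:laptop",     # Lucene query syntax
    "rows": 5,               # limit the result set
    "fl": "id,title,price",  # fields to return
    "wt": "json",            # response format
}

resp = requests.get(SOLR_URL, params=params, timeout=10)
resp.raise_for_status()

docs = resp.json()["response"]["docs"]
for doc in docs:
    print(doc.get("id"), doc.get("title"))
```
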
Job posted by
Prajakta Kulkarni

Python Developer

at Account Engagement Platform

Agency job
via Qrata
Python
Django
Flask
RESTful APIs
SOAP
Relational Database (RDBMS)
Linux/Unix
Hadoop
Apache Hive
TensorFlow
Remote only
6 - 8 yrs
₹29L - ₹35L / yr


A short description of the company:

It is an account engagement platform that helps B2B organizations achieve predictable revenue growth by putting the power of AI, big data, and machine learning behind every member of the revenue team.

We are looking for a Python Developer.

Required qualifications and must-have skills:
  • Excellent analytical and problem-solving skills
  • Proven deep expertise in Python programming (a minimum of 3 to 5 years of hands-on experience)
  • Experience working with frameworks like Django, Flask, etc.
  • Experience building APIs and services using REST, SOAP, etc. (a minimal REST sketch follows this list)
  • Experience with any RDBMS and strong SQL knowledge
  • Comfortable with the Unix/Linux command line
  • Object-oriented concepts & design patterns
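
As an illustrative sketch of the REST API work referenced above, here is a minimal Flask example (the resource name and fields are hypothetical; an in-memory dict stands in for a real RDBMS-backed model):

```python
# Minimal Flask REST sketch. The "customers" resource and its fields are
# hypothetical; an in-memory dict stands in for an RDBMS-backed model.
from flask import Flask, jsonify, request

app = Flask(__name__)

CUSTOMERS = {1: {"id": 1, "name": "Acme Corp"}}

@app.route("/customers/<int:customer_id>", methods=["GET"])
def get_customer(customer_id):
    customer = CUSTOMERS.get(customer_id)
    if customer is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(customer)

@app.route("/customers", methods=["POST"])
def create_customer():
    payload = request.get_json(force=True)
    new_id = max(CUSTOMERS) + 1
    CUSTOMERS[new_id] = {"id": new_id, "name": payload.get("name")}
    return jsonify(CUSTOMERS[new_id]), 201

if __name__ == "__main__":
    app.run(debug=True)
```
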
Nice-to-have skills:
  • Familiarity with UI frameworks like AngularJS/ReactJS and Redux, with a strong focus on usability design
  • Experience with big data platforms like Hadoop / Hive / Presto
  • Experience with ML/AI frameworks like TensorFlow, H2O, etc.
  • Experience with key-value stores or NoSQL databases
  • Good understanding of Docker and container platforms like Mesos and Kubernetes
  • Security-first architecture approach
  • Application benchmarking and optimization
Job posted by
Rayal Rajan

Python Developer

at Inviz Ai Solutions Private Limited

Founded 2019  •  Products & Services  •  20-100 employees  •  Profitable
Cloud Computing
Python
Django
Flask
MongoDB
Cassandra
NOSQL Databases
Amazon SQS
Amazon Web Services (AWS)
Elastic Search
Natural Language Processing (NLP)
Apache Hive
Hadoop
Apache Kafka
Bengaluru (Bangalore)
2 - 5 yrs
₹15L - ₹40L / yr

InViz is a Bangalore-based start-up helping enterprises simplify the search and discovery experiences for both their end customers and their internal users. We use state-of-the-art technologies in computer vision, natural language processing, text mining, and other ML techniques to extract information and concepts from data in different formats (text, images, videos) and make them easily discoverable through simple, human-friendly touchpoints.

 

Exp: 4-7 years

What we are looking for:

  • 3+ years of working experience in advanced Python application development
  • Experience working with cloud-native architectures and microservices frameworks in a high-volume production setup
  • Experience with asyncio or other reactive frameworks (a minimal asyncio sketch follows this list)
  • Experience working with cloud platforms (GCP, AWS) and familiarity with the modern DevOps stack
  • Experience working with large-scale databases like RDBMSs, MongoDB, Cassandra, Elasticsearch, etc.
  • Familiarity with queuing systems like Kafka/SQS/Kinesis is a plus
  • Familiarity with data warehousing (BigQuery/Redshift/Hive)
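
As an illustrative sketch of the asyncio experience referenced above, here is a minimal example of issuing HTTP requests concurrently (the URLs are hypothetical, and aiohttp is assumed to be available):

```python
# Minimal asyncio sketch: fetch several URLs concurrently instead of
# sequentially. The URLs are hypothetical; aiohttp is assumed to be installed.
import asyncio
import aiohttp

URLS = [
    "https://example.com/api/a",
    "https://example.com/api/b",
]

async def fetch(session, url):
    async with session.get(url) as resp:
        return url, resp.status

async def main():
    async with aiohttp.ClientSession() as session:
        # Run all requests concurrently and gather their results.
        results = await asyncio.gather(*(fetch(session, u) for u in URLS))
        for url, status in results:
            print(url, status)

if __name__ == "__main__":
    asyncio.run(main())
```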

Roles & Responsibilities

  • Take ownership of a product feature and build it end to end.
  • Troubleshoot issues with applications.
  • Design and build our systems for scalability.
  • Work independently on the projects, but guide engineers in building scalable systems.
  • Understand issues like response time, scalability, asynchronous systems, user engagement, and write code considering these paradigms.
Job posted by
Shridhar Nayak

Sr. Spark Software Engineer

at Technology service company

Agency job
via Jobdost
Java
J2EE
Spring Boot
Hibernate (Java)
Ansible
Git
JIRA
Apache Spark
Spark
Apache Kafka
Microservices
Kubernetes
Terraform
NOSQL Databases
API
Docker
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
  • Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or related technical field. Relevant experience of at least 3 years in lieu of above if from a different stream of education.

  • Well-versed in, with 3+ years of hands-on, demonstrable experience in:
    ▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
    ▪ Distributed cloud-native computing, including serverless functions
    ▪ Relational, object-store, document, graph, etc. database design & implementation
    ▪ Microservices architecture; API modeling, design, & programming

  • 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.

  • Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a minimal Structured Streaming sketch follows this list).

  • In-depth knowledge of standard programming languages such as Scala and/or Java.

  • 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.

  • 3+ years of hands-on development experience in one or more Relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elastic Search, Neo4J, etc.

  • Practical knowledge of distributed systems involving partitioning, bucketing, CAP theorem, replication, horizontal scaling, etc.

  • Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.

  • Ability to clearly distinguish between system and Spark job performance, and to perform Spark performance tuning and resource optimization.

  • Perform benchmarking/stress tests and document the best practices for different applications.

  • Proactively work with tenants on improving overall performance and ensure the system is resilient and scalable.

  • Good understanding of Virtualization & Containerization; must demonstrate experience in technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.

  • Well-versed with demonstrable working experience with API Management, API Gateway, Service Mesh, Identity & Access Management, Data Protection & Encryption.

  • Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.

  • Well-versed in AWS, Azure, and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS, Azure, and/or Google Cloud in any of these categories: Compute or Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, Security, Identity, & Compliance (or) equivalent demonstrable cloud platform experience.

  • Good understanding of Storage, Networks and Storage Networking basics which will enable you to work in a Cloud environment.

  • Good understanding of Network, Data, and Application Security basics which will enable you to work in a Cloud as well as Business Applications / API services environment.
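
As an illustrative sketch of the Spark Structured Streaming work referenced in this list, here is a minimal PySpark example reading from Kafka (the broker address, topic name, and checkpoint path are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath):

```python
# Minimal PySpark Structured Streaming sketch: count records per key from a
# Kafka topic. The broker address, topic name, and checkpoint path are
# hypothetical; the spark-sql-kafka package must be available to Spark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-events-count")
    .getOrCreate()
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka keys/values arrive as bytes; cast the key to string and count per key.
counts = (
    events.select(col("key").cast("string"))
    .groupBy("key")
    .count()
)

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/kafka-events-count")
    .start()
)
query.awaitTermination()
```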

Job posted by
Riya Roy

Spark Scala Developer

at Accion Labs

Founded 2009  •  Products & Services  •  100-1000 employees  •  Profitable
Apache Spark
Scala
Apache Hive
Spark
Hadoop
Bengaluru (Bangalore)
4 - 7 yrs
₹5L - ₹15L / yr

Spark / Scala experience should be more than 2 years.

A combination of Java & Scala is fine; we are also open to a Big Data Developer with strong core Java concepts - Scala / Spark Developer.

Strong proficiency in Scala on Spark (Hadoop); Scala + Java is also preferred.

Complete SDLC process and Agile Methodology (Scrum)

Version control / Git

Job posted by
Kripa Oza

Java Developer

at PODIUM SYSTEMS PRIVATE LIMITED

Founded 2016  •  Product  •  employees  •  Bootstrapped
Java
Elastic Search
Solr
Hadoop
Natural Language Processing (NLP)
Pune
1 - 4 yrs
₹4L - ₹7L / yr
  • You will be responsible for the design, development, and testing of products
  • Contributing in all phases of the development lifecycle

 

  • Writing well-designed, testable, efficient code
  • Ensure designs are in compliance with specifications

  • Prepare and produce releases of software components

  • Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review

  • Some of the technologies you will be working on: Core Java, Solr, Hadoop, Spark, Elasticsearch, clustering, text mining, NLP, Mahout, Lucene, etc.
Job posted by
Saujanya Sathe

Product Development and Solutions Architect

at Fintuple Technologies Private Ltd.

Founded 2018  •  Product  •  20-100 employees  •  Bootstrapped
Java
Product development
RESTful APIs
Spark
Database Design
Angular (2+)
Solution architecture
Technical Architecture
Chennai
5 - 8 yrs
₹10L - ₹16L / yr
Fintuple Technologies is looking to hire a hands-on Product Development Architect, a technology geek with experience working with a product startup (preferably). The interested candidate will be responsible for guiding the architecture of the platform and product development for our next stage of growth.

  • Should have strong experience in Java and the Spark API microservice framework, with a complete understanding of REST APIs
  • Strong proficiency with UI frameworks & languages such as Bootstrap, Angular, TypeScript, jQuery, etc.
  • Actively find ways (new technologies, tools, frameworks) to improve software solutions
  • Set up and maintain all environments such as build, staging, and production
  • Ability to handle multiple technologies; own the troubleshooting and debugging procedures
  • Experience in agile development methodologies and DevOps practices, incl. continuous integration, static code analysis, etc.
  • Manage the existing team to maintain the product in a fully working condition and upgrade features as and when required
  • Experience in project management and related tools
  • Familiarity with various operating systems (e.g. Windows, Mac, Linux) and databases (e.g. MySQL)
  • Proficient understanding of code versioning tools like Git & SVN
  • Implementation of security and data protection
  • Integration of data storage solutions
Job posted by
Naveen Chandramohan

Sr. Data Analyst

at Saama Technologies

Founded 1997  •  Products & Services  •  100-1000 employees  •  Profitable
Data Analytics
MySQL
Python
Spark
Tableau
Pune
6 - 11 yrs
₹1L - ₹12L / yr
Description

Requirements:
  • Overall experience of 10 years, with a minimum of 6 years of data analysis experience
  • MBA in Finance or a similar background profile
  • Ability to lead projects and work independently
  • Must have the ability to write complex SQL and perform cohort analysis, comparative analysis, etc.
  • Experience working directly with business users to build reports and dashboards and to answer business questions with data
  • Experience doing analysis using Python and Spark is a plus
  • Experience with MicroStrategy or Tableau is a plus
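
As an illustrative sketch of the cohort analysis this role mentions, here is a minimal pandas example (the column names and toy data are hypothetical):

```python
# Minimal pandas sketch of a monthly cohort analysis. The column names
# (user_id, signup_date, order_date) and the toy data are hypothetical.
import pandas as pd

orders = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3],
    "signup_date": pd.to_datetime(
        ["2023-01-05", "2023-01-05", "2023-02-10", "2023-02-10", "2023-01-20"]
    ),
    "order_date": pd.to_datetime(
        ["2023-01-07", "2023-02-15", "2023-02-12", "2023-03-01", "2023-01-25"]
    ),
})

# Cohort = signup month; period = whole months elapsed since signup.
orders["cohort"] = orders["signup_date"].dt.to_period("M")
orders["period"] = (
    (orders["order_date"].dt.year - orders["signup_date"].dt.year) * 12
    + (orders["order_date"].dt.month - orders["signup_date"].dt.month)
)

# Distinct active users per cohort and period, pivoted into a retention table.
cohort_table = (
    orders.groupby(["cohort", "period"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)
print(cohort_table)
```
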
Job posted by
Sandeep Chaudhary

Java Lead

at BlazeClan Technologies Pvt Ltd

Founded 2011  •  Products & Services  •  100-1000 employees  •  Profitable
Java
Hadoop
bigdata
Pune
5 - 8 yrs
₹12L - ₹16L / yr
BlazeClan is a Premier Amazon Web Services (AWS) Cloud Consulting company, providing cloud consulting and managed services. An organization born in the cloud, it has an international presence with offices across the ASEAN region (Malaysia and Singapore), Europe (Belgium and France), the U.S.A., and Canada, along with a strong sales presence and primary delivery center in India. Established in 2010, BlazeClan has attained various accolades including AWS Premier Consulting Partner status, AWS Managed Service Partner status, AWS Big Data Competency status, Customer Obsession Recognition 2014 and 2015, Partner of the Year ASEAN 2015, and Consulting Partner – West India.
Job posted by
Gurmeet Singh

Technical Architect/CTO

at auzmor

Founded 2017  •  Product  •  20-100 employees  •  Raised funding
Java
React.js
AngularJS (1.x)
Selenium Web driver
Hadoop
Cassandra
phpunit
codeception
Chennai
3 - 10 yrs
₹10L - ₹30L / yr
Description

Auzmor is a US-HQ'ed, funded SaaS startup focussed on disrupting the HR space. We combine passion and domain expertise and build products with a focus on great end-user experiences. We are looking for a Technical Architect to envision, build, launch, and scale multiple SaaS products.

What You Will Do:
  • Understand the broader strategy, business goals, and engineering priorities of the company and how to incorporate them into your designs of systems, components, or features
  • Design applications and architectures for multi-tenant SaaS software
  • Own the selection and use of frameworks, platforms, and design patterns for cloud-based multi-tenant SaaS applications
  • Collaborate with engineers, QA, product managers, UX designers, partners/vendors, and other architects to build scalable systems, services, and products for our diverse ecosystem of users across apps

What you will need:
  • Minimum of 5+ years of hands-on engineering experience in SaaS/cloud services environments, with architecture design and definition experience using Java/JEE, Struts, Spring, JMS & ORM (Hibernate, JPA) or other server-side technologies and frameworks
  • Strong understanding of architecture patterns such as multi-tenancy, scalability, federation, and microservices (design, decomposition, and maintenance) to build cloud-ready systems
  • Experience with server-side technologies (preferably Java or Go), frontend technologies (HTML/CSS, native JS, React, Angular, etc.), and testing frameworks and automation (PHPUnit, Codeception, Behat, Selenium, WebDriver, etc.)
  • Passion for quality and engineering excellence at scale

What we would love to see:
  • Exposure to big-data-related technologies such as Hadoop, Spark, Cassandra, MapReduce, or NoSQL, plus data management, data retrieval, data quality, ETL, and data analysis
  • Familiarity with containerized deployments and cloud computing platforms (AWS, Azure, GCP)
Job posted by
Loga B