
Hadoop Administrator
Posted by Ramakrishna Murthy


Locations

Bengaluru (Bangalore)

Experience

2 - 5 years

Salary

INR 5L - 15L

Skills

Hadoop
Cloudera
Hortonworks

Job description

Securonix is a security analytics product company. Our product provides real-time behavior analytics capabilities and uses the following Hadoop components: Kafka, Spark, Impala, HBase. We support very large clusters for all our customers globally, with full access to the cluster. Cloudera Certification is a big plus.

About the company

Securonix

Founded

2008

Type

Product

Size

250+ employees

Stage

Bootstrapped

Similar jobs

Data Scientist

Founded 2013
Product
6-50 employees
Raised funding
Big Data
Data Science
Machine Learning
R Programming
Python
Haskell
Hadoop
Location
Mumbai
Experience
3 - 7 years

Data Scientist - We are looking for a candidate to build great recommendation engines and power an intelligent m.Paani user journey.

Responsibilities:
- Data mining using methods like associations, correlations, inferences, clustering, graph analysis etc.
- Scale the machine learning algorithms that power our platform to support our growing customer base and increasing data volume
- Design and implement machine learning, information extraction, and probabilistic matching algorithms and models
- Care about designing the full machine learning pipeline
- Extend the company's data with 3rd-party sources
- Enhance data collection procedures
- Process, clean and verify the data collected
- Perform ad hoc analysis of the data and present clear results
- Create advanced analytics products that provide actionable insights

The Individual: We are looking for a candidate with the following skills, experience and attributes.

Required:
- 2+ years of work experience in machine learning
- Educational qualification relevant to the role: a degree in Statistics, certificate courses in Big Data, Machine Learning etc.
- Knowledge of machine learning techniques and algorithms
- Knowledge of languages and toolkits like Python, R, NumPy
- Knowledge of data visualization tools like D3.js, ggplot2
- Knowledge of query languages like SQL, Hive, Pig
- Familiarity with Big Data architecture and tools like Hadoop, Spark, MapReduce
- Familiarity with NoSQL databases like MongoDB, Cassandra, HBase
- Good applied statistics skills: distributions, statistical testing, regression etc.

Compensation & Logistics: This is a full-time opportunity. Compensation will be in line with startup norms and will be based on qualifications and experience. The position is based in Mumbai, India, and the candidate must live in Mumbai or be willing to relocate.
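The data-mining responsibilities above mention clustering. As a purely illustrative sketch (nothing here comes from m.Paani; the function and data are invented), a minimal 1-D k-means in plain Python looks like this:

```python
import random

def kmeans_1d(points, k, iters=20, seed=42):
    """Minimal 1-D k-means: assign each point to its nearest
    centroid, then recompute each centroid as its cluster mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Keep the old centroid if a cluster happens to be empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two well-separated groups around 1 and 10.
data = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
print(kmeans_1d(data, k=2))  # centroids converge near 1 and 10
```

Real pipelines would reach for scikit-learn or Spark MLlib instead, but the assign-then-recompute loop is the same idea.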

Job posted by
Julie K

Database Architect

Founded 2017
Products and services
6-50 employees
Raised funding
ETL
Data Warehouse (DWH)
DWH Cloud
Hadoop
Apache Hive
Spark
MongoDB
PostgreSQL
Location
Bengaluru (Bangalore)
Experience
5 - 10 years

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include the development of Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse which pools all the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong in SQL. Demonstrated experience with RDBMS; Unix shell scripting preferred (e.g. SQL, Postgres, MongoDB etc.)
- Experience with UNIX and comfort working with the shell (bash or Korn preferred)
- Good understanding of data warehousing concepts and Big Data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Align with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Work with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise and others
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and do capacity planning
- Monitor Hadoop cluster connectivity and security
- File system management and monitoring; HDFS support and maintenance
- Collaborate with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Define, develop, document and maintain Hive-based ETL mappings and scripts
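The capacity-planning responsibility above hides some simple arithmetic: with HDFS's default 3x replication plus headroom for non-HDFS use, raw disk needs grow quickly. A back-of-the-envelope sketch (the 25% overhead figure and node disk size are assumptions for illustration, not from the posting):

```python
import math

def raw_hdfs_capacity_tb(data_tb, replication=3, overhead=0.25):
    """Raw disk needed: replicated data, plus headroom for
    non-HDFS use (OS, logs, temp) expressed as a fraction."""
    return data_tb * replication / (1 - overhead)

def nodes_needed(data_tb, disk_per_node_tb, replication=3, overhead=0.25):
    """Round up to whole worker nodes of a given raw disk size."""
    raw = raw_hdfs_capacity_tb(data_tb, replication, overhead)
    return math.ceil(raw / disk_per_node_tb)

# 100 TB of data, 3x replication, 25% headroom, 48 TB of disk per node:
print(raw_hdfs_capacity_tb(100))  # 400.0 TB raw
print(nodes_needed(100, 48))      # 9 nodes
```

Real sizing would also account for compression, growth rate and compute needs, but this ratio is the usual starting point.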

Job posted by
Rahul Malani

Data Scientist

Founded 2015
Services
6-50 employees
Profitable
Big Data
Data Science
Machine Learning
R Programming
Python
Haskell
Hadoop
Location
Hyderabad
Experience
6 - 10 years

It is one of the largest communication technology companies in the world. It operates America's largest 4G LTE wireless network and the nation's premier all-fiber broadband network.

Job posted by
Sangita Deka

Data Scientist

Founded 2011
Services
6-50 employees
Raised funding
Big Data
Data Science
Machine Learning
R Programming
Python
Haskell
Hadoop
Location
NCR (Delhi | Gurgaon | Noida)
Experience
2 - 4 years

URGENT! My client is looking for a Data Scientist: M.Tech/PhD from a Tier 1 institute (such as IIT/IISc), with a minimum of 2-3 years of experience and skills in R, Python and Machine Learning. The position is with a very successful product development startup in the field of Artificial Intelligence and Big Data Analytics, based in Gurgaon. Send in your resume at info@bidbox.in

Job posted by
Suchit Aggarwal

Data Scientist

Founded 2017
Services
6-50 employees
Bootstrapped
Big Data
Data Science
Machine Learning
R Programming
Haskell
Hadoop
Python
Location
Bengaluru (Bangalore)
Experience
1 - 3 years

Woovly, an early-stage startup, is about awakening the interests, hobbies, and bucket lists of the individual. We at Woovly believe that every individual has a passion for some activity which, when pursued and accomplished, gives them immense happiness. Woovly connects all such individuals based on their common passions. We are in the final stage of building an online platform that enables social networking based on common interests.

Job posted by
Neha Suyal

Data Scientist

Founded 2011
Product
250+ employees
Raised funding
Big Data
Data Science
Machine Learning
R Programming
Python
Haskell
Hadoop
Location
NCR (Delhi | Gurgaon | Noida)
Experience
1 - 4 years

Position: Data Scientist
Location: Gurgaon

Job description: Shopclues is looking for a talented Data Scientist passionate about building large-scale data processing systems to help manage the ever-growing information needs of our clients.

Education: PhD/MS or equivalent in applied mathematics, statistics, physics, computer science or operations research, with 2+ years of experience in a relevant role.

Skills:
- Passion for understanding business problems and trying to address them by leveraging data characterized by high volume and high dimensionality from multiple sources
- Ability to communicate complex models and analysis in a clear and precise manner
- Experience building predictive statistical, behavioural or other models via supervised and unsupervised machine learning, statistical analysis, and other predictive modeling techniques
- Experience using R, SAS, Matlab or equivalent statistical/data analysis tools, and the ability to transfer that knowledge to different tools
- Experience with matrices, distributions and probability
- Familiarity with at least one scripting language: Python/Ruby
- Proficiency with relational databases and SQL

Responsibilities:
- Has worked in a big data environment before, alongside a big data engineering team (and data visualization team, data and business analysts)
- Translate the client's business requirements into a set of analytical models
- Perform data analysis (with a representative sample data slice) and build/prototype the model(s)
- Provide inputs to the data ingestion/engineering team on the input data required by the model: size, format, associations, cleansing required
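The "predictive statistical models" requirement above reduces, in its simplest form, to fitting a function to data. As an illustrative toy (not Shopclues code; the data is made up), ordinary least squares for a line in plain Python:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b, closed form:
    slope = cov(x, y) / var(x), intercept from the means."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]          # exactly y = 2x + 1
a, b = fit_line(xs, ys)
print(a, b)  # → 2.0 1.0
```

Production modeling would use R, scikit-learn or SAS as the posting suggests, but every regression tool is computing a generalization of this.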

Job posted by
Shrey Srivastava

Hadoop Administrator

Founded 2008
Product
250+ employees
Bootstrapped
Hadoop
Cloudera
Hortonworks
Location
Bengaluru (Bangalore)
Experience
2 - 5 years

Securonix is a security analytics product company. Our product provides real-time behavior analytics capabilities and uses the following Hadoop components: Kafka, Spark, Impala, HBase. We support very large clusters for all our customers globally, with full access to the cluster. Cloudera Certification is a big plus.

Job posted by
Ramakrishna Murthy

Hadoop Developer

Founded 2008
Product
250+ employees
Bootstrapped
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
Apache Kafka
SOLR Cloud
Apache Spark
Location
Pune
Experience
3 - 7 years

Securonix is a Big Data security analytics product company, with the only product that delivers real-time behavior analytics (UEBA) on Big Data.

Job posted by
Ramakrishna Murthy

HBase Architect Developer

Founded 2017
Products and services
6-50 employees
Bootstrapped
Apache HBase
Hadoop
MapReduce
Location
Bengaluru (Bangalore)
Experience
1 - 3 years

www.aaknet.co.in/careers/careers-at-aaknet.html
You are extraordinary, a rock star who has hardly found a place to leverage or challenge your potential, and has not spotted a skyrocketing opportunity yet? Come play with us and face the challenges we can throw at you; chances are you might be humbled (positively), but do not take it that seriously though! Please be informed that we rate character and attitude as high as, if not higher than, your great skills, experience and sharpness. :) Best wishes & regards, Team Aak!

Job posted by
Debdas Sinha

Data Scientist

Founded 2012
Product
51-250 employees
Profitable
Big Data
Data Science
Machine Learning
R Programming
Python
Haskell
Hadoop
Location
Bengaluru (Bangalore)
Experience
2 - 6 years

Do apply if any of this sounds familiar!
- You have expertise in NLP, Machine Learning, Information Retrieval and Data Mining.
- You have experience building systems based on machine learning and/or deep learning methods.
- You have expertise in graphical models like HMM, CRF etc.
- You are familiar with learning to rank, matrix factorization and recommendation systems.
- You are familiar with the latest data science trends, tools and packages.
- You have strong technical and programming skills, and you are familiar with relevant technologies and languages (e.g. Python, Java, Scala etc.)
- You have knowledge of Lucene-based search engines like Elasticsearch, Solr etc. and NoSQL DBs like Neo4j and MongoDB.
- You are really smart and you have some way of proving it (e.g. you hold an MS/M.Tech or PhD in Computer Science, Machine Learning, Mathematics, Statistics or a related field).
- There is at least one project on your resume that you are extremely proud to present.
- You have at least 4 years' experience driving projects, tackling roadblocks and navigating solutions/projects through to completion.
- Execution: the ability to manage your own time and work effectively with others on projects.
- Communication: excellent verbal and written communication skills, and the ability to communicate technical topics to non-technical individuals.

Good to have:
- Experience in a data-driven environment: leveraging analytics and large amounts of (streaming) data to drive significant business impact.
- Knowledge of MapReduce, Hadoop, Spark, etc.
- Experience in creating compelling data visualizations.
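The search-engine and recommendation items above both lean on vector similarity: Lucene-style engines score documents against a query, and many recommenders compare item vectors the same way. A toy bag-of-words cosine-similarity ranker in plain Python (the documents and query are invented for illustration):

```python
import math
from collections import Counter

def norm(v):
    """Euclidean length of a bag-of-words vector (a Counter)."""
    return math.sqrt(sum(c * c for c in v.values()))

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    return dot / (norm(a) * norm(b))

docs = ["spark streaming on hadoop",
        "hadoop cluster admin",
        "deep learning with python"]
query = Counter("hadoop spark".split())
scores = [cosine(query, Counter(d.split())) for d in docs]
best = max(range(len(docs)), key=scores.__getitem__)
print(docs[best])  # → spark streaming on hadoop
```

Elasticsearch and Solr add inverted indexes, TF-IDF/BM25 weighting and much more, but the query-document scoring idea is the same.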

Job posted by
Naveen Taalanki