
Hadoop Jobs in Bangalore (Bengaluru)

Explore top Hadoop job opportunities in Bangalore (Bengaluru) at leading companies and startups. All jobs are posted by verified employees, who can be contacted directly below.

Big Data/Java Programming

Founded 2007
Bengaluru (Bangalore)
Experience: 3 - 9 years
Salary: 3 - 9 lacs/annum

What you'll do:
- Develop analytics tools on big data in a distributed environment; scalability will be key
- Provide architectural and technical leadership on the development of our core analytics platform
- Lead development of product features in Java
- Help scale our mobile platform as we experience massive growth

What we need:
- Passion for building an analytics and personalisation platform at scale
- 3 to 9 years of software engineering experience at a product company in the data analytics/big data domain
- Passion for design and development from scratch
- Expert-level Java programming and experience leading the full application development lifecycle
- Experience with analytics, Hadoop, Pig, Hive, MapReduce, ElasticSearch, and MongoDB is an additional advantage
- Strong communication skills, verbal and written

Job posted by Khushboo Jain

Data Engineer

via Rupeek
Founded 2015
Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 12 - 22 lacs/annum

As a Data Engineer you will:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources using SQL and AWS 'big data' technologies
- Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
- Create data tools that help the analytics and data science teams build and optimize our product into an innovative industry leader
- Work with data and analytics experts to strive for greater functionality in our data systems

Qualifications for a Data Engineer:
- 4+ years of experience in a Data Engineer role
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases
- Experience building and optimizing 'big data' pipelines, architectures, and data sets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytic skills for working with unstructured datasets
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift, Kinesis
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.

Rupeek tech stack: you can take a look at our tech stack here: http://stackshare.io/AmarPrabhu/rupeek

Job posted by Bhavana Y.C

PySpark Developer

via IQVIA
Founded 1969
Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: 10 - 17 lacs/annum

1. Advanced PySpark (Python + Spark), 5-7 years of experience in Python - Must
2. Distributed processing (Cloudera cluster experience, CDSW, etc.) - Good to have
3. Object-oriented programming in Python - Must
4. Writing unit tests in Python - Must
5. Big data skills such as Hive, Hadoop, MapReduce - Good to have
6. Good knowledge of Git (branching, merging, regular commits) - Must
7. Software development experience - Must
8. Best coding practices - Must
9. Prod-ops knowledge - Nice to have
10. Experience leading teams - a senior developer who can lead a team in the future
11. Continuous integration and continuous delivery - Good to have
12. Agile - Must

Job posted by Ambili Sasidharan

Senior Software Engineer - Big Data

Founded 2003
Bengaluru (Bangalore)
Experience: 5 - 9 years
Salary: 20 - 27 lacs/annum

Description

At LogMeIn, we build beautifully simple, easy-to-use, cloud-based, cross-platform web, mobile, and desktop software products. You probably know us by such industry-defining brand names as GoToMeeting®, GoToWebinar®, JoinMe®, LastPass®, Rescue®, and BoldChat®, as well as other award-winning products and services. LogMeIn enables customers around the world to enjoy highly productive, mobile workstyles. We are currently searching for a high-caliber, innovative Big Data and Analytics Engineer who will provide useful insights into the data and enable stakeholders to make data-driven decisions. You will be part of the team building the next-generation data platform on the cloud using cutting-edge technologies such as Spark, Presto, Kinesis, EMR, Pig, Hive, and Redshift. If you're passionate about building high-quality software for data, thrive in an innovative, cutting-edge, startup-like environment, and consider yourself a top-notch Data Engineer, then LogMeIn could very well be the perfect fit for you and your career.

Responsibilities:
- Responsible for analysis, design, and development activities on multiple projects; plans, organizes, and performs the technical work within the area of specialization
- Participates in design activity with other programmers on technical aspects of the project, including functional specifications, design parameters, feature enhancements, and alternative solutions
- Meets or exceeds standards for the quality and timeliness of the work products they create (e.g., requirements, designs, code, fixes)
- Implements, unit tests, debugs, and integrates complex code; designs, writes, conducts, and directs the development of tests to verify the functionality, accuracy, and efficiency of developed or enhanced software; analyzes results for conformance to plans and specifications, making recommendations based on the results
- Provides technical direction and project management within a project/scrum team with increasing leadership of others; provides guidance in methodology selection, project planning, and the review of work products; may serve in a part-time technical lead capacity to a limited number of junior engineers, providing immediate direction and guidance
- Keeps technically abreast of trends and advancements within the area of specialization, incorporating these improvements where applicable; attends technical conferences as appropriate

Requirements:
- Bachelor's degree or equivalent in computer science or a related field preferred, with 5-8 years of directly related work experience
- Hands-on experience designing, developing, and maintaining high-volume ETL processes using big data technologies such as Pig, Hive, Oozie, Spark, and MapReduce
- Solid understanding of data warehousing concepts
- Strong understanding of dimensional data modeling
- Experience using Hadoop, S3, MapReduce, Redshift, and RDS on AWS
- Expertise in at least one visualization tool, such as Tableau, QuickSight, Power BI, Sisense, Birst, QlikView, or Looker
- Experience processing real-time streaming data
- Strong SQL and stored procedure development skills; knowledge of NoSQL is an added plus
- Knowledge of Java to leverage big data technologies is desired
- Knowledge of a scripting language, preferably Python, or the statistical programming language R is desired
- Working knowledge of the Linux environment
- Knowledge of SDLC and Agile development methodologies
- Expertise in OOAD principles and methodologies (e.g., UML) and OS concepts
- Extensive knowledge and discipline in the software engineering process; experience as a technical lead on complex projects, providing guidance on design and development approach
- Expertise in implementing, unit testing, debugging, and integrating code of moderate complexity
- Experience helping others design, write, conduct, and direct the development of tests
- Experience independently publishing papers and blogs, and creating and presenting briefings to technical audiences
- Strong critical thinking and problem-solving skills
- Approaches problems with curiosity and open-mindedness

Job posted by Kunal Banerjee

Cloud Engineer

via UpGrad
Founded 2015
Mumbai, Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 8 - 14 lacs/annum

About Us

UpGrad, founded by IIT Delhi alumni and Ronnie Screwvala, focuses on enabling universities to take their programs online. Given the team's background in the education and media sectors, we understand what it takes to offer quality online programs, and at UpGrad we invest alongside universities to build and deliver them (content, platform, technology, industry collaboration, delivery, and grading infrastructure). Some of our press highlights:
- UpGrad was selected as one of the top ten most innovative companies in India by FastCompany
- We were covered by the Financial Times along with other disruptors in ed-tech
- UpGrad is the official education partner for the Government of India's Startup India program
- We were ranked as one of the top 25 startups in India in 2018
- Our program with IIIT-B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning

At UpGrad, we have partnered with leading universities such as IIIT Bangalore, BITS Pilani, MICA Ahmedabad, IMT Ghaziabad, and Cambridge University's Judge Business School to offer programs in the domains of data, technology, and management.

Role and responsibilities:
1. Administration of the virtual learning lab: handle the setup and administration of the virtual labs used by students enrolled in courses such as Big Data and Data Analytics. Students use these labs for practice and to run their assignments.
2. Student experience (post-program launch): assist students with academic doubts related to the virtual labs and ensure they have a great learning experience on the UpGrad platform.
3. Academic quality assurance: help create learning material with an in-house team of instructional designers and review its technical quality.

What we are looking for:
1. 3-4 years of project experience deploying cloud solutions (experience with Amazon Web Services (AWS) is mandatory)
2. Hands-on experience setting up and administering Hadoop ecosystem tools (Hadoop, Spark, Storm, HBase), NoSQL, visualization, etc.
3. A problem solver with demonstrated experience solving difficult technology challenges, and a can-do attitude
4. Hands-on experience with private or public cloud services in a highly available and scalable production environment
5. Experience building tools and automation that eliminate repetitive tasks
6. Hands-on experience with Service Cloud, including user permissions, roles, objects, validation rules, Process Builder, workflow rules, communities, Visual Workflow, Email-to-Case, and case management

Job posted by Omkar Pradhan

Java/J2EE/Python

Founded 2004
Remote, Bengaluru (Bangalore)
Experience: 4 - 8 years
Salary: 12 - 24 lacs/annum

Position: R&D - Senior Engineer
Reports to: Chief Architect
Experience: 4+ years
Education: BE/ME/MS

Job summary:
- We are seeking a highly skilled, experienced Java developer to join our R&D team. In this role, you will run various proofs of concept employing new and bleeding-edge technologies, compare them with similar technologies, and draw out their merits and demerits.
- Demonstrate an MVP with small use cases; once reviewed and approved, design and develop a first-cut solution that is scalable, relevant, and critical to our company's success, then hand it over to the engineering team to take forward and guide them in making it a full-fledged product/service/solution.
- You will focus on Java/Java EE/Python development throughout and must have a solid skill set, problem-solving ability, analytical thinking, a desire to continue growing as a developer, and a team-player mentality. POCs involve experimenting with bleeding-edge technologies across languages such as Java and Python.

Duties and responsibilities:
- Provide solutions, in terms of new technologies/tools/services, for current technology bottlenecks in the product(s)
- Work on proofs of concept for product/business requirements, employing the latest technologies to understand their fit in the product's technology stack and evaluate their merits and demerits
- Gather requirements from internal and external stakeholders
- Participate in the design and implementation of essential applications
- Demonstrate expertise and add valuable input throughout the POC/development lifecycle
- Help design and implement scalable, lasting technology solutions
- Review current systems, suggesting updates as needed
- Test and debug new applications and updates
- Resolve reported issues and reply to queries in a timely manner
- Develop and utilize technical change documentation
- Strive to deploy all products and updates on time
- Help improve code quality by implementing recommended best practices
- Remain up to date on current best practices, trends, and industry developments
- Maintain a high standard of work quality and encourage others to do the same
- Help junior team members grow and develop their skills
- Identify potential challenges and bottlenecks in order to address them proactively

Requirements and qualifications:
- BS/MS/MTech in computer science or a related field required
- Minimum 4 years of experience at a reputed software firm
- Strong knowledge of computer science fundamentals such as algorithms and data structures
- Strong problem-solving and analytical thinking capability
- Strong working knowledge of Java and J2EE technologies
- Significant experience working with SQL
- Significant experience working with NoSQL stores such as Mongo, Dynamo, MemSQL, or graph DBs
- Significant experience working with Elastic Cache
- Significant experience working with distributed architectures
- Knowledge of or working experience in Python
- Significant experience working with web services and REST frameworks
- Experience with AWS (S3, Lambda, Kinesis, SQS) highly desired
- Experience with frameworks like Spring, Hadoop, Spark, and Kafka a plus
- Experience with machine learning and NLP a plus
- Familiarity with Elasticsearch
- Familiarity with Java web application servers such as Tomcat, WebLogic, and JBoss
- Familiarity with microservices and/or Spring Boot
- Familiarity with HTML, CSS, and JavaScript
- Having hobby projects is a plus

Manthan profile: Manthan is the Chief Analytics Officer for consumer industries worldwide. Manthan's portfolio of analytics-enabled business applications, advanced analytics platforms, and solutions is architected to help users across industries walk the complete data-to-result path: analyze, take guided decisions, and execute those decisions in real time. Sophisticated yet intuitive analytical capability, coupled with the power of big data, mobility, and cloud computing, brings users business-ready applications that provide on-demand access and real-time execution, the only path to profit in a contemporary, on-demand, connected economy. Manthan serves over 200 leading organizations across 23 countries. With the recent introduction of Maya, the world's first AI-powered conversational agent for business analytics, Manthan is pioneering the move to zero-touch UIs and transforming user interactions with complex analytics applications. Manthan is one of the most awarded analytics innovators among analysts and customers alike. To learn how businesses can gain from analytics, please visit https://www.manthan.com

Job posted by Gautham K

Senior Software Engineer (Java/Scala/SOLR/Data/Hadoop)

Founded 2015
Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: 10 - 15 lacs/annum

Interested in building high-performance search systems that handle petabytes of retail data, while working in an agile, small-company environment? At CodeHall Technologies, you will have the opportunity to work with the newest technology in search and browse. We are working on systems that power and personalize site search, considering the user intent behind every query and providing a wholly unique, engaging search experience designed to display the most relevant results through findability.

Primary responsibilities:
- Build high-performance search systems for personalization, optimization, and targeting
- Build systems with Hadoop, Solr, Cassandra, Flink, Spark, and MongoDB
- Apply a deep understanding of HTTP and REST principles
- Apply good diagnostic and troubleshooting skills
- Unit test with JUnit; performance test and tune
- Work with rapid, innovative development methodologies: Kanban, continuous integration, and daily deployments
- Demonstrate highly proficient software engineering skills in Java
- Coordinate with internal and external teams
- Mentor junior engineers
- Participate in product design discussions and decisions

Minimum requirements:
- BS/MS in CS, Electrical Engineering, or a foreign equivalent, plus relevant software development experience
- At least 5-8 years of software development experience
- Expert in Java, Scala, or another object-oriented language
- Proficient in SQL concepts (HiveQL or Postgres a plus)
- Additional language skills for scripting and rapid application development

Desired skills and experience:
- Working with large data sets in the petabytes
- Familiarity with UNIX (systems skills a plus)
- Working experience with Solr, Cassandra, MongoDB, and Hadoop
- Experience in a distributed environment, dealing with challenges around scaling and performance
- Proven ability to project and meet scheduled deadlines
- Self-driven, quick learner with attention to detail and quality

Job posted by Avneesh Jain

Python Developer - Data Analytics Startup

Founded 2015
Bengaluru (Bangalore)
Experience: 2 - 5 years
Salary: 4 - 10 lacs/annum

Crediwatch - Amplified Intelligence

Crediwatch is in the business of extracting valuable insights by applying artificial intelligence and deep learning, and delivering them to chief decision makers. Crediwatch believes in zero human touch and in building systems to augment human intelligence.

Role: We are looking for a smart, dedicated engineer with strong fundamentals and coding skills for the Python engineering team at our data analytics startup. You will be responsible for the design, development, and productionization of all Python-based projects, and will drive the agile development process for the Python-based teams/projects. You will work on every level of the stack, and will take part in driving software standards and guidelines, performance analysis, benchmarking, and detailed design of the system.

Desired candidate profile: You should possess the aptitude to take up any task, investigate independently, and come up with multiple solutions, highlighting the pros and cons of each. You need to have a healthy startup attitude. We are looking for candidates who have a solid understanding of concepts and best practices, and a good approach to learning and problem solving.

Experience: At least 2 years of hands-on Python development experience, with sound knowledge of design patterns and application design.

Must have:
- Strong knowledge of Python, along with hands-on experience using Django
- Experience with NoSQL, specifically MongoDB
- Good understanding of queuing mechanisms, including Redis/RabbitMQ (or similar)
- Knowledge of Python web-crawling frameworks like Scrapy and Frontera is a must
- Strong Linux skills
- Knowledge of building large-scale, multi-threaded applications is a plus

Nice to have:
- Experience designing and building RESTful web services
- Experience designing, configuring, and implementing Hadoop/HDFS and HBase
- Knowledge of graph databases
- Experience with Elasticsearch
- Prior experience architecting large-scale distributed systems
- Experience with cloud deployments on AWS/Azure/DO preferred
- Experience with unit testing

Just to add: we have a creative workplace and an open work culture, where creativity and out-of-the-box thinking are encouraged and nurtured. Some perks: excellent filter coffee, free lunches, PS4 and foosball breaks, mobile/broadband allowances, freedom to sit in any corner of the office, and a stocked kitchen, topped up with a nice set of people to work with!

Job posted by Rohan Pannalkar

Senior Python Developer

Founded 2015
Bengaluru (Bangalore)
Experience: 3 - 7 years
Salary: 6 - 14 lacs/annum

Crediwatch offers the opportunity to work with the latest technologies and is building a completely automated platform for curating public-domain information to help build insights and drive business decisions. Crediwatch is working with a host of clients across the banking, NBFC, legal, and technology sectors to present data and insights like never before. Crediwatch has received accolades in the Citibank Tech4Integrity challenge (worldwide), the Barclays Rise accelerator, and Tech30 by YourStory, to name a few. We are based in the heart of Bangalore and are growing fast. The requirements for the current opening are as follows.

Primary skills:
1. At least 3 years of hands-on development experience, with sound knowledge of design patterns and application design
2. Strong knowledge of Python, along with hands-on experience using Django
3. Experience with NoSQL, specifically MongoDB
4. Good understanding of queuing mechanisms, including Redis/RabbitMQ (or similar)
5. Knowledge of Python web-crawling frameworks like Scrapy and Frontera is a must
6. Strong Linux skills
7. Knowledge of building large-scale, multi-threaded applications is a plus

Secondary skills:
1. Experience designing and building RESTful web services
2. Experience designing, configuring, and implementing Hadoop/HDFS and HBase is a plus
3. Knowledge of graph databases is a bonus
4. Experience with Elasticsearch is a plus
5. Prior experience architecting large-scale distributed systems is a plus
6. Experience with cloud deployments on AWS/Azure preferred

Roles & responsibilities:
1. We are looking for a dedicated senior engineer to lead the Python-driven dev team
2. You will be responsible for the design, development, and productionization of all Python-based projects
3. You will also serve as a mentor, guiding and enabling the team to deliver
4. You will drive the agile development process for the Python-based teams/projects
5. Work on every level of the stack
6. Drive software standards and guidelines, performance analysis, benchmarking, and detailed design of the system
7. Work in a highly agile, fast-growing startup environment

Job posted by Rohan Pannalkar


DevOps Engineer

Founded 1998
Bengaluru (Bangalore)
Experience: 2 - 8 years
Salary: 10 - 40 lacs/annum

What is the job like?

We are looking for a talented individual to join our DevOps and Platform Engineering team. You will play an important role in helping build and run our globally distributed infrastructure stack and platforms. Technologies you can expect to work with every day include Linux, AWS, MySQL/PostgreSQL, MongoDB, Hadoop/HBase, ElasticSearch, FreeSWITCH, Jenkins, Nagios, and CFEngine, among others.

Responsibilities:
- Troubleshoot and fix production outages and performance issues in our AWS/Linux infrastructure stack
- Build automation tools for provisioning and managing our cloud infrastructure by leveraging the AWS APIs for EC2, S3, CloudFront, RDS, and Route 53, among others
- Contribute to enhancing and managing our continuous delivery pipeline
- Proactively seek out opportunities to improve monitoring and alerting of our hosts and services, and implement them in a timely fashion
- Write scripts and tools to collect and visualize metrics from Linux hosts and JVM applications
- Enhance and maintain our log collection, processing, and visualization infrastructure
- Automate systems configuration by writing policies and modules for configuration management tools
- Write both frontend (HTML/CSS/JS) and backend code (Python, Ruby, Perl)
- Participate in periodic on-call rotations for DevOps

Skills:
- DevOps/system administration experience ranging between 3-4 years
- In-depth Linux/Unix knowledge; good understanding of the various Linux kernel subsystems (memory, storage, network, etc.)
- DNS, TCP/IP, routing, HA, and load balancing
- Configuration management using tools like CFEngine, Puppet, or Chef
- SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB, and HBase
- Build and packaging tools like Jenkins and RPM/Yum
- HA and load balancing using tools like Elastic Load Balancing and HAProxy
- Monitoring tools like Nagios, Pingdom, or similar
- Log management tools like Logstash, Fluentd, syslog, Elasticsearch, or similar
- Metrics collection tools like Ganglia, Graphite, OpenTSDB, or similar
- Programming in a high-level language like Python or Ruby

Job posted by Richa Pancholy
Apply for job

Hadoop Developer

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 4 - 7 years
Salary (best in industry): 23 - 30 lacs/annum

Position Description
Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans. Models compliance with company policies and procedures and supports the company's mission, values, and standards of ethics and integrity. Provides and supports the implementation of business solutions. Supports the business, troubleshoots business and production issues, and provides on-call support.

Minimum Qualifications
- BS/MS in Computer Science or a related field
- 5+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of major algorithms like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful microservices, and knowledge of major software patterns like MVC, Singleton, Facade and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, e.g. Scrum and Kanban
- Experience with performance tuning for very large-scale apps
- Experience writing scripts in Perl, Python and shell
- Experience writing jobs using open-source cluster computing frameworks like Spark
- Relational database design experience (MySQL, Oracle), plus SOLR and NoSQL stores such as Cassandra, MongoDB and Hive
- Aptitude for writing clean, succinct and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment
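The Spark/MapReduce requirement above refers to the classic map-shuffle-reduce model. A minimal pure-Python sketch of that model (an illustration only, independent of any Spark or Hadoop installation) looks like:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word, as mappers do."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group emitted values by key before reduction."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["hadoop and spark", "spark jobs on hadoop"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # → {'hadoop': 2, 'and': 1, 'spark': 2, 'jobs': 1, 'on': 1}
```

In PySpark the same pipeline is roughly `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(operator.add)`, with the shuffle handled by the cluster.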

Job posted by Sampreetha Pai
Apply for job

Hadoop Engineers

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 4 - 7 years
Salary (best in industry): 24 - 30 lacs/annum

Position Description
Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans. Models compliance with company policies and procedures and supports the company's mission, values, and standards of ethics and integrity. Provides and supports the implementation of business solutions. Supports the business, troubleshoots business and production issues, and provides on-call support.

Minimum Qualifications
- BS/MS in Computer Science or a related field
- 5+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of major algorithms like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful microservices, and knowledge of major software patterns like MVC, Singleton, Facade and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, e.g. Scrum and Kanban
- Experience with performance tuning for very large-scale apps
- Experience writing scripts in Perl, Python and shell
- Experience writing jobs using open-source cluster computing frameworks like Spark
- Relational database design experience (MySQL, Oracle), plus SOLR and NoSQL stores such as Cassandra, MongoDB and Hive
- Aptitude for writing clean, succinct and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment

Job posted by Sampreetha Pai
Apply for job

Hadoop Lead Engineers

Founded 2012
Location: Bengaluru (Bangalore)
Experience: 7 - 9 years
Salary (best in industry): 27 - 34 lacs/annum

Position Description
Assists in guiding small groups of two to three engineers, including offshore associates, on assigned engineering projects. Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans. Generates weekly, monthly and yearly reports using JIRA and open-source tools and provides updates to leadership teams. Proactively identifies issues and the root causes of critical issues. Works with cross-functional teams, sets up KT sessions and mentors team members. Coordinates with the Sunnyvale and Bentonville teams. Models compliance with company policies and procedures and supports the company's mission, values, and standards of ethics and integrity. Provides and supports the implementation of business solutions. Supports the business, troubleshoots business and production issues, and provides on-call support.

Minimum Qualifications
- BS/MS in Computer Science or a related field
- 8+ years' experience building web applications
- Solid understanding of computer science principles
- Excellent soft skills
- Understanding of major algorithms like searching and sorting
- Strong skills in writing clean code using languages like Java and J2EE technologies
- Understanding of how to engineer RESTful microservices, and knowledge of major software patterns like MVC, Singleton, Facade and Business Delegate
- Deep knowledge of web technologies such as HTML5, CSS and JSON
- Good understanding of continuous integration tools and frameworks like Jenkins
- Experience working in Agile environments, e.g. Scrum and Kanban
- Experience with performance tuning for very large-scale apps
- Experience writing scripts in Perl, Python and shell
- Experience writing jobs using open-source cluster computing frameworks like Spark
- Relational database design experience (MySQL, Oracle), plus SOLR and NoSQL stores such as Cassandra, MongoDB and Hive
- Aptitude for writing clean, succinct and efficient code
- Attitude to thrive in a fun, fast-paced, start-up-like environment

Job posted by Sampreetha Pai
Apply for job

Data Science Engineer (SDE I)

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 1 - 3 years
Salary (best in industry): 12 - 20 lacs/annum

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and talks data and algorithms, loves to play with big data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, big data analytics, and Unix and production servers. A Tier-1 college (BE from IITs, BITS Pilani, top NITs or IIITs, or MS from Stanford, Berkeley, CMU or UW-Madison) or an exceptionally bright work history is a must. Let us know if this interests you enough to explore the profile further.

Job posted by Shobhit Agarwal
Apply for job

Senior Data Engineer (SDE II)

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 2 - 7 years
Salary (best in industry): 15 - 30 lacs/annum

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. The founding team consists of BITS Pilani alumni with experience creating global startup success stories. The core team we are building consists of some of the best minds in India in artificial intelligence research and data engineering. We are looking to fill several roles requiring 2-7 years of research or large-scale production implementation experience with:
- Rock-solid algorithmic capabilities
- Production deployments of massively large-scale systems, real-time personalization, big data analytics, and semantic search
- Or credible research experience innovating new ML algorithms and neural nets
A GitHub profile link is highly valued. For the right fit into the Couture.ai family, compensation is no bar.

Job posted by Shobhit Agarwal
Apply for job

Lead Data Engineer (SDE III)

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary (best in industry): 25 - 55 lacs/annum

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined 200+ million end users. For this role, a credible display of innovation in past projects is a must. We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience with:
- Proven expertise in Spark, Kafka, and the Hadoop ecosystem
- Rock-solid algorithmic capabilities
- Production deployments of massively large-scale systems, real-time personalization, big data analytics and semantic search
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack
- Experience with Spark ML, TensorFlow (and TF Serving), MXNet, Scala, Python, NoSQL databases, Kubernetes, and Elasticsearch/Solr in production
A Tier-1 college (BE from IITs, BITS Pilani, IIITs, top NITs, DTU or NSIT, or MS from Stanford, UC, MIT, CMU, UW-Madison, ETH or other top global schools) or an exceptionally bright work history is a must. Let us know if this interests you enough to explore the profile further.

Job posted by Shobhit Agarwal
Apply for job

Database Architect

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 5 - 10 years
Salary (best in industry): 10 - 20 lacs/annum

The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization, to drive high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse that pools the data from the different regions and stores of GRAND in the GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong SQL; demonstrated experience with RDBMSes preferred (e.g. Postgres, MongoDB), along with Unix shell scripting
- Experience with Unix and comfort working in the shell (bash or Korn preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise, and others
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screening Hadoop cluster job performance and capacity planning
- Monitoring Hadoop cluster connectivity and security
- File system management and monitoring; HDFS support and maintenance
- Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Defining, developing, documenting and maintaining Hive-based ETL mappings and scripts
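The core of the ELT and data-quality responsibilities above is a stitch-measure-load loop over regional feeds. As a toy pure-Python sketch (the store fields and region names here are hypothetical, not GRAND's actual schema):

```python
def stitch_regions(*regional_rows):
    """Union rows from several regional feeds into one data set,
    deduplicating on a (region, store_id) key as a lake load would."""
    seen, merged = set(), []
    for rows in regional_rows:
        for row in rows:
            key = (row["region"], row["store_id"])
            if key not in seen:
                seen.add(key)
                merged.append(row)
    return merged

def quality_report(rows, required=("region", "store_id", "sales")):
    """Simple source-data-quality measurement: count rows missing any
    required field (a stand-in for real DQ enrichment rules)."""
    bad = sum(1 for r in rows if any(r.get(f) is None for f in required))
    return {"rows": len(rows), "incomplete": bad}

gcc = [{"region": "GCC", "store_id": 1, "sales": 120.0}]
eu = [{"region": "EU", "store_id": 1, "sales": 80.0},
      {"region": "EU", "store_id": 1, "sales": 80.0}]  # duplicate feed row
lake = stitch_regions(gcc, eu)
print(quality_report(lake))  # → {'rows': 2, 'incomplete': 0}
```

In production the same shape would typically be expressed as Hive or Spark SQL over HDFS partitions, with the quality counts reported to a monitoring dashboard.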

Job posted by Rahul Malani
Apply for job

Hadoop Administrator

Founded 2008
Location: Bengaluru (Bangalore)
Experience: 2 - 5 years
Salary (best in industry): 5 - 15 lacs/annum

Securonix is a security analytics product company. Our product provides real-time behavior analytics and uses the following Hadoop components: Kafka, Spark, Impala and HBase. We support very large deployments for our customers globally, with full access to the cluster. Cloudera certification is a big plus.

Job posted by Ramakrishna Murthy
Apply for job

HBase Architect Developer

Founded 2017
Location: Bengaluru (Bangalore)
Experience: 1 - 3 years
Salary (best in industry): 6 - 20 lacs/annum

www.aaknet.co.in/careers/careers-at-aaknet.html
You are extraordinary, a rock star who has hardly found a place to leverage or challenge your potential, and has not yet spotted a skyrocketing opportunity? Come play with us and face the challenges we can throw at you; chances are you might be humiliated (positively), but do not take it that seriously! Please be informed that we rate character and attitude as high as, if not higher than, your skills, experience and sharpness. :)
Best wishes and regards,
Team Aak!

Job posted by Debdas Sinha
Apply for job

Freelance Faculty

Founded 2009
Location: Anywhere, United States, Canada
Experience: 3 - 10 years
Salary (best in industry): 2 - 10 lacs/annum

To introduce myself, I head Global Faculty Acquisition for Simplilearn.

About my company: Simplilearn has transformed 500,000+ careers across 150+ countries with 400+ courses, and yes, we are a Registered Professional Education Provider offering PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com

If you're interested in teaching, interacting, sharing real-life experiences and have a passion for transforming careers, please join hands with us.

Onboarding Process:
- Send your updated CV to my email ID, with copies of the relevant certificates.
- Sample e-learning access will be shared as a 15-day trial after your registration on our website.
- My subject matter expert will evaluate your areas of expertise over a 15-20 minute telephonic conversation.
- Commercial discussion.
- We will register you to one of our ongoing online sessions to introduce you to our course content and the Simplilearn style of teaching.
- A demo will be conducted to check your training style and internet connectivity.
- Freelancer Master Service Agreement.

Payment Process:
- Once a workshop (or the last day of training for the batch) is completed, share your invoice.
- An automated tracking ID will be shared from our automated ticketing system.
- Our faculty group will verify the details provided and pass the invoice to our internal finance team for payment; if any additional information is required, we will coordinate with you.
- Payment will be processed within 15 working days, per policy, counted from the date the invoice is received.

Please share your updated CV to move to the next step of the onboarding process.

Job posted by STEVEN JOHN
Apply for job

Senior Software Engineer

via zeotap
Founded 2014
Location: Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary (best in industry): 5 - 40 lacs/annum

Check our JD: https://www.zeotap.com/job/senior-tech-lead-m-f-for-zeotap/oEQK2fw0

Job posted by Projjol Banerjea
Apply for job

Big Data Engineer

Founded 2007
Location: Bengaluru (Bangalore)
Experience: 3 - 10 years
Salary (best in industry): 16 - 35 lacs/annum

- Passion for building an analytics and personalisation platform at scale
- 4 to 9 years of software engineering experience with a product-based company in the data analytics/big data domain
- Passion for designing and developing from scratch
- Expert-level Java programming and experience leading the full lifecycle of application development
- Experience in Analytics, Hadoop, Pig, Hive, MapReduce, ElasticSearch and MongoDB is an additional advantage
- Strong communication skills, verbal and written

Job posted by Vijaya Kiran
Apply for job