The job you are looking for does not exist. Check out similar jobs below.

Similar jobs

Hadoop Developer

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 4 - 7 years
Salary: 23 - 30 lacs/annum

Position Description
• Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans
• Models compliance with company policies and procedures and supports the company's mission, values, and standards of ethics and integrity
• Provides and supports the implementation of business solutions
• Provides support to the business
• Troubleshoots business and production issues and provides on-call support

Minimum Qualifications
• BS/MS in Computer Science or a related field
• 5+ years' experience building web applications
• Solid understanding of computer science principles
• Excellent soft skills
• Understanding of major algorithms such as searching and sorting
• Strong skills in writing clean code using Java and J2EE technologies
• Understanding of how to engineer RESTful services and microservices, and knowledge of major software patterns such as MVC, Singleton, Facade, and Business Delegate
• Deep knowledge of web technologies such as HTML5, CSS, and JSON
• Good understanding of continuous integration tools and frameworks such as Jenkins
• Experience working in Agile environments (Scrum, Kanban)
• Experience with performance tuning for very large-scale apps
• Experience with scripting in Perl, Python, and shell
• Experience writing jobs using open-source cluster computing frameworks such as Spark (see the sketch after this description)
• Relational database design experience with MySQL and Oracle; experience with SOLR and NoSQL stores such as Cassandra, MongoDB, and Hive
• Aptitude for writing clean, succinct, and efficient code
• Attitude to thrive in a fun, fast-paced, start-up-like environment
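As a rough illustration of the kind of Spark job this listing asks about, here is a minimal word-count sketch in Java. It is not taken from the posting: the app name, local master setting and HDFS paths are placeholder assumptions for illustration only.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class WordCountJob {
    public static void main(String[] args) {
        // Local master and hard-coded paths are placeholders for illustration only.
        SparkConf conf = new SparkConf().setAppName("word-count").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> lines = sc.textFile("hdfs:///tmp/input.txt");

            // Split each line into words, pair each word with 1, then sum the counts per word.
            JavaRDD<String> words = lines.flatMap(line -> Arrays.asList(line.split("\\s+")).iterator());
            JavaPairRDD<String, Integer> counts = words
                    .mapToPair(word -> new Tuple2<>(word, 1))
                    .reduceByKey(Integer::sum);

            counts.saveAsTextFile("hdfs:///tmp/word-counts");
        }
    }
}

A production job would of course differ in data sources and logic, but it follows the same shape: load an RDD, apply transformations, and write the result out.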

Job posted by Sampreetha Pai

Senior Engineer - Hadoop

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: 25 - 50 lacs/annum

Senior Engineer - Development

Our Company
We help people around the world save money and live better -- anytime and anywhere -- in retail stores, online and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and e-commerce websites in 10 countries. With last fiscal revenues of approximately $486 billion, Walmart has 2.2 million employees worldwide. At Walmart Labs in Bangalore, we use technology to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.

Our Team
The Global Data and Analytics Platforms (GDAP) team at Walmart Labs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform and Data Science products that form part of the core platforms and services driving the Walmart business. The group also develops analytical products for several verticals such as supply chain, pricing, customer and HR. Our team, part of GDAP Bangalore, is responsible for the Customer Platform (a one-stop shop for all customer analytics for Walmart stores), a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions, and an Enterprise Analytics group that provides analytics for HR, Global Governance and Security. The team is responsible for time-critical, business-critical and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build and support.

Your Opportunity
As part of the Customer Analytics Team at Walmart Labs, you'll have the opportunity to make a difference as part of the development team that builds products at Walmart scale, forming the foundation of customer analytics across Walmart. A key attribute of this job is that you are required to continuously innovate and apply technology to provide a 360-degree business view of Walmart customers.

Your Responsibility
• Design, build, test and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community
• Engage with Product Management and Business to drive the agenda, set your priorities and deliver awesome product features to keep the platform ahead of market scenarios
• Identify the right open-source tools to deliver product features by performing research, POCs/pilots and/or interacting with various open-source forums
• Develop and/or contribute features that enable customer analytics at Walmart scale
• Deploy and monitor products on cloud platforms
• Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs

Our Ideal Candidate
You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical and decision-making skills and excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.

Your Qualifications
• Bachelor's degree with 7+ years of experience, or Master's degree with 6+ years of experience, in Computer Science or a related field
• Expertise in the Big Data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL, etc.
• Expertise in MPP architecture and knowledge of an MPP engine (Spark, Impala, etc.)
• Experience building scalable, highly available distributed systems in production
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm (see the consumer sketch after this description)
• Experience with SOA
• Knowledge of graph databases such as Neo4j or Titan is definitely a plus
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems
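To make the stream-processing requirement above a little more concrete, here is a minimal Kafka consumer sketch in Java. It is illustrative only and not taken from the posting; the broker address, consumer group id and topic name are placeholder assumptions.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class EventConsumer {
    public static void main(String[] args) {
        // Broker address, group id and topic name are placeholders for illustration only.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "customer-analytics");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("customer-events"));
            while (true) {
                // Poll the broker and process each record as it arrives.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}

In practice, a Spark Streaming or Storm topology would typically aggregate these events (using its own Kafka integration) rather than simply printing them, but the consume-and-process loop is the same basic building block.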

Job posted by Lakshman Dornala

Hadoop Lead Engineers

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 7 - 9 years
Salary: 27 - 34 lacs/annum

Position Description
• Assists in providing guidance to small groups of two to three engineers, including offshore associates, for assigned engineering projects
• Demonstrates up-to-date expertise in software engineering and applies it to the development, execution, and improvement of action plans
• Generates weekly, monthly and yearly reports using JIRA and open-source tools and provides updates to leadership teams
• Proactively identifies issues and the root causes of critical issues
• Works with cross-functional teams, sets up KT sessions and mentors team members
• Coordinates with the Sunnyvale and Bentonville teams
• Models compliance with company policies and procedures and supports the company's mission, values, and standards of ethics and integrity
• Provides and supports the implementation of business solutions
• Provides support to the business
• Troubleshoots business and production issues and provides on-call support

Minimum Qualifications
• BS/MS in Computer Science or a related field
• 8+ years' experience building web applications
• Solid understanding of computer science principles
• Excellent soft skills
• Understanding of major algorithms such as searching and sorting
• Strong skills in writing clean code using Java and J2EE technologies
• Understanding of how to engineer RESTful services and microservices, and knowledge of major software patterns such as MVC, Singleton, Facade, and Business Delegate (a small Singleton sketch follows this description)
• Deep knowledge of web technologies such as HTML5, CSS, and JSON
• Good understanding of continuous integration tools and frameworks such as Jenkins
• Experience working in Agile environments (Scrum, Kanban)
• Experience with performance tuning for very large-scale apps
• Experience with scripting in Perl, Python, and shell
• Experience writing jobs using open-source cluster computing frameworks such as Spark
• Relational database design experience with MySQL and Oracle; experience with SOLR and NoSQL stores such as Cassandra, MongoDB, and Hive
• Aptitude for writing clean, succinct, and efficient code
• Attitude to thrive in a fun, fast-paced, start-up-like environment
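As a small, self-contained example of one of the design patterns named in the qualifications, here is a thread-safe Singleton sketch in Java using the initialization-on-demand holder idiom; the class name ConfigRegistry is a hypothetical placeholder, not something from the posting.

public final class ConfigRegistry {

    // Private constructor prevents outside instantiation.
    private ConfigRegistry() { }

    // The single instance lives in a nested holder class, which the JVM
    // loads (and initializes, thread-safely) only on first use.
    private static final class Holder {
        private static final ConfigRegistry INSTANCE = new ConfigRegistry();
    }

    public static ConfigRegistry getInstance() {
        return Holder.INSTANCE;
    }
}

Because the holder class is not loaded until getInstance() is first called, the instance is created lazily, and the JVM's class-loading guarantees make the initialization thread-safe without explicit locking.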

Job posted by Sampreetha Pai

Bigdata

Founded 2017
Products and services
Location: Hyderabad
Experience: 0 - 1 years
Salary: 1 - 1 lacs/annum

Looking for candidates with skills in Big Data, business intelligence, Python, and R.

Job posted by Jasmine Shaik

Principal Member Technical Staff (SDE3)

Founded 2005
Products and services
Location: Bengaluru (Bangalore)
Experience: 8 - 15 years
Salary: 25 - 50 lacs/annum

About [24]7 Innovation Labs
Data is changing human lives at the core - we collect so much data about everything, use it to learn many things, and apply the learnings in all aspects of our lives. [24]7 is at the forefront of applying data and machine learning to the world of customer acquisition and customer engagement. Our customer acquisition cloud uses the best of ML and AI to reach the right audiences, and our engagement cloud powers the interactions for the best experience. We serve Fortune 100 enterprises globally and hundreds of millions of their customers every year, enabling 1.5B customer interactions annually. We work on several challenging problems in data processing and machine learning and use artificial intelligence to power Smart Agents. How do you process millions of events in a stream to derive intelligence? How do you learn from troves of data by applying scalable machine learning algorithms? How do you combine the learnings with real-time streams to make decisions in under 300 msec at scale? We work with the best of open-source technologies - Akka, Scala, Undertow, Spark, Spark ML, Hadoop, Cassandra, Mongo. Platform scale and real time are in our DNA, and we work hard every day to change the game in customer engagement. We believe in empowering smart people to work on larger-than-life problems with the best technologies and like-minded people. We are a pre-IPO Silicon Valley-based company with many global brands as our customers - Hilton, eBay, Time Warner Cable, Best Buy, Target, American Express, Capital One and United Airlines. We touch more than 300M visitors online every month with our technologies. We have one of the best work environments in Bangalore.

Opportunity
Principal Member of Technical Staff is one of our distinguished individual contributor roles, for someone who can take on problems of size and scale. You will be responsible for working with a team of smart and highly capable engineers to design a solution and work closely on the implementation, testing, deployment and 24x7 runtime operation with 99.99% uptime. You will have to demonstrate your technical mettle and influence and inspire the engineers to build things right. You will be working on problems in one or more of these areas:
• Data Collection: a horizontally scalable platform to collect billions of events from around the world in as little as 50 msec
• Intelligent Campaign Engines: make real-time decisions using the events on the best experience to display in as little as 200 msec
• Real-time Stream Computation: compute thousands of metrics on the incoming billions of events to make them available for decisioning and analytics
• Data Pipeline: a scalable data transport layer using Apache Kafka running across hundreds of servers and transporting billions of events in real time (a minimal producer sketch follows this description)
• Data Analysis: distributed OLAP engines on Hadoop or Spark to provide real-time analytics on the data
• Large-scale Machine Learning: supervised and unsupervised learning on Hadoop and Spark using the best of open-source frameworks
In this role, you will present your work at meetup events and conferences worldwide and contribute to open source. You will help attract the right talent and groom the engineers to be the best.

Must Have

Engineering
• Strong foundation in computer science - through education and/or experience - data structures, algorithms, design thinking, optimization
• Should be an outstanding technical contributor whose accomplishments include building products and platforms of scale
• Outstanding technical acumen and a deep understanding of the problems of distributed systems and scale, with a strong orientation towards open source
• Experience building platforms that have 99.99% uptime requirements and operate at scale
• Experience working in a fast-paced environment with attention to detail and incremental delivery through automation
• Loves to code more than to talk
• 10+ years of experience building software systems, or the ability to demonstrate such maturity without the years under the belt

Behavioral
• Loads of energy and a can-do attitude to take BIG problems by the horns and solve them
• Entrepreneurial spirit to conceive ideas, turn challenges into opportunities and build products
• Ability to inspire other engineers to do the unimagined and go beyond their comfort lines
• Be a role model for upcoming engineers in the organization, especially new college grads

Technology background
• Strong preference for experience with open-source technologies: working with various Java application servers or Scala
• Experience deploying web applications and services that run across thousands of servers globally with very low latency and high uptime
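To make the Data Pipeline item above slightly more concrete, here is a minimal sketch of publishing an event to Apache Kafka from Java. It is illustrative only; the broker address, topic name, record key and JSON payload are placeholder assumptions, not details from the posting.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class EventPublisher {
    public static void main(String[] args) {
        // Broker address and topic name are placeholders for illustration only.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keyed by a (hypothetical) visitor id so all events for one visitor land on the same partition.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("page-view-events", "visitor-42", "{\"page\":\"/home\",\"ts\":1710000000}");
            producer.send(record);
            producer.flush();
        }
    }
}

Keying each record keeps all events for the same key on the same partition, which preserves per-key ordering and makes downstream per-visitor stream computation straightforward.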

Job posted by Achappa Bheemaiah

Senior Engineer - Java/Hadoop

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: 20 - 50 lacs/annum

This role is on the same Walmart Labs GDAP / Customer Analytics team as the Senior Engineer - Hadoop listing above; the Our Company, Our Team, Your Opportunity, Your Responsibility, Our Ideal Candidate and Your Qualifications sections are identical to that listing (Bachelor's degree with 7+ years of experience, or Master's degree with 6+ years, in Computer Science or a related field, plus the same Big Data, Kafka/stream-processing, SOA and CI/CD expectations).

Job posted by Lakshman Dornala

Engineering Manager
at Uber

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 9 - 15 years
Salary: 50 - 80 lacs/annum

• Minimum 5+ years of experience as a manager and 10+ years of overall industry experience in a variety of contexts, during which you've built scalable, robust, and fault-tolerant systems.
• Solid knowledge of the whole web stack: front-end, back-end, databases, cache layer, HTTP protocol, TCP/IP, Linux, CPU architecture, etc. You are comfortable jamming on complex architecture and design principles with senior engineers.
• Bias for action. You believe that speed and quality aren't mutually exclusive. You've shown good judgement about shipping as fast as possible while still making sure that products are built in a sustainable, responsible way.
• Mentorship and guidance. You know that the most important part of your job is setting the team up for success. Through mentoring, teaching, and reviewing, you help other engineers make sound architectural decisions, improve their code quality, and get out of their comfort zone.
• Commitment. You care tremendously about keeping the Uber experience consistent for users and strive to make any issues invisible to riders. You hold yourself personally accountable, jumping in and taking ownership of problems that might not even be in your team's scope.
• Hiring know-how. You're a thoughtful interviewer who constantly raises the bar for excellence. You believe that what seems amazing one day becomes the norm the next day, and that each new hire should significantly improve the team.
• Design and business vision. You help your team understand requirements beyond the written word, and you thrive in an environment where you can uncover subtle details. Even in the absence of a PM or a designer, you show great attention to the design and product aspects of anything your team ships.

Job posted by Swati Singh

Data Science Engineer (SDE I)

Founded 2017
Products and services
Location: Bengaluru (Bangalore)
Experience: 1 - 3 years
Salary: 12 - 20 lacs/annum

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined more than 200 million end users. For this role, a credible display of innovation in past projects (or academia) is a must. We are looking for a candidate who lives and talks data and algorithms, loves to play with Big Data engineering, and is hands-on with Apache Spark, Kafka, RDBMS/NoSQL databases, Big Data analytics, and handling Unix and production servers. A Tier-1 college background (BE from IITs, BITS Pilani, top NITs or IIITs, or an MS from Stanford, Berkeley, CMU or UW–Madison) or an exceptionally strong work history is a must. Let us know if this interests you and you would like to explore the profile further.

Job posted by Shobhit Agarwal

Staff Engineer - Java/Hadoop

Founded 2012
Products and services
Location: Bengaluru (Bangalore)
Experience: 8 - 14 years
Salary: 30 - 70 lacs/annum

This role is on the same Walmart Labs GDAP / Customer Analytics team as the Senior Engineer - Hadoop listing above, with an identical description and qualifications except that it asks for a Bachelor's degree with 8+ years of experience (or a Master's degree with 6+ years) in Computer Science or a related field.

Job posted by Lakshman Dornala

Big Data Engineer

Founded 2015
Products and services
Location: Noida, NCR (Delhi | Gurgaon | Noida)
Experience: 3 - 7 years
Salary: 5 - 12 lacs/annum

Together we will create wonderful solutions that deliver value for us and our customers.

Job posted by Sneha Pandey