Our Company

We help people around the world save money and live better -- anytime and anywhere -- in retail stores, online and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and our e-commerce websites in 10 countries. With revenues of approximately $486 billion in the last fiscal year, Walmart employs 2.2 million people worldwide. At Walmart Labs in Bangalore, our charter is to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.

Our Team

The Global Data and Analytics Platforms (GDAP) team at Walmart Labs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform and Data Science products that form part of the core platforms and services driving Walmart's business. The group also develops analytical products for several verticals such as supply chain, pricing, customer and HR. Our team, part of GDAP Bangalore, is responsible for the Customer Platform, a one-stop shop for all customer analytics for Walmart stores; a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions; and an Enterprise Analytics group that provides analytics for HR, Global Governance and Security. The team owns time-critical, business-critical and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build and support.

Your Opportunity

As part of the Customer Analytics team at Walmart Labs, you'll have the opportunity to make a difference by being part of the development team that builds products at Walmart scale and forms the foundation of customer analytics across Walmart. A key attribute of this job is that you are required to continuously innovate and apply technology to give the business a 360-degree view of Walmart customers.

Your Responsibility

• Design, build, test and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community
• Engage with Product Management and the business to drive the agenda, set your priorities and deliver awesome product features that keep the platform ahead of the market
• Identify the right open-source tools to deliver product features by performing research, POCs/pilots and/or interacting with various open-source forums
• Develop and/or contribute features that enable customer analytics at Walmart scale
• Deploy and monitor products on cloud platforms
• Develop and implement best-in-class monitoring processes that enable data applications to meet SLAs

Our Ideal Candidate

You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical and decision-making abilities, along with excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.

Your Qualifications

• Bachelor's degree and 7+ years of experience, or Master's degree and 6+ years of experience, in Computer Science or a related field
• Expertise in the big data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL stores, etc.
• Expertise in MPP architecture and knowledge of MPP engines (Spark, Impala, etc.)
• Experience building scalable, highly available distributed systems in production
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm (a minimal sketch follows this list)
• Experience with SOA
• Knowledge of graph databases such as Neo4j or Titan is definitely a plus
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems
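To make the stream-processing expectation above concrete, here is a minimal PySpark Structured Streaming sketch that consumes events from Kafka. The topic name, broker address, checkpoint path and console sink are illustrative assumptions, not details of any Walmart system, and the job assumes the spark-sql-kafka connector package is available to Spark.

```python
from pyspark.sql import SparkSession

# Hypothetical app name; requires the spark-sql-kafka connector package.
spark = SparkSession.builder.appName("customer-events-stream").getOrCreate()

# Read a stream of events from a Kafka topic (broker and topic are made up).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "customer-events")
    .load()
)

# Kafka delivers the value column as bytes; cast it to a string
# before any downstream parsing or aggregation.
parsed = events.selectExpr("CAST(value AS STRING) AS event")

# Console sink for illustration only; a production job would write to
# Cassandra, HDFS or another sink, with checkpointing for fault tolerance.
query = (
    parsed.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/customer-events")
    .start()
)
query.awaitTermination()
```

The same read/transform/write pattern extends to joining the stream with reference data or writing aggregates to one of the stores named in the qualifications.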
www.aaknet.co.in/careers/careers-at-aaknet.html
Are you extraordinary, a rock star who has hardly found a place to leverage or challenge your potential and hasn't spotted a skyrocketing opportunity yet? Come play with us and face the challenges we can throw at you; chances are you might be humbled (positively), but do not take it too seriously! Please be informed that we rate character and attitude as highly as, if not higher than, your great skills, experience and sharpness. :)
Best wishes & regards,
Team Aak!
Brief About the Company

EdGE Networks Pvt. Ltd. is an innovative HR technology solutions provider focused on helping organizations meet their talent-related challenges. With our expertise in Artificial Intelligence, Semantic Analysis, Data Science, Machine Learning and Predictive Modelling, we enable HR organizations to lead with data and intelligence. Our solutions significantly improve workforce availability, billing and allocation, and drive straight bottom-line impact. For more details, please log on to www.edgenetworks.in and www.hirealchemy.com

Do apply if you meet most of the following requirements:
• Very strong Python, Java or Scala experience, especially in open-source, data-intensive, distributed environments
• Work experience with libraries like scikit-learn, NumPy, SciPy and Cython (a toy sketch follows this list)
• Expertise in Spark, MapReduce, Pig, Hive, Kafka, Storm, etc., including performance tuning
• Experience implementing complex projects dealing with considerable data sizes and high complexity
• Good understanding of algorithms, data structures and performance optimization techniques
• Excellent problem solver, analytical thinker and quick learner
• Search capabilities such as Elasticsearch, with experience in MongoDB
• Excellent written and verbal communication skills

Nice to have:
• Experience writing Spark and/or MapReduce v2 jobs
• Ability to translate requirements and/or specifications into code that is relatively bug-free
• Writing unit and integration tests
• Knowledge of C++
• Knowledge of Theano, TensorFlow, Caffe, Torch, etc.
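As a small illustration of the scikit-learn/NumPy stack named above, here is a toy sketch of a TF-IDF plus logistic-regression text classifier. The snippets, labels and role categories are made up for the example and do not reflect any EdGE Networks data or product.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up skill snippets labelled by role; purely illustrative data.
texts = [
    "distributed spark kafka streaming pipelines",
    "hive pig mapreduce batch etl and performance tuning",
    "scikit-learn numpy predictive modelling experiments",
    "tensorflow deep learning model training",
]
labels = np.array([0, 0, 1, 1])  # 0 = data engineering, 1 = data science

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score an unseen snippet; the output is one predicted label per input string.
print(model.predict(["semantic analysis with scipy and scikit-learn"]))
```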
• Good understanding of OOP, with strong coding skills in Java
• Good database knowledge, including SQL queries
• Should have worked on highly distributed systems
• Strong in data structures and algorithms, with good analytical and problem-solving skills
• Hands-on experience with GNU/Linux (Unix environments)
• Able to work with minimal supervision and in a team environment