
Backend Developer
Posted by Anuj Kumar Kodam

Locations

Bengaluru (Bangalore)

Experience

2 - 4 years

Salary

INR 13L - 18L

Skills

Elasticsearch
MongoDB
NoSQL Databases
Redis
Relational Database (RDBMS)

Job description

At Poker Yoga, we aim to make poker a tool for self-transformation by giving players the tools they need to improve their skill, a learning framework that brings skill to the core of their approach to the game, and experiences that enhance their perception. We are looking for passionate coders who love building products that speak for themselves. It's an invitation to join a family, not a company. We look forward to working with you!

About the company

Poker Yoga

Type

Product

Size

1-5 employees

Stage

Bootstrapped

Similar jobs

Senior Engineer - Hadoop

Founded: 2012
Location: Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: 25 - 50 lacs/annum

Senior Engineer - Development

Our Company
We help people around the world save money and live better -- anytime and anywhere -- in retail stores, online and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and e-commerce websites in 10 countries. With last fiscal revenues of approximately $486 billion, Walmart employs 2.2 million employees worldwide.

At Walmart Labs in Bangalore, we use technology to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.

Our Team
The Global Data and Analytics Platforms (GDAP) team at Walmart Labs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform and Data Science products that form part of the core platforms and services that drive Walmart's business. The group also develops analytical products for several verticals such as supply chain, pricing, customer and HR.

Our team, part of GDAP Bangalore, is responsible for creating the Customer Platform, a one-stop shop for all customer analytics for Walmart stores; a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions; and an Enterprise Analytics group that provides analytics for HR, Global Governance and Security. The team is responsible for time-critical, business-critical and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build and support.

Your Opportunity
As part of the Customer Analytics team at Walmart Labs, you'll have the opportunity to make a difference by being part of a development team that builds products at Walmart scale, the foundation of Customer Analytics across Walmart. A key attribute of this job is that you are required to continuously innovate and apply technology to provide a 360-degree business view of Walmart customers.

Your Responsibility
• Design, build, test and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale.
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
• Engage with Product Management and Business to drive the agenda, set your priorities and deliver awesome product features to keep the platform ahead of market scenarios.
• Identify the right open-source tools to deliver product features by performing research, POCs/pilots and/or interacting with various open-source forums.
• Develop and/or contribute features that enable customer analytics at Walmart scale.
• Deploy and monitor products on cloud platforms.
• Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs.

Our Ideal Candidate
You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical and decision-making skills, and excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.

Your Qualifications
• Bachelor's degree and 7+ years of experience, or Master's degree and 6+ years of experience, in Computer Science or a related field.
• Expertise in the Big Data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL etc.
• Expertise in MPP architecture and knowledge of an MPP engine (Spark, Impala etc.).
• Experience building scalable, highly available distributed systems in production.
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm.
• Experience with SOA.
• Knowledge of graph databases (Neo4j, Titan) is definitely a plus.
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems.

Job posted by Lakshman Dornala

Senior Engineer - Java/Hadoop

Founded: 2012
Location: Bengaluru (Bangalore)
Experience: 6 - 10 years
Salary: 20 - 50 lacs/annum

Our Company
We help people around the world save money and live better -- anytime and anywhere -- in retail stores, online and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and e-commerce websites in 10 countries. With last fiscal revenues of approximately $486 billion, Walmart employs 2.2 million employees worldwide.

At Walmart Labs in Bangalore, we use technology to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.

Our Team
The Global Data and Analytics Platforms (GDAP) team at Walmart Labs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform and Data Science products that form part of the core platforms and services that drive Walmart's business. The group also develops analytical products for several verticals such as supply chain, pricing, customer and HR.

Our team, part of GDAP Bangalore, is responsible for creating the Customer Platform, a one-stop shop for all customer analytics for Walmart stores; a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions; and an Enterprise Analytics group that provides analytics for HR, Global Governance and Security. The team is responsible for time-critical, business-critical and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build and support.

Your Opportunity
As part of the Customer Analytics team at Walmart Labs, you'll have the opportunity to make a difference by being part of a development team that builds products at Walmart scale, the foundation of Customer Analytics across Walmart. A key attribute of this job is that you are required to continuously innovate and apply technology to provide a 360-degree business view of Walmart customers.

Your Responsibility
• Design, build, test and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale.
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
• Engage with Product Management and Business to drive the agenda, set your priorities and deliver awesome product features to keep the platform ahead of market scenarios.
• Identify the right open-source tools to deliver product features by performing research, POCs/pilots and/or interacting with various open-source forums.
• Develop and/or contribute features that enable customer analytics at Walmart scale.
• Deploy and monitor products on cloud platforms.
• Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs.

Our Ideal Candidate
You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical and decision-making skills, and excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.

Your Qualifications
• Bachelor's degree and 7+ years of experience, or Master's degree and 6+ years of experience, in Computer Science or a related field.
• Expertise in the Big Data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL etc.
• Expertise in MPP architecture and knowledge of an MPP engine (Spark, Impala etc.).
• Experience building scalable, highly available distributed systems in production.
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm.
• Experience with SOA.
• Knowledge of graph databases (Neo4j, Titan) is definitely a plus.
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems.

Job posted by Lakshman Dornala

Staff Engineer - Java/Hadoop

Founded: 2012
Location: Bengaluru (Bangalore)
Experience: 8 - 14 years
Salary: 30 - 70 lacs/annum

Our Company
We help people around the world save money and live better -- anytime and anywhere -- in retail stores, online and through their mobile devices. Each week, more than 220 million customers and members visit our 11,096 stores under 69 banners in 27 countries and e-commerce websites in 10 countries. With last fiscal revenues of approximately $486 billion, Walmart employs 2.2 million employees worldwide.

At Walmart Labs in Bangalore, we use technology to build brand-new platforms and services on the latest technology stack to support both our stores and e-commerce businesses worldwide.

Our Team
The Global Data and Analytics Platforms (GDAP) team at Walmart Labs in Bangalore provides the Data Foundation Infrastructure, Visualization Portal, Machine Learning Platform, Customer Platform and Data Science products that form part of the core platforms and services that drive Walmart's business. The group also develops analytical products for several verticals such as supply chain, pricing, customer and HR.

Our team, part of GDAP Bangalore, is responsible for creating the Customer Platform, a one-stop shop for all customer analytics for Walmart stores; a Machine Learning Platform that provides end-to-end infrastructure for data scientists to build ML solutions; and an Enterprise Analytics group that provides analytics for HR, Global Governance and Security. The team is responsible for time-critical, business-critical and highly reliable systems that influence almost every part of the Walmart business. The team is spread over multiple locations, and the Bangalore centre owns critical end-to-end pieces that we design, build and support.

Your Opportunity
As part of the Customer Analytics team at Walmart Labs, you'll have the opportunity to make a difference by being part of a development team that builds products at Walmart scale, the foundation of Customer Analytics across Walmart. A key attribute of this job is that you are required to continuously innovate and apply technology to provide a 360-degree business view of Walmart customers.

Your Responsibility
• Design, build, test and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale.
• Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community.
• Engage with Product Management and Business to drive the agenda, set your priorities and deliver awesome product features to keep the platform ahead of market scenarios.
• Identify the right open-source tools to deliver product features by performing research, POCs/pilots and/or interacting with various open-source forums.
• Develop and/or contribute features that enable customer analytics at Walmart scale.
• Deploy and monitor products on cloud platforms.
• Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs.

Our Ideal Candidate
You have a deep interest in and passion for technology. You love writing and owning code and enjoy working with people who will keep challenging you at every stage. You have strong problem-solving, analytical and decision-making skills, and excellent communication and interpersonal skills. You are self-driven and motivated, with the desire to work in a fast-paced, results-driven agile environment with varied responsibilities.

Your Qualifications
• Bachelor's degree and 8+ years of experience, or Master's degree and 6+ years of experience, in Computer Science or a related field.
• Expertise in the Big Data ecosystem, with deep experience in Java, Hadoop, Spark, Storm, Cassandra, NoSQL etc.
• Expertise in MPP architecture and knowledge of an MPP engine (Spark, Impala etc.).
• Experience building scalable, highly available distributed systems in production.
• Understanding of stream processing, with expert knowledge of Kafka and either Spark Streaming or Storm.
• Experience with SOA.
• Knowledge of graph databases (Neo4j, Titan) is definitely a plus.
• Knowledge of software engineering best practices, with experience implementing CI/CD and log aggregation/monitoring/alerting for production systems.

Job posted by Lakshman Dornala

Lead Data Engineer (SDE III)

Founded: 2017
Location: Bengaluru (Bangalore)
Experience: 5 - 8 years
Salary: 25 - 55 lacs/annum

Couture.ai is building a patent-pending AI platform targeted at vertical-specific solutions. The platform is already licensed by Reliance Jio and a few European retailers to power real-time experiences for their combined >200 million end users.

For this role, a credible display of innovation in past projects is a must. We are looking for hands-on leaders in data engineering with 5-11 years of research or large-scale production implementation experience and:
- Proven expertise in Spark, Kafka and the Hadoop ecosystem.
- Rock-solid algorithmic capabilities.
- Production deployments of massively large-scale systems, real-time personalization, big data analytics and semantic search.
- Expertise in containerization (Docker, Kubernetes) and cloud infrastructure, preferably OpenStack.
- Experience with Spark ML, TensorFlow (and TF Serving), MXNet, Scala, Python, NoSQL DBs, Kubernetes and Elasticsearch/Solr in production.

A tier-1 college (BE from IITs, BITS Pilani, IIITs, top NITs, DTU or NSIT, or an MS from Stanford, UC, MIT, CMU, UW-Madison, ETH or other top global schools) or an exceptionally bright work history is a must.

Let us know if this interests you and you would like to explore the profile further.

Job posted by Shobhit Agarwal

Enthusiastic Cloud-ML Engineers with a keen sense of curiosity

Founded: 2012
Location: Bengaluru (Bangalore)
Experience: 3 - 12 years
Salary: 3 - 25 lacs/annum

We are a start-up in India seeking excellence in everything we do, with unwavering curiosity and enthusiasm. We build a simplified, new-age, AI-driven big data analytics platform for global enterprises and solve their biggest business challenges. Our engineers develop fresh, intuitive solutions keeping the user at the centre of everything.

As a Cloud-ML Engineer, you will design and implement ML solutions for customer use cases and solve complex technical customer challenges.

Expectations and Tasks
- 7+ years of total experience, with a minimum of 2 years in Hadoop technologies such as HDFS, Hive and MapReduce.
- Experience working with recommendation engines, data pipelines or distributed machine learning, and experience with data analytics and data visualization techniques and software.
- Experience with core data science techniques such as regression, classification or clustering, and experience with deep learning frameworks.
- Experience in NLP, R and Python.
- Experience in performance tuning and optimization techniques to process big data from heterogeneous sources.
- Ability to communicate clearly and concisely across technology and business teams.
- Excellent problem-solving and technical troubleshooting skills.
- Ability to handle multiple projects and prioritize tasks in a rapidly changing environment.

Technical Skills
Core Java, multithreading, collections, OOPS, Python, R, Apache Spark, MapReduce, Hive, HDFS, Hadoop, MongoDB, Scala

We are a retained search firm employed by our client, a technology start-up in Bangalore. Interested candidates can share their resumes with me at Jia@TalentSculpt.com. I will respond to you within 24 hours. Online assessments and pre-employment screening are part of the selection process.

Job posted by Blitzkrieg HR Consulting

Big Data Evangelist

Founded: 2016
Location: Noida
Experience: 2 - 6 years
Salary: 4 - 12 lacs/annum

Looking for a technically sound and excellent trainer on big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data technologies and get paid to learn.

Job posted by Suchit Majumdar

Python Developer

Founded: 2015
Location: Pune
Experience: 2 - 5 years
Salary: 5 - 10 lacs/annum

We are an early-stage startup working in the space of analytics, big data, machine learning, data visualization on multiple platforms, and SaaS. We have offices in Palo Alto and at WTC, Kharadi, Pune, and count some marquee names among our customers.

We are looking for a really good Python programmer who MUST have scientific programming experience (Python, etc.):
- Hands-on experience with NumPy and the Python scientific stack is a must.
- Demonstrated ability to track and work with hundreds to thousands of files and GBs to TBs of data.
- Exposure to ML and data mining algorithms.
- Comfortable working in a Unix environment and with SQL.

You will be required to do the following:
- Use command-line tools to perform data conversion and analysis.
- Support other team members in retrieving and archiving experimental results.
- Quickly write scripts to automate routine analysis tasks.
- Create insightful, simple graphics to represent complex trends.
- Explore, design and invent new tools and design patterns to solve complex big data problems.

Experience working on a long-term, lab-based project (academic experience acceptable).

Job posted by Nischal Vohra

Full Stack Developer

Founded: 2015
Location: Bengaluru (Bangalore)
Experience: 4 - 10 years
Salary: 6 - 12 lacs/annum

We are looking for a full-stack developer. You will build microservices with REST APIs and convert UX designs into high-quality UIs.

- Must have developed large, complex systems using Node.js, Angular, MongoDB and REST APIs.
- Must know Linux and Git.
- Must know software design.
- Must be a good communicator and able to work across a distributed team.

Job posted by Madhu Konety

Fullstack Developer

Location: Chennai
Experience: 1 - 3 years
Salary: 3 - 6 lacs/annum

At Daddyswallet, we're using today's technology to bring significant disruptive innovation to the financial industry. We focus on improving the lives of consumers by delivering simple, honest and transparent financial products. We are looking for a full-stack developer with skills mainly in React Native, React.js, Python and Node.js.

Job posted by Pruthiraj Rath

Big Data Developer

Founded: 2008
Location: Bengaluru (Bangalore)
Experience: 1 - 7 years

Develop analytic tools, working on big data and distributed systems.
- Provide technical leadership in developing our core analytics platform.
- Lead development efforts on product features using Scala/Java.
- Demonstrable excellence in innovation, problem solving, analytical skills, data structures and design patterns.
- Expert in building applications using Spark and Spark Streaming.
- Exposure to NoSQL (HBase/Cassandra), Hive, Pig Latin and Mahout.
- Extensive experience with Hadoop and machine learning algorithms.

Job posted by Katreddi Kiran Kumar