



Location: Bangalore/Pune/Hyderabad/Nagpur
4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume, and/or Mahout
- Experience with NoSQL databases – HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in build, configuration management, and ticketing systems such as Maven, Ant, JIRA, etc.
- Knowledge of any Data Integration and/or EDW tools is a plus
- Good to have: knowledge of Python/Perl/Shell scripting
Please note: HBase, Hive, and Spark are a must (see the brief sketch below).
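To illustrate the mandatory Hive and Spark skills, here is a minimal, hypothetical sketch (not part of the posting) of a Spark job querying a Hive table via the Java API; the table name "events" and the metastore setup are assumptions for illustration only.

```java
// Minimal sketch, assuming a configured Hive metastore and a hypothetical "events" table.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HiveEventCounts {
    public static void main(String[] args) {
        // SparkSession with Hive support; relies on hive-site.xml being visible to Spark.
        SparkSession spark = SparkSession.builder()
                .appName("hive-event-counts")
                .enableHiveSupport()
                .getOrCreate();

        // "events" is a placeholder Hive table used purely for illustration.
        Dataset<Row> counts = spark.sql(
                "SELECT event_type, COUNT(*) AS cnt FROM events GROUP BY event_type");
        counts.show();

        spark.stop();
    }
}
```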

Similar jobs
Position: Software Engineer (Java Backend Engineer)
Experience: 4+ Years
📍 Location: Bangalore, India (Hybrid)
Mandatory Skills: Java 8+ (Advanced Features), Spring Boot, Apache Spark (Spark Streaming), SQL & Cosmos DB, Git, Maven, CI/CD (Jenkins, GitHub), Azure Cloud, Agile Scrum.
About the Role:
We are seeking a highly skilled Backend Engineer with expertise in Java, Spark, and microservices architecture to join our dynamic team. The ideal candidate will have a strong background in object-oriented programming, experience with Spark Streaming, and a deep understanding of distributed systems and cloud technologies.
Key Responsibilities:
- Design, develop, and maintain highly scalable microservices and optimized RESTful APIs using Spring Boot and Java 8+.
- Implement and optimize Spark Streaming applications for real-time data processing.
- Utilize advanced Java 8 features (see the sketch after this list), including:
- Functional interfaces & Lambda expressions
- Streams and Parallel Streams
- CompletableFuture & Concurrency API improvements
- Enhanced Collections APIs
- Work with relational (SQL) and NoSQL (Cosmos DB) databases, ensuring efficient data modeling and retrieval.
- Develop and manage CI/CD pipelines using Jenkins, GitHub, and related automation tools.
- Collaborate with cross-functional teams, including Product, Business, and Automation, to deliver end-to-end product features.
- Ensure adherence to Agile Scrum practices and participate in code reviews to maintain high-quality standards.
- Deploy and manage applications in Azure Cloud environments.
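As a rough, hypothetical illustration of the Java 8 features listed above (functional interfaces and lambdas, streams, CompletableFuture), consider the sketch below; the Order class and its values are invented for the example and are not part of any real codebase.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class Java8FeaturesSketch {
    // Hypothetical data holder used only for this example.
    static class Order {
        final String region;
        final double amount;
        Order(String region, double amount) { this.region = region; this.amount = amount; }
    }

    public static void main(String[] args) {
        List<Order> orders = Arrays.asList(
                new Order("APAC", 120.0), new Order("EMEA", 75.5), new Order("APAC", 30.0));

        // Functional interface + lambda expression.
        Predicate<Order> isLarge = o -> o.amount > 50.0;

        // Streams API (parallelStream() would work the same way for CPU-bound aggregation).
        Map<String, Double> totalByRegion = orders.stream()
                .filter(isLarge)
                .collect(Collectors.groupingBy(o -> o.region,
                        Collectors.summingDouble(o -> o.amount)));

        // CompletableFuture for simple asynchronous composition.
        CompletableFuture<String> summary = CompletableFuture
                .supplyAsync(totalByRegion::toString)
                .thenApply(s -> "Totals: " + s);

        System.out.println(summary.join());
    }
}
```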
Minimum Qualifications:
- BS/MS in Computer Science or a related field.
- 4+ Years of experience developing backend applications with Spring Boot and Java 8+.
- 3+ Years of hands-on experience with Git for version control.
- Strong understanding of software design patterns and distributed computing principles.
- Experience with Maven for building and deploying artifacts.
- Proven ability to work in Agile Scrum environments with a collaborative team mindset.
- Prior experience with Azure Cloud Technologies.
Location: Pune
Required Skills: Scala, Python, Data Engineering, AWS, Cassandra/AstraDB, Athena, EMR, Spark/Snowflake
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner, and will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE), and the UK and is expected to grow rapidly. The incumbent will need leadership qualities to also mentor junior and mid-level software associates on the team. This person will lead the Data Platform architecture – streaming and bulk – with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture (see the sketch after this list)
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite
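As a rough illustration of the event-driven, Kafka-based pieces of this environment, here is a minimal, hypothetical consumer loop using the plain kafka-clients Java API; the broker address, topic name, and group id are placeholders, not details from the posting.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-events-sketch");      // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("order-events")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real service this would hand off to domain logic or a downstream store.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```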


We have an urgent requirement for a Data Engineer/Sr. Data Engineer at a reputed MNC.
Exp: 4-9 yrs
Location: Pune/Bangalore/Hyderabad
Skills: We need candidates with either Python + AWS, PySpark + AWS, or Spark + Scala
- 3+ years of SDE work experience at product-based companies
- Experience in Java, Spring Boot, MySQL, Kafka, HBase, AWS
- Experience in multithreading, distributed systems, coding best practices, and scaling
About Vymo
Vymo is a San Francisco-based, next-generation sales productivity SaaS company with offices in 7 locations. Vymo is funded by top-tier VC firms like Emergence Capital and Sequoia Capital. Vymo is a category creator, an intelligent Personal Sales Assistant that captures sales activities automatically, learns from top performers, and predicts ‘next best actions’ contextually. Vymo has 100,000 users in 60+ large enterprises such as AXA, Allianz, and Generali. Vymo has seen 3x annual growth over the last few years and aspires to do even better this year by building up the team globally.
What is the Personal Sales Assistant
A game-changer! We thrive in the CRM space where every company is struggling to deliver meaningful engagement to their Sales teams and IT systems. Vymo was engineered with a mobile-first philosophy. The platform through AI/ML detects, predicts, and learns how to make Sales Representatives more productive through nudges and suggestions on a mobile device. Explore Vymo: https://getvymo.com/
What you will do at Vymo
From young open source enthusiasts to experienced Googlers, this team develops products like Lead Management System, Intelligent Allocations & Route Mapping, and Intelligent Interventions that help improve the effectiveness of sales teams manifold. These products power the "Personal Assistant" app, which automates sales force activities, leveraging our cutting-edge location-based technology and intelligent routing algorithms.
A Day in your Life
- Design, develop, and maintain robust data platforms on top of Kafka, Spark, ES, etc. (see the sketch after this list)
- Provide leadership to a group of engineers in an innovative and fast-paced environment.
- Manage and drive complex technical projects from the planning stage through execution.
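For a flavor of what "data platforms on top of Kafka, Spark, ES" can look like in code, here is a minimal, hypothetical Spark Structured Streaming sketch in Java that reads a Kafka topic and writes to the console; the broker, topic, and sink are placeholders and would differ in any real Vymo pipeline.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToConsoleStream {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-console-sketch")
                .getOrCreate();

        // Read a stream from a placeholder Kafka topic (requires the spark-sql-kafka connector).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
                .option("subscribe", "activity-events")              // placeholder topic
                .load()
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

        // Write decoded records to the console; a real pipeline would sink to ES, a DB, etc.
        StreamingQuery query = events.writeStream()
                .format("console")
                .outputMode("append")
                .start();

        query.awaitTermination();
    }
}
```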
What you would have done
- B.E. (or equivalent) in Computer Science
- 6-9 years of experience building enterprise-class products/platforms.
- Knowledge of big data systems and/or data pipeline building experience is preferred.
- 2-3 years of relevant experience as a technical lead or in technical management.
- Excellent coding skills in Core Java or NodeJS
- Demonstrated problem solving skills in previous roles.
- Good communication skills.



