Apache Kafka Jobs in Bangalore (Bengaluru)


Apply to 50+ Apache Kafka Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Apache Kafka Job opportunities across top companies like Google, Amazon & Adobe.

Radisys India

at Radisys India

1 recruiter
Sagar bh
Posted by Sagar bh
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹35L / yr
Java
J2EE
Spring Boot
Microservices
Apache Kafka
+3 more

Radisys Corporation is looking for Java backend developers with 6-10 years of experience for its Bangalore location.


The ideal candidate will be able to design and develop code for tasks after brainstorming sessions and applying best practices and coding conventions.


This position requires experience in Java, Spring, Spring Boot, microservices, message broker, and DB knowledge. Candidates should be skilled in developing enterprise applications that consist of FE, BE, and DB integration.


If you have experience with Docker and Kubernetes, that's an added advantage.

Read more
Radisys India

at Radisys India

1 recruiter
Sai Kiran
Posted by Sai Kiran
Bengaluru (Bangalore)
5 - 10 yrs
₹5L - ₹25L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
MongoDB
+4 more

Radisys Corporation, a global leader in open telecom solutions, enables service providers to drive disruption with new open architecture business models. Our innovative technology solutions leverage open reference architectures and standards, combined with open software and hardware, to power business transformation for the telecom industry. Our services organization delivers systems integration expertise necessary to solve complex deployment challenges for communications and content providers.


Job Overview :


We are looking for a Lead Engineer - Java with a strong background in Java development and hands-on experience with J2EE, Spring Boot, Kubernetes, Microservices, NoSQL, and SQL. As a Lead Engineer, you will be responsible for designing and developing high-quality software solutions and ensuring the successful delivery of projects. This is a role requiring 7 to 10 years of experience, based in Bangalore, Karnataka, India, and is a full-time position with excellent growth opportunities.


Qualifications and Skills :


- Bachelor's or master's degree in Computer Science or a related field


- Strong knowledge of Core Java, J2EE, and Spring Boot frameworks


- Hands-on experience with Kubernetes and microservices architecture


- Experience with NoSQL and SQL databases


- Proficient in troubleshooting and debugging complex system issues


- Experience in Enterprise Applications


- Excellent communication and leadership skills


- Ability to work in a fast-paced and collaborative environment


- Strong problem-solving and analytical skills


Roles and Responsibilities :


- Work closely with product management and cross-functional teams to define requirements and deliverables


- Design scalable and high-performance applications using Java, J2EE, and Spring Boot


- Develop and maintain microservices using Kubernetes and containerization


- Design and implement data models using NoSQL and SQL databases


- Ensure the quality and performance of software through code reviews and testing


- Collaborate with stakeholders to identify and resolve technical issues


- Stay up-to-date with the latest industry trends and technologies


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Pune, Mumbai, Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Messaging
+5 more

WISSEN TECHNOLOGY is Hiring!!!!!


Java Developer with messaging experience (JMS/EMS/Kafka/RabbitMQ) And CI/CD.


Experience: 3 to 6 yrs

Location: Pune | Mumbai | Bangalore

Notice period: Serving notice or less than 15 days only.

Requirement:

Core Java 8.0

Mandatory experience in any of the messaging technologies like JMS/EMS/Kafka/RabbitMQ

Extensive experience in developing enterprise-scale n-tier applications for the financial domain.

Should possess good architectural knowledge and be aware of enterprise application design patterns. 

Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.

Mandatory development experience on CI/CD platform.

Good knowledge of multi-threading and high-volume server side development

Experience in sales and trading platforms in investment banking/capital markets

Basic working knowledge of Unix/Linux. 

Strong written and oral communication skills. Should have the ability to express their design ideas and thoughts.

Read more
Egen Solutions
Anshul Saxena
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin
+3 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.
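To make the stack described above concrete, here is a minimal, illustrative sketch of a Spring Boot service consuming from Kafka, the kind of component such a team might build. This is a sketch only: it assumes the spring-kafka dependency and broker connection properties are provided by the environment, and the class, topic, and consumer-group names are hypothetical rather than taken from any actual codebase.

```java
// Illustrative sketch only: minimal Spring Boot + Kafka consumer.
// Assumes spring-kafka on the classpath and bootstrap/deserializer
// settings supplied via application properties.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class OrderEventsApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderEventsApplication.class, args);
    }
}

@Component
class OrderEventsListener {
    // Topic and group id are hypothetical placeholders.
    @KafkaListener(topics = "orders", groupId = "order-events-service")
    public void onMessage(String payload) {
        // A real service would deserialize the payload and hand it to business logic.
        System.out.println("Received event: " + payload);
    }
}
```

Packaged in a Docker image and deployed to Kubernetes, a service of this shape fits the cloud-native microservice pattern the role describes.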


Required Experience:

  • Minimum of Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
  • Strong experience with unit and integration testing of the Spring Boot APIs.
  • Strong understanding and production experience of RESTful APIs and microservice architecture.
  • Strong understanding of SQL databases and NoSQL databases and experience with writing abstraction layers to communicate with the databases.

Nice to haves (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


Read more
Arroz Technology

at Arroz Technology

2 candid answers
Amogh Saxena
Posted by Amogh Saxena
Bengaluru (Bangalore)
1 - 3 yrs
₹1L - ₹5L / yr
AngularJS (1.x)
Angular (2+)
React.js
NodeJS (Node.js)
MongoDB
+5 more

Job Description: Full Stack Developer

Company: Arroz Technology Private Limited

CTC: 5 LPA

Location : Bangalore (Onsite)

Responsibilities:

- Design and develop scalable and high-performance web applications using the MERN (MongoDB, Express.js, React.js, Node.js) stack.

- Collaborate with cross-functional teams to gather requirements and translate them into high-level designs.

- Write clean, reusable, and well-structured code following industry best practices and coding standards.

- Mentor and guide junior developers, providing technical expertise and promoting professional growth.

- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to standards.

- Collaborate with frontend and backend developers to integrate components and ensure smooth data flow.

- Work with UI/UX designers to implement responsive and user-friendly interfaces.

- Stay updated with the latest trends and advancements in full-stack development technologies.

- Work in a 10 AM to 6 PM, six-day office role, maintaining regular attendance and punctuality.

Required Skills and Qualifications:

-Strong proficiency in MERN (MongoDB, Express.js, React.js, Node.js) stack development.

-Experience with Redux or similar state management libraries.

-Solid understanding of front-end technologies such as HTML, CSS, and JavaScript.

-Proficiency in RESTful API development and integration.

-Familiarity with version control systems like Git and agile development methodologies.

-Good problem-solving and debugging skills.

-Excellent communication and teamwork abilities.

-Bachelor's degree in Computer Science or a related field (preferred).

Join Arroz Technology Private Limited as a Full Stack Developer and contribute to the development of cutting-edge web applications. This role offers competitive compensation and growth opportunities within a dynamic work environment. 

Read more
Prolifics Corporation Ltd.,

at Prolifics Corporation Ltd.,

1 video
1 recruiter
Sandhya Patel
Posted by Sandhya Patel
Bengaluru (Bangalore)
5 - 9 yrs
Best in industry
Java
Spring Boot
Apache Kafka
Azure

Job Description:


Organization - Prolifics Corporation


Skill - Java developer


Job type - Full time/Permanent


Location - Bangalore/Mumbai


Experience - 5 to 10 Years


Notice Period – Immediate to 30 Days



Required Skillset:


Spring framework concepts, Spring Boot (Mandatory)

Spring Batch and dashboard

Apache Kafka (Mandatory)

Azure (Mandatory)

Git / Maven / Gradle / CI/CD

MS SQL database

Cloud and Data Exposure


Docker, Orchestration using Kubernetes

Genesys PureCloud or any cloud-based contact center platform that can be used to manage customer interactions.

Technical Experience:


The candidate should have 5+ years of experience, preferably at a technology or financial firm.

Must have at least 2-3 years of experience in Spring Batch / Java / Kafka / SQL

Must have hands-on experience in database tools and technologies.

Must have exposure to CI / CD and Cloud.

Work scope


Build the Spring Batch framework to pull the required data from Genesys Cloud to MS reporting data storage – on-prem / cloud (see the illustrative sketch after this list).

Build MS WM Contact Center Data Hub (on Prem / Cloud)

Build a dashboard to monitor and manage the data ingestion and fusion jobs.

Event bridge implementation for real-time data ingestion and monitoring

MS Private Cloud
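As a rough illustration of the first work-scope item above, the skeleton below shows what a Spring Batch job definition can look like. It is a hedged sketch only, assuming Spring Batch 4.x: the Genesys Cloud extraction and the write to the reporting data store are collapsed into a placeholder tasklet, and every name (job, step, class) is hypothetical rather than part of the actual project.

```java
// Illustrative Spring Batch 4.x skeleton; not the actual implementation.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class GenesysPullJobConfig {

    @Bean
    public Step pullContactCenterData(StepBuilderFactory steps) {
        // Placeholder tasklet standing in for "pull from Genesys Cloud,
        // persist to the reporting data store (on-prem or cloud)".
        return steps.get("pullContactCenterData")
                .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                .build();
    }

    @Bean
    public Job genesysPullJob(JobBuilderFactory jobs, Step pullContactCenterData) {
        // A monitoring dashboard (second work-scope item) would typically read
        // the JobRepository metadata this job writes on each run.
        return jobs.get("genesysPullJob")
                .start(pullContactCenterData)
                .build();
    }
}
```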

Read more
Molecular Connections

at Molecular Connections

4 recruiters
Molecular Connections
Posted by Molecular Connections
Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more
  1. Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and daemons of Hadoop clusters, which include NameNode, DataNode, Resource Manager, Node Manager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
  8. Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on cluster for computational analytics; installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
  11. Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience in Microsoft cloud and setting up clusters in Amazon EC2 & S3, including the automation of setting up and extending the clusters in the AWS cloud.
  23. Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provide the access for pulling the information we need for analysis. 
  30. Generated various kinds of knowledge reports using Power BI based on Business specification. 
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
  33. Experienced in designing, built, and deploying and utilizing almost all the AWS stack (Including EC2, S3,), focusing on high-availability, fault tolerance, and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle, Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Read more
Kapture CX

at Kapture CX

1 video
Arunashree JS
Posted by Arunashree JS
Bengaluru (Bangalore)
3 - 4 yrs
₹8L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Apache Kafka
+3 more

Roles and Responsibilities:


  • Proven experience in Java 8, Spring Boot, Microservices and API
  • Strong experience with Kafka, Kubernetes
  • Strong experience in using RDBMS (MySQL) and NoSQL.
  • Experience working in Eclipse or Maven environments
  • Hands-on experience in Unix and Shell scripting
  • Hands-on experience in fine-tuning application response and performance testing.
  • Experience in Web Services.
  • Strong analysis and problem-solving skills
  • Strong communication skills, both verbal and written
  • Ability to work independently with limited supervision
  • Proven ability to use own initiative to resolve issues
  • Full ownership of projects and tasks
  • Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines
  • Eagerness to learn
  • Strong team-working skills
Read more
Zycus

at Zycus

10 recruiters
Nafis Kurne
Posted by Nafis Kurne
Pune, Mumbai, Bangalore
14 - 26 yrs
₹25L - ₹55L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+18 more

EXPERTISE AND QUALIFICATIONS

  • 14+ years of experience in Software Engineering with at least 6+ years as a Lead Enterprise Architect preferably in a software product company
  • High technical credibility - ability to lead technical brainstorming, take decisions and push for the best solution to a problem
  • Experience in architecting Microservices based E2E Enterprise Applications
  • Experience in UI technologies such as Angular, Node.js or Fullstack technology is desirable
  • Experience with NoSQL technologies (MongoDB, Neo4j etc.)
  • Elastic Search, Kibana, ELK, Logstash.
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
  • Exposure in SaaS cloud-based platform.
  • Experience on Docker, Kubernetes etc.
  • Experience in planning, designing, developing and delivering Enterprise Software using Agile Methodology
  • Key Programming Skills: Java, J2EE with cutting edge technologies
  • Hands-on technical leadership with proven ability to recruit and mentor high performance talents including Architects, Technical Leads, Developers
  • Excellent team building, mentoring and coaching skills are a must-have
  • A proven track record of consistently setting and achieving high standards

Five Reasons Why You Should Join Zycus

1. Cloud Product Company: We are a Cloud SaaS Company, and our products are created by using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.

2. A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.

3. Move between Roles: We believe that change leads to growth and therefore we allow our employees to shift careers and move to different roles and functions within the organization

4. Get a Global Exposure: You get to work and deal with our global customers.

5. Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.


About Us

Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.


Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization.


Start your #CognitiveProcurement journey with us, as you are #MeantforMore

Read more
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
Sukhdeep Singh
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Windows Azure
+3 more
Role: IoT Application Development (Java)

Skill Set:
  • Proficiency in Java 11.
  • Strong knowledge of Spring Boot framework.
  • Experience with Kubernetes.
  • Familiarity with Kafka.
  • Understanding of Azure Cloud services.

Experience: 3 to 5 years | Location: Bangalore | Notice period: Immediate joiners

Job Description: We are seeking an experienced IoT Application Developer with expertise in Java to join our team in Bangalore. As a Java Developer, you will be responsible for designing, developing, and deploying IoT applications. You should have a solid understanding of Java 11 and the Spring Boot framework. Experience with Kubernetes and Kafka is also required. Familiarity with Azure Cloud services is essential. Your role will involve collaborating with the development team to build scalable and efficient IoT solutions using Java and related technologies.
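For illustration only, a telemetry-publishing component of the kind such an IoT application might contain could look like the sketch below. It assumes spring-kafka with a KafkaTemplate configured via application properties; the topic and class names are hypothetical, not part of the actual product.

```java
// Illustrative sketch: publishing IoT device telemetry to Kafka from a Spring Boot service.
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class TelemetryPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public TelemetryPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String deviceId, String telemetryJson) {
        // Keying by device id keeps readings from one device ordered within a partition.
        kafkaTemplate.send("device-telemetry", deviceId, telemetryJson);
    }
}
```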


Read more
Blue Yonder

at Blue Yonder

5 recruiters
GnanaMalleshwar Karri
Posted by GnanaMalleshwar Karri
Hyderabad, Bengaluru (Bangalore)
10 - 12 yrs
₹10L - ₹30L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB
+8 more

About Merchandise Operations (Merch Ops): Merchandise Operations (Merch Ops) is a merchandise management system positioned as a host system in retail solutions. It has the ability to maintain master/foundation data, create and manage purchase orders, create and manage prices & promotions, perform replenishment, and provide effective inventory control and financial management. Merch Ops provides business users with consistent, accurate, and timely data across an enterprise by allowing them to get the:


Right Goods in the...

• Right Silhouettes, Sizes and Colors; at the...

• Right Price; at the...

• Right Location; for the...

• Right Consumer; at the...

• Right Time; at the...

• Right Quantity.


About Team:

• Proven, passionate bunch of disruptors providing solutions that solve real-time supply chain problems.

• Well mixed experienced team with young members and experienced in product, domain, and Industry knowledge.

• Gained Expertise in designing and deploying massively scalable cloud native SaaS products

• The team currently comprises of associates across the globe and is expected to grow rapidly.


Our current technical environment:

• Software: React JS, Node JS, Oracle PL/SQL, Git, REST API, JavaScript.

• Application Architecture: Scalable three tier web application.

• Cloud Architecture: Private cloud, MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

• Frameworks/Others: Apache Tomcat, RDBMS, Jenkins, Nginx, Oracle Type ORM, Express.


What you'll be doing:

• As a Staff Engineer you will be responsible for the design of the features in the product roadmap

• Creating and encouraging good software development practices engineering-wide, driving strategic technical improvements, and mentoring other engineers.

• You will write code as we expect our technical leadership to be in the trenches alongside junior engineers, understanding root causes and leading by example

• You will mentor engineers

• You will own relationships with other engineering teams and collaborate with other functions within Blue Yonder

• Drive architecture and designs to become simpler, more robust, and more efficient.


• Lead design discussions and come up with robust, more efficient designs to achieve features in the product roadmap

• Take complete responsibility for the features developed, right from coding through deployment

• Introduce new technology and tools for the betterment of the product

• Guides fellow engineers to look beyond the surface and fix the root causes rather than symptoms.


What we are looking for:

• Bachelor's degree (B.E/B.Tech/M.Tech in Computer Science or a related specialization) and a minimum of 7 to 10 years of experience in software development, having been an Architect within at least the last 1-2 years.

• Strong programming experience and background in Node JS and React JS.

• Hands-on development skills along with architecture/design experience.

• Hands-on experience in designing, building, deploying and maintaining enterprise cloud solutions.

• Demonstrable experience, thorough knowledge, and interests in Cloud native architecture, Distributed micro-services, Multi-tenant SaaS solution and Cloud Scalability, performance, and High availability

• Experience with API management platforms & providing / consuming RESTful APIs

• Experience with varied tools such as REST, Hibernate, RDBMS, Docker, Kubernetes, Kafka, React.

• Hands-on development experience on Oracle PL/SQL.

• Experience with DevOps and infrastructure automation.

Read more
Recro

at Recro

1 video
32 recruiters
Amrita Singh
Posted by Amrita Singh
Bengaluru (Bangalore)
2 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+5 more
  • 2.5+ years of experience in development in Java technology.
  • Strong Java Basics
  • Spring Boot or Spring MVC
  • Hands-on experience with Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging Queue (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 3+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
Read more
BlueYonder
Bengaluru (Bangalore), Hyderabad
10 - 14 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Gradle
+13 more

·      Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.

·      BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders to build and manage BlueYonder's technology assets in the Data Platform and Services.

·      This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner. You are responsible for technical direction of Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.

·      Core responsibilities to include Architecting and designing (along with counterparts and distinguished Architects) a ground up cloud native (we use Azure) SaaS product in Order management and micro-fulfillment

·      The team currently comprises of 60+ global associates across US, India (COE) and UK and is expected to grow rapidly. The incumbent will need to have leadership qualities to also mentor junior and mid-level software associates in our team. This person will lead the Data platform architecture – Streaming, Bulk with Snowflake/Elastic Search/other tools

Our current technical environment:

·      Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake

·      Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture

·      Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)

·      Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite

Read more
Conviva

at Conviva

1 recruiter
Adarsh Sikarwar
Posted by Adarsh Sikarwar
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹40L / yr
Apache Kafka
Redis
Systems design
Data Structures
Algorithms
+5 more

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses. 


Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.

 

What you get to do in this role:

Work on extremely high-scale Rust web services or backend systems.

Design and develop solutions for highly scalable web and backend systems.

Proactively identify and solve performance issues.

Maintain a high bar on code quality and unit testing.

 

What you bring to the role:

5+ years of hands-on software development experience.

At least 2+ years of Rust development experience.

Knowledge of Cargo crates for Kafka, Redis, etc.

Strong CS fundamentals, including system design, data structures and algorithms.

Expertise in backend and web services development.

Good analytical and troubleshooting skills.

 

What will help you stand out:

Experience working with large scale web services and applications.

Exposure to Golang, Scala or Java

Exposure to Big data systems like Kafka, Spark, Hadoop etc.

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its kind-innovations like time-state analytics and AI-automated data modeling, that surfaces actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.  


Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 

Read more
Conviva

at Conviva

1 recruiter
Anusha Bondada
Posted by Anusha Bondada
Bengaluru (Bangalore)
3 - 6 yrs
₹20L - ₹40L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

As Conviva is expanding, we are building products providing deep insights into end-user experience for our customers.

 

Platform and TLB Team

The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real-time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both Spark-like backend infra and library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.

 

What You’ll Do

This is an individual contributor position. Expectations will be on the below lines:

  • Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva’s products
  • Responsible for the architecture of the Conviva platform
  • Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
  • Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements, etc.
  • Lead a team to develop a feature or parts of a product
  • Adhere to the Agile model of software development to plan, estimate, and ship per business priority

 

What you need to succeed

  • 5+ years of work experience in software development of data processing products.
  • Engineering degree in software or equivalent from a premier institute.
  • Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
  • Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
  • Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
  • Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
  • Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
  • Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its kind-innovations like time-state analytics and AI-automated data modeling, that surfaces actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.  

Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 


Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 8 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+5 more

Required Skills:


  • 3+ years of experience in development in Java technology.
  • Strong Java Basics
  • Spring Boot or Spring MVC
  • Hands-on experience with Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging Queue (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Conviva

at Conviva

1 recruiter
Anusha Bondada
Posted by Anusha Bondada
Bengaluru (Bangalore)
3 - 15 yrs
₹25L - ₹70L / yr
Scala
Akka
Algorithms
Data Structures
Functional programming
+6 more

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses. 

 

Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.

 

As Conviva is expanding, we are building products providing deep insights into end user experience for our customers.

 

Platform and TLB Team

The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both Spark-like backend infra and a library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest of big data technologies to build solutions for use-cases across multiple verticals. Lead technology innovation and advancement that will have big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.

 

What You’ll Do

This is an individual contributor position. Expectations will be on the below lines:

  • Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
  • Responsible for the architecture of the Conviva platform
  • Build features, enhancements, new services, and bug fixing in Scala and Java on a Jenkins-based pipeline to be deployed as Docker containers on Kubernetes
  • Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements etc.
  • Lead a team to develop a feature or parts of the product
  • Adhere to the Agile model of software development to plan, estimate, and ship per business priority

 

What you need to succeed

  • 9+ years of work experience in software development of data processing products.
  • Engineering degree in software or equivalent from a premier institute.
  • Excellent knowledge of fundamentals of Computer Science like algorithms and data structures. Hands-on with functional programming and know-how of its concepts
  • Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
  • Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
  • Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
  • Experience/knowledge of Akka/Lagom framework and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event-streaming, CQRS and DDD to build large microservice architectures will be a big plus
  • Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity

 

Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its kind-innovations like time-state analytics and AI-automated data modeling, that surfaces actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.  

Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets! 



Read more
Accolite Digital
Nitesh Parab
Posted by Nitesh Parab
Bengaluru (Bangalore), Hyderabad, Gurugram, Delhi, Noida, Ghaziabad, Faridabad
4 - 8 yrs
₹5L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SSIS
SQL Server Integration Services (SSIS)
+10 more

Job Title: Data Engineer

Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.

Responsibilities:

  • Design, build, and maintain data pipelines to collect, store, and process data from various sources.
  • Create and manage data warehousing and data lake solutions.
  • Develop and maintain data processing and data integration tools.
  • Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
  • Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
  • Ensure data quality and integrity across all data sources.
  • Develop and implement best practices for data governance, security, and privacy.
  • Monitor data pipeline performance and errors, and troubleshoot issues as needed.
  • Stay up-to-date with emerging data technologies and best practices.

Requirements:

Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience with ETL tools like Matillion, SSIS, Informatica

Experience with SQL and relational databases such as SQL Server, MySQL, PostgreSQL, or Oracle.

Experience in writing complex SQL queries

Strong programming skills in languages such as Python, Java, or Scala.

Experience with data modeling, data warehousing, and data integration.

Strong problem-solving skills and ability to work independently.

Excellent communication and collaboration skills.

Familiarity with big data technologies such as Hadoop, Spark, or Kafka.

Familiarity with data warehouse/Data lake technologies like Snowflake or Databricks

Familiarity with cloud computing platforms such as AWS, Azure, or GCP.

Familiarity with Reporting tools

Teamwork/ growth contribution

  • Helping the team take interviews and identify the right candidates
  • Adhering to timelines
  • Timely status communication and upfront communication of any risks
  • Teach, train, and share knowledge with peers.
  • Good Communication skills
  • Proven abilities to take initiative and be innovative
  • Analytical mind with a problem-solving aptitude

Good to have :

Master's degree in Computer Science, Information Systems, or a related field.

Experience with NoSQL databases such as MongoDB or Cassandra.

Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.

Knowledge of machine learning and statistical modeling techniques.

If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.

Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 7 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+5 more
  • 3+ years of experience in development in Java technology.
  • Strong Java Basics
  • Spring Boot or Spring MVC
  • Hands-on experience with Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging Queue (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore), Delhi, Gurugram, Noida
3 - 7 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+4 more

Required Education:


B.Tech./ BE - Computer, IT, Electronics only

Required Skills:


  • 2+ years of experience in development in Java technology.
  • Strong Java Basics
  • Spring Boot or Spring MVC
  • Hands-on experience with Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging Queue (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus


Read more
Remote, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹20L - ₹55L / yr
Python
Django
Flask
Data Analytics
Data Science
+11 more

CTC Budget: 35-55LPA

Location: Hyderabad (Remote after 3 months WFO)


Company Overview:


An 8-year-old IT services and consulting company based in Hyderabad, providing services in maximizing product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


  • 6 plus years of experience as a Python developer.
  • Experience in web development using Python and the Django framework.
  • Experience in Data Analysis and Data Science using Pandas, NumPy and scikit-learn (good to have)
  • Experience in developing user interfaces using HTML, JavaScript, CSS.
  • Experience in server-side templating languages including Jinja 2 and Mako
  • Knowledge of Kafka and RabbitMQ (good to have)
  • Experience with Docker, Git and AWS
  • Ability to integrate multiple data sources into a single system.
  • Ability to collaborate on projects and work independently when required.
  • DB (MySQL, Postgres, SQL)


Selection Process: 2-3 Interview rounds (Tech, VP, Client)

Read more
Kapture CX

at Kapture CX

1 video
Deepika Dhanraj
Posted by Deepika Dhanraj
Bengaluru (Bangalore)
2 - 7 yrs
₹8L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Apache Kafka
+7 more

Kapture CRM is an enterprise-focused Service automation SaaS platform. We help 500+ enterprises in 14 countries to manage their customer service in a more intelligent, contextual way.


Roles & Responsibilities :


    * Proven experience in Java8, Spring Boot, Microservices/API

    * Strong experience with Kafka, Kubernetes

    * Strong experience in using RDBMS (Mysql) and NoSQL.

    * Experience in working in Eclipse / Maven environments.

    * Hands-on experience in Unix / Shell scripting.

    * Hands-on experience in fine-tuning application response/performance testing.

    * Experience in Web Services.

    * Strong analysis & problem-solving skills

    * Strong communication skills - both verbal and written

    * Ability to work independently with limited supervision

    * Proven ability to use own initiative to resolve issues

    * Full ownership of projects/tasks

    * Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines

    * Eagerness to learn

    * Strong team-working skills

Read more
Hyderabad, Bengaluru (Bangalore)
8 - 12 yrs
₹30L - ₹50L / yr
PHP
JavaScript
React.js
Angular (2+)
AngularJS (1.x)
+17 more

CTC Budget: 35-50LPA

Location: Hyderabad/Bangalore

Experience: 8+ Years


Company Overview:


An 8-year-old IT services and consulting company based in Hyderabad, providing services in maximizing product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.


● Work with, learn from, and contribute to a diverse, collaborative development team

● Use plenty of PHP, Go, JavaScript, MySQL, PostgreSQL, ElasticSearch, Redshift, AWS Services and other technologies

● Build efficient and reusable abstractions and systems

● Create robust cloud-based systems used by students globally at scale

● Experiment with cutting edge technologies and contribute to the company’s product roadmap

● Deliver data at scale to bring value to clients

Requirements


You will need:

● Experience working with a server side language in a full-stack environment

● Experience with various database technologies (relational, nosql, document-oriented, etc) and query concepts in high performance environments

● Experience in one of these areas: React, Backbone

● Understanding of ETL concepts and processes

● Great knowledge of design patterns and back end architecture best practices

● Sound knowledge of Front End basics like JavaScript, HTML, CSS

● Experience with TDD, automated testing

● 12+ years’ experience as a developer

● Experience with Git or Mercurial

● Fluent written & spoken English

It would be great if you have:

● B.Sc or M.Sc degree in Software Engineering, Computer Science or similar

● Experience and/or interest in API Design

● Experience with Symfony and/or Doctrine

● Experience with Go and Microservices

● Experience with message queues e.g. SQS, Kafka, Kinesis, RabbitMQ

● Experience working with a modern Big Data stack

● Contributed to open source projects

● Experience working in an Agile environment

Read more
Recro

at Recro

1 video
32 recruiters
SD S
Posted by SD S
Bengaluru (Bangalore)
3 - 7 yrs
₹7L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Spring
+19 more

Job Summary:


We are looking for a skilled and experienced Java Developer to join our team. As a Java Developer, you will be responsible for developing and maintaining our applications using Java, Spring framework, and other related technologies. The ideal candidate should have a strong understanding of object-oriented programming principles, as well as experience with a variety of technologies such as SQL, NoSQL, and cloud computing.


Responsibilities:


  • Design, develop, and maintain our applications using Java, Spring framework, and other related technologies
  • Write clean, efficient, and optimized code for applications
  • Collaborate with cross-functional teams to understand user requirements and deliver high-quality solutions
  • Develop and maintain backend systems using Spring framework
  • Work with databases, including SQL and NoSQL
  • Ensure code quality and maintain documentation
  • Troubleshoot and debug applications
  • Stay updated with emerging trends and technologies in Java development
  • Work with other teams to deploy and maintain applications


Requirements:


  • 3-7 years of experience in Java development
  • Strong understanding of object-oriented programming principles
  • Experience with Java, Spring framework, and related technologies
  • Familiarity with databases, including SQL and NoSQL
  • Knowledge of cloud computing is a plus
  • Excellent problem-solving and debugging skills
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a team
  • Bachelor's degree in computer science or a related field


Key Skills:


  • Strong proficiency in Java programming language
  • Experience with Spring framework, including Spring Boot and Spring MVC
  • Familiarity with cloud platforms such as AWS, GCP, and Azure
  • Experience building RESTful APIs
  • Knowledge of microservices architecture
  • Familiarity with SQL and relational databases such as MySQL and Postgres
  • Familiarity with NoSQL databases such as MongoDB and Redis
  • Experience with messaging systems such as Kafka and RabbitMQ
  • Experience with containerization tools such as Docker and Kubernetes
  • Understanding of software development principles and experience with SDLC methodologies
  • Experience with Git version control and build tools such as Maven and Gradle
  • Familiarity with front-end technologies such as Angular and React is a plus
  • Strong problem-solving and analytical skills
  • Good communication and interpersonal skills
  • Ability to work independently and take ownership of tasks
  • Experience with test-driven development and unit testing frameworks such as JUnit and Mockito
  • Familiarity with CI/CD tools such as Jenkins is a plus
  • Familiarity with caching technologies such as Redis is a plus
  • Working knowledge of design patterns and software architecture principles is a plus.


Read more
Recro

at Recro

1 video
32 recruiters
Mohit Arora
Posted by Mohit Arora
Bengaluru (Bangalore)
2 - 7 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Amazon Web Services (AWS)
+6 more
  • 2+ years of experience in development in Java technology.
  • Strong Java Basics
  • Spring Boot or Spring MVC
  • Hands-on experience with Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging Queue (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Gipfel & Schnell Consultings Pvt Ltd
Aravind Kumar
Posted by Aravind Kumar
Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
Java
+11 more

Minimum 4 to 10 years of experience in testing distributed backend software architectures/systems.

• 4+ years of work experience in test planning and automation of enterprise software

• Expertise in programming using Java or Python and other scripting languages.

• Experience with one or more public clouds is expected.

• Comfortable with build processes, CI processes, and managing QA environments, as well as working with build management tools like Git and Jenkins

• Experience with performance and scalability testing tools.

• Good working knowledge of relational databases, logging, and monitoring frameworks is expected.

• Familiarity with how systems interact with an application, e.g. Elasticsearch, Mongo, Kafka, Hive, Redis, AWS

Read more
Gipfel & Schnell Consultings Pvt Ltd
Aravind Kumar
Posted by Aravind Kumar
Bengaluru (Bangalore)
5 - 12 yrs
₹20L - ₹33L / yr
skill iconJava
skill iconJavascript
skill iconReact.js
skill iconAngular (2+)
skill iconAngularJS (1.x)
+16 more

Essential Responsibilities:

 

  • Demonstrate an understanding of the Agile software development life cycle and distinguish the core inputs and outputs in each cycle.
  • Work closely with your peers and keep engaging in a fast-paced technical design and development team
  • Execute in a fast-paced delivery mode and focus on delivering tasks to meet monthly and quarterly digital product release goals
  • Lead impact assessment and decisions related to technology choices, design / architectural considerations and implementation strategy
  • Maintain code quality through best practices, unit testing and code quality automation
  • Demonstrate the ability to make informed technology choices after due diligence and impact assessment
  • Help in designing interfaces and information exchange between modules
  • Articulate the need for scalability and understand the importance of improving quality through testing.
  • Be an expert in writing code that meets standards and delivers the desired functionality using the technology selected for the project
  • Drive design reviews, define interfaces between code modules, and apply existing technology to designs
  • Be an expert in assessing application performance and optimizing/improving it through design and best coding practices


Qualifications/Requirements:

 

  • Minimum Bachelor's Degree in Computer Science, Computer Engineering or in "STEM" Majors (Science, Technology, Engineering, and Math)
  • 6+ years of experience in Full Stack Software Development within the enterprise or software services domain


Desired Skills:

 

  • Expertise in full stack software development and awareness of 12 Factor software patterns
  • Experience and knowledge of patterns and anti-patterns of microservices-based architecture design
  • Experience developing and deploying applications on cloud (Azure, AWS, or GCP), on-premise, and hybrid-based architectures
  • Mid-level to expert in one or more of the following UI JavaScript technologies: client-side HTML5, jQuery, jQuery UI, Knockout.js, Polymer, AngularJS, ReactJS, Bootstrap
  • Mid-level to expert in one or more back-end development languages: .NET, Java, Python, or Scala
  • Very solid API skills (e.g. Express.js/Node.js, GraphQL/Relay, Flask, Jersey, Java Spring REST, or WebApi2)
  • Skilled in the use of Java, Kafka, and Spark streaming technologies (a minimal streaming sketch follows this list)
  • Experience with containerization technologies such as Rancher, Kubernetes, Docker and Helm
  • Hands-on experience in data storage environments of many types (RDBMS, NoSQL, HDFS, etc.)
  • Knowledge of GitLab, Jenkins and Artifactory
  • Solid foundation in data structures, algorithms, and OO Design with rock-solid programming skills
  • Security: Identity Management and Access, application security and static code analysis
  • Proven success working in and promoting a rapidly changing, collaborative, and iterative product development environment
  • Strong interpersonal skills, analytical skills, combined with intellectual curiosity, and a desire and ability to "get things done" are essential
  • Agile Scrum development experience
  • Added advantage to those having experience in multi-tenant SaaS Platform and Developers' Portal development
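
Illustrating the Java/Kafka/Spark streaming item above, here is a minimal, hedged sketch that reads a Kafka topic with Spark Structured Streaming and writes the messages to the console. The broker address and the "clickstream" topic are assumptions; a real pipeline would parse and aggregate the payload and write to a durable sink.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;

    public class KafkaStreamJob {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("kafka-stream-sketch")
                    .getOrCreate();

            // Read the "clickstream" topic (illustrative name) as an unbounded table.
            Dataset<Row> events = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092")
                    .option("subscribe", "clickstream")
                    .load()
                    .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

            // Write each micro-batch to the console; swap for a sink like Delta or HDFS in practice.
            StreamingQuery query = events.writeStream()
                    .format("console")
                    .outputMode("append")
                    .start();

            query.awaitTermination();
        }
    }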
Read more
Zycus

at Zycus

10 recruiters
Viren Bhuptani
Posted by Viren Bhuptani
Mumbai, Pune, Bengaluru (Bangalore)
15 - 25 yrs
Best in industry
Microservices
J2EE
skill iconSpring Boot
skill iconJava
Hibernate (Java)
+3 more

We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions.

  • As an owner of the product you will be required to plan and execute the product road map and provide technical leadership to the engineering team.
  • You will have to collaborate with Product Management and Implementation teams and build a commercially successful product.
  • You will be responsible to recruit & lead a team of highly skilled software engineers and provide strong hands on engineering leadership.
  • Deep technical knowledge of software product engineering is a must, including: AWS, Java 8/J2EE, Node.js, React.js, full-stack development, NoSQL databases (MongoDB, Cassandra, Neo4j), Elasticsearch/Kibana (ELK), Kafka, Redis, Docker, Kubernetes, architecture concepts, design patterns, data structures & algorithms, distributed computing, multi-threading, Apache Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc.



  • 16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company.
  • Hands-on technical leadership with proven ability to recruit high performance talent
  • High technical credibility - ability to audit technical decisions and push for the best solution to a problem.
  • Experience building E2E Application right from backend database to persistent layer.
  • Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred.
  • Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, Dynamodb, etc.)
  • Elastic Search, Kibana, ELK, Logstash.
  • Experience in developing Enterprise Software using Agile Methodology.
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
  • SaaS cloud-based platform exposure.
  • Experience on Docker, Kubernetes etc.
  • End-to-end ownership of design and development, with exposure to delivering quality enterprise products/applications
  • A track record of setting and achieving high standards
  • Strong understanding of modern technology architecture
  • Key Programming Skills: Java, J2EE with cutting edge technologies
  • Excellent team building, mentoring and coaching skills are a must-have


Read more
Recro

at Recro

1 video
32 recruiters
Mounashree JP
Posted by Mounashree JP
Remote, Bengaluru (Bangalore), Delhi, Noida
2 - 6 yrs
Best in industry
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Data Structures
+6 more

Requirements

  • 2+ years of experience in development with Java technology.
  • Strong Java Basics
  • Linux
  • SpringBoot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Java 8
  • Any Caching Mechanism
  • Good at problem-solving


Good to Have Skills:

  • 2+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding AI/ML algorithms is a plus.


Read more
Recro

at Recro

1 video
32 recruiters
Amrita Singh
Posted by Amrita Singh
Bengaluru (Bangalore), Pune, Noida
3 - 6 yrs
₹6L - ₹20L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Microservices
+4 more

Requirements

  • 3+ years of experience in development with Java technology.
  • Strong Java Basics
  • Linux
  • SpringBoot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Java 8
  • Any Caching Mechanism
  • Good at problem-solving


Good to Have Skills:

  • 3 years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding AI/ML algorithms is a plus.


Read more
Telstra

at Telstra

1 video
1 recruiter
Mahesh Balappa
Posted by Mahesh Balappa
Bengaluru (Bangalore), Hyderabad, Pune
3 - 7 yrs
Best in industry
Spark
Hadoop
NOSQL Databases
Apache Kafka

About Telstra

 

Telstra is Australia’s leading telecommunications and technology company, with operations in more than 20 countries, including in India, where we’re building a new Innovation and Capability Centre (ICC) in Bangalore.

 

We’re growing, fast, and for you that means many exciting opportunities to develop your career at Telstra. Join us on this exciting journey, and together, we’ll reimagine the future.

 

Why Telstra?

 

  • We're an iconic Australian company with a rich heritage that's been built over 100 years. Telstra is Australia's leading Telecommunications and Technology Company. We've been operating internationally for more than 70 years.
  • International presence spanning over 20 countries.
  • We are one of the 20 largest telecommunications providers globally
  • At Telstra, the work is complex and stimulating, but with that comes a great sense of achievement. We are shaping tomorrow's modes of communication with our innovation-driven teams.

 

Telstra offers an opportunity to make a difference to lives of millions of people by providing the choice of flexibility in work and a rewarding career that you will be proud of!

 

About the team

Being part of Networks & IT means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy.

With us, you'll be working with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.

 

Focus of the role

Any new engineer who joins the data chapter will mostly be developing reusable data processing and storage frameworks that can be used across the data platform.

 

About you

To be successful in the role, you'll bring skills and experience in:-

 

Essential 

  • Hands-on experience in Spark Core, Spark SQL, SQL/Hive/Impala, Git/SVN/any other VCS, and data warehousing
  • Skilled in the Hadoop ecosystem (HDP/Cloudera/MapR/EMR, etc.)
  • Azure Data Factory/Airflow/Control-M/Luigi
  • PL/SQL
  • Exposure to NoSQL (HBase/Cassandra/GraphDB (Neo4j)/MongoDB)
  • File formats (Parquet/ORC/AVRO/Delta/Hudi, etc.)
  • Kafka/Kinesis/Event Hubs

 

Highly Desirable

Experience and knowledgeable on the following:

  • Spark Streaming
  • Cloud exposure (Azure/AWS/GCP)
  • Azure data offerings - ADF, ADLS2, Azure Databricks, Azure Synapse, Eventhubs, CosmosDB etc.
  • Presto/Athena
  • Azure DevOps
  • Jenkins/ Bamboo/Any similar build tools
  • Power BI
  • Prior experience building, or working in a team that builds, reusable frameworks
  • Data modelling.
  • Data Architecture and design principles. (Delta/Kappa/Lambda architecture)
  • Exposure to CI/CD
  • Code Quality - Static and Dynamic code scans
  • Agile SDLC      

 

If you've got a passion to innovate, want to succeed as part of a great team, and are looking for the next step in your career, we'd welcome you to apply!

___________________________

 

We’re committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural and linguistic backgrounds and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.

To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit tel.st/thrive.

Read more
Kloud9 Technologies
manjula komala
Posted by manjula komala
Bengaluru (Bangalore)
3 - 6 yrs
₹18L - ₹27L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

About Kloud9:


Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers help retailers launch a successful cloud initiative so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce the cloud adoption time and effort so you can directly benefit from lower migration costs.


Kloud9 was founded with the vision of bridging the gap between E-commerce and cloud. The E-commerce of any industry is limiting and poses a huge challenge in terms of the finances spent on physical data structures.


At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.


Our sole focus is to provide cloud expertise to retail industry giving our clients the empowerment that will take their business to the next level. Our team of proficient architects, engineers and developers have been designing, building and implementing solutions for retailers for an average of more than 20 years.


We are a cloud vendor that is both platform and technology independent. Our vendor independence not just provides us with a unique perspective into the cloud market but also ensures that we deliver the cloud solutions available that best meet our clients' requirements.



What we are looking for:


●       3+ years’ experience developing Big Data & Analytic solutions

●       Experience building data lake solutions leveraging Google Data Products (e.g. Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.), Hive, Spark

●       Experience with relational SQL/No SQL

●       Experience with Spark (Scala/Python/Java) and Kafka

●       Work experience with using Databricks (Data Engineering and Delta Lake components)

●       Experience with source control tools such as GitHub and related dev process

●       Experience with workflow scheduling tools such as Airflow

●       In-depth knowledge of any scalable cloud vendor(GCP preferred)

●       Has a passion for data solutions

●       Strong understanding of data structures and algorithms

●       Strong understanding of solution and technical design

●       Has a strong problem solving and analytical mindset

●       Experience working with Agile Teams.

●       Able to influence and communicate effectively, both verbally and written, with team members and business stakeholders

●       Able to quickly pick up new programming languages, technologies, and frameworks

●       Bachelor’s Degree in computer science


Why Explore a Career at Kloud9:


With job opportunities in prime locations of US, London, Poland and Bengaluru, we help build your career paths in cutting edge technologies of AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with their creativity and innovative solutions. Our vested interest in our employees translates to deliver the best products and solutions to our customers!

Read more
Recro

at Recro

1 video
32 recruiters
Shifat S
Posted by Shifat S
Bengaluru (Bangalore), Noida, Mumbai
4 - 7 yrs
Best in industry
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Spring MVC
+4 more

Required Skills:


  • 3+ years of experience in development with Java technology.
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any caching mechanism
  • Good at problem-solving


Good to Have Skills:


  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding AI/ML algorithms is a plus.


Read more
Merge

at Merge

2 candid answers
Sweta Aneja
Posted by Sweta Aneja
Bengaluru (Bangalore)
2 - 5 yrs
₹10L - ₹40L / yr
Distributed Systems
Microservices
skill iconGo Programming (Golang)
skill iconRuby on Rails (ROR)
skill iconRuby
+6 more

What are we looking for?

We are looking for hands-on coders who love what they do, have high attention to detail and are looking for challenging opportunities which involve building products from scratch. Someone who is proactive, and always keen to learn.

 

What will you be doing?

On a daily basis, some of your work will involve but is not limited to:

  • Write clean, secure, and well-tested code
  • Build tools and integrate systems to scale the effectiveness of the product

Work Culture at Merge:

Commitment to excellence - In every output, we produce as individuals and as a company, we have to strive for world-class quality. We’re making a change in the world. It will push us out of our comfort zones. We operate in a rapidly changing market and strive to deliver high-quality products faster than anyone else.

We get it done - We take ownership of what we do. Working here is about really, truly owning everything you do. There’s no such thing as “Not my job.” If you see a problem that needs solving, you can – and should – be the one to solve it

Requirements

Skills That Will Help You Excel At Merge

  • You have 2 to 5 years of experience building highly reliable and scalable backend systems
  • Experience in two or more languages. Go, Node.js, Python, or Java would be ideal.
  • You have experience in high-throughput distributed systems and microservices
  • Experience with AWS and CI/CD workflow
  • Driven, and passionate about building products
  • You take ownership of your code
  • You have good communication skills in English
  • You are familiar with security best practices
Read more
Recro

at Recro

1 video
32 recruiters
Sahana gowda
Posted by Sahana gowda
Bengaluru (Bangalore)
3 - 5 yrs
₹5L - ₹18L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Spring MVC
+6 more
B.Tech./ BE - Computer, IT, Electronics only
Required Skills:
 
  • 2+ years of experience in development with Java technology.
  • Strong Java Basics
  • SpringBoot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving
 
Good to Have Skills:
 
  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
Read more
Inviz Ai Solutions Private Limited
Shridhar Nayak
Posted by Shridhar Nayak
Bengaluru (Bangalore)
4 - 8 yrs
Best in industry
Spark
Hadoop
Big Data
Data engineering
PySpark
+8 more

InViz is a Bangalore-based startup helping enterprises simplify the search and discovery experiences for both their end customers and their internal users. We use state-of-the-art technologies in Computer Vision, Natural Language Processing, Text Mining, and other ML techniques to extract information/concepts from data of different formats - text, images, videos - and make them easily discoverable through simple human-friendly touchpoints.

 

TSDE - Data 

Data Engineer:

 

  • Should have a total of 3-6 years of experience in Data Engineering.
  • Should have experience coding data pipelines on GCP.
  • Prior experience on Hadoop systems is ideal, as the candidate may not have end-to-end GCP experience.
  • Strong in programming languages like Scala, Python, Java.
  • Good understanding of various data storage formats and their advantages.
  • Should have exposure to GCP tools to develop end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integration of API-based data sources).
  • Should have a business mindset to understand data and how it will be used for BI and analytics purposes.
  • Data Engineer certification preferred

 

Experience in Working with GCP tools like

 
 

Store: Cloud SQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore

Ingest: Stackdriver, Pub/Sub, App Engine, Kubernetes Engine, Kafka, Dataprep, microservices

Schedule: Cloud Composer

Processing: Cloud Dataproc, Cloud Dataflow, Cloud Dataprep

CI/CD: Bitbucket + Jenkins / GitLab

Atlassian Suite
Read more
Accion Labs

at Accion Labs

14 recruiters
Jayasri Palanivelu
Posted by Jayasri Palanivelu
Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
6 - 10 yrs
₹15L - ₹30L / yr
skill iconJava
skill iconSpring Boot
Hibernate (Java)
Microservices
NOSQL Databases
+3 more

Desired Candidate Profile


  • A team focus with strong collaboration and communication skills
  • Exceptional ability to quickly grasp high-level business goals, derive requirements, and translate them into effective technical solutions
  • Exceptional object-oriented thinking, design and programming skills (Java 8 or 11)
  • Expertise with the following technologies: data structures, design patterns, code versioning tools (GitHub/Bitbucket/..), XML, JSON, Spring Batch, RESTful services, Spring Cloud, Grafana (knowledge/experience), Kafka, Spring Boot, microservices, DB/NoSQL, Docker, Kubernetes, AWS/GCP, architecture design (patterns), Agile, JIRA.
  • Penchant toward self-motivation and continuous improvement; these words should describe you: dedicated, energetic, curious, conscientious, and flexible.
Read more
SmartCoin

at SmartCoin

1 recruiter
Suchoritha Chatterjee
Posted by Suchoritha Chatterjee
Bengaluru (Bangalore)
3 - 5 yrs
₹18L - ₹40L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Ebean
+5 more
Responsibilities and Skills:
• Knowledge of Agile methodologies & best practices for the SDLC (including coding standards, code reviews, source control management & build processes).
• Must have experience in independently designing factories/APIs/interfaces in Java.
• Highly skilled in using ORM tools like Hibernate/Ebean.
• Must have a good understanding of relational databases (MySQL/Postgres), transactions, and indexing.
• Must be able to do performance optimization and use multi-threading wherever possible (a small executor sketch follows this list).
• Experience with Kafka, BigQuery, and Elasticsearch is a plus.
• Drive test coverage and continuous delivery automation within the team.
• Experience building highly available and scalable distributed systems is a plus.
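
As a hedged illustration of the multi-threading expectation, the sketch below fans independent work out across a fixed thread pool with ExecutorService. The task contents and pool size are assumptions made only for the example.

    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ParallelLookup {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(4); // assumed pool size
            try {
                // Illustrative tasks; a real service might run independent DB or API calls here.
                List<Callable<String>> tasks = List.of(
                        () -> "profile-loaded",
                        () -> "limits-checked",
                        () -> "score-fetched");

                List<Future<String>> results = pool.invokeAll(tasks);
                for (Future<String> f : results) {
                    System.out.println(f.get()); // blocks until each task finishes
                }
            } finally {
                pool.shutdown();
            }
        }
    }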
Preferred Qualifications:
• 3-4 years of experience, with a Bachelor’s/Master's degree in Computer Science, Math, or a related technical domain from a reputed organization.
• Strong communication & mentoring skills
Read more
Paytm

at Paytm

41 recruiters
Anuj Kanojia
Posted by Anuj Kanojia
Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Mumbai, Bengaluru (Bangalore), Pune
9 - 15 yrs
Best in industry
J2EE
skill iconSpring Boot
skill iconJava
Microservices
Apache Kafka
About Us: 
Paytm is India’s leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks’ financial instruments. To further enhance merchants’ business, Paytm offers merchants commerce services through advertising and Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners.  
 
About the role:
As a Principal Engineer, you will help define the technical design and implementation roadmap across multiple solutions and will work with engineering leadership to ensure we resource and equip our squads with the right expertise to deliver those solutions.  
 
Requirements: 
10 to 14 years of strong software design/development experience in building massively large-scale distributed internet systems and products
Hands-on experience in Advanced Java, Spring Boot, AWS, Node.js
Experience and knowledge of open-source tools & frameworks and broader cutting-edge technologies around server-side development
Should be an active contributor to developer communities like Stack Overflow, Topcoder, GitHub, and Google Developer Groups (GDGs). 
Superior organization, communication, interpersonal and leadership skills.
Must be a self-starter who can work well with minimal guidance and in a fluid environment. 
 
Preferred Qualifications: Bachelor's/Master's Degree in Computer Science or equivalent 
 
Skills that will help you succeed in this role: 
Expertise in Java, DB: RDBMS, Messaging: Kafka/RabbitMQ, Caching: Redis/Aerospike, Microservices, AWS (a small cache-aside sketch follows below)
Strong experience in scaling, performance tuning & optimization at both API and storage layers
Problem Solver with a passion for excellence.  
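
To make the caching expectation concrete, here is a minimal, hedged cache-aside sketch using an in-memory ConcurrentHashMap; in production the map would typically be replaced by Redis or Aerospike and the loader by a real database call. All names are illustrative assumptions.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    public class UserProfileCache {
        private final Map<String, String> cache = new ConcurrentHashMap<>();
        private final Function<String, String> loader; // e.g. a DB lookup in a real service

        public UserProfileCache(Function<String, String> loader) {
            this.loader = loader;
        }

        public String get(String userId) {
            // Cache-aside: return the cached value if present, otherwise load and store it.
            return cache.computeIfAbsent(userId, loader);
        }

        public void invalidate(String userId) {
            cache.remove(userId); // call after the underlying record changes
        }

        public static void main(String[] args) {
            UserProfileCache profiles = new UserProfileCache(id -> "profile-for-" + id);
            System.out.println(profiles.get("42")); // loads and caches
            System.out.println(profiles.get("42")); // served from cache
        }
    }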
 
Why join us:
Because you get an opportunity to make a difference, and have a great time doing that
You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve
You should work with us if you think seriously about what technology can do for people
We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. 
Learn more about the exciting work we do in Tech by reading our https://paytm.com/blog/engineering/">Engineering blogs
 
Compensation:
If you are the right fit, we believe in creating wealth for you.
With enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story! 
Read more
pricing of digital content
Agency job
via Qrata by Rayal Rajan
Bengaluru (Bangalore)
1 - 5 yrs
₹9L - ₹35L / yr
skill iconRust
SQL
NOSQL Databases
skill iconJavascript
skill iconElastic Search
+3 more

We are looking for a Rust Developer to join our cutting-edge development team as it grows. The candidate must be comfortable working in an agile environment and able to take the lead when necessary.

 

Responsibilities:

  • Responsible for developing the product as per the product specification defined by the product managers
  • Responsible for performing research on the best methods of implementing the requirements
  • Author and curate technical documentation to support delivery, maintenance, and adoption
  • Work with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and devise governance strategies to optimise the overall performance
  • Design and develop automated deployment and maintenance mechanisms
  • Solving development challenges and making architectural decisions by understanding the larger picture of the project goals
  • Expanding your existing skill set, and picking up on various rust-dependent frameworks
  • Confidently communicating and collaborating with your fellow developers in an office set-up.

 

Requirements:

  • Must have experience in Rust programming language
  • Have excellent knowledge of different data structures and algorithms
  • Working knowledge of another programming language (Python, Java, or JavaScript) is good to have
  • Experience with, or understanding of, Kafka or Redis, Cloud infrastructure services, and Docker is an added advantage
  • Experience with SQL or NoSQL databases, MySQL, MongoDB, Elasticsearch, etc.
  • Experience in Backend and APIs development
  • Experience in analysing and optimising the platform's performance
Read more
SmartCoin

at SmartCoin

1 recruiter
Suchoritha Chatterjee
Posted by Suchoritha Chatterjee
Bengaluru (Bangalore)
6 - 9 yrs
₹24L - ₹40L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
skill iconRedis
+1 more
Roles & Responsibilities:
• Determining project requirements and developing work schedules for the team.
• Delegating tasks and achieving daily, weekly, and monthly goals.
• Liaising with team members, management, and clients to ensure projects are completed to standard.
• Identifying risks and forming contingency plans as soon as possible.
• Analyzing existing operations and scheduling training sessions and meetings to discuss improvements.
• Keeping up-to-date with industry trends and developments.
• Updating work schedules and performing troubleshooting as required.
• Motivating staff and creating a space where they can ask questions and voice their concerns.
• Being transparent with the team about challenges, failures, and successes.
• Writing progress reports and delivering presentations to the relevant stakeholders.
Technical Lead Requirements:
• Bachelor’s degree in computer science, engineering, or a related field.
• Relevant Management certification may be required.
• Experience in a similar role would be advantageous.
• Excellent technical, diagnostic, and troubleshooting skills.
• Strong leadership and organizational abilities.
• Willingness to build professional relationships with staff and clients.
• Excellent communication, motivational, and interpersonal skills
Read more
PL

at PL

Agency job
Navi Mumbai, Bengaluru (Bangalore), Pune
4 - 10 yrs
₹1L - ₹15L / yr
Apache Kafka
Kafka
skill iconJava
skill iconPython
  • 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors. 
  • 5+ years of experience in Confluent Kafka / 3+ years of experience in Confluent Kafka 
  • Cloud Kafka, Control Central, Rest Proxy, HA Proxy, Confluent Kafka Connect, Confluent Kafka Security features 
Read more
Netcore Cloud
Mumbai, Navi Mumbai, Bengaluru (Bangalore), Pune
5 - 9 yrs
₹10L - ₹35L / yr
skill iconJava
skill iconSpring Boot
Apache Kafka
RabbitMQ
Cassandra
+3 more

Job Title - Senior Java Developers

Job Description - Backend Engineer - Lead (Java)

Mumbai, India | Engineering Team | Full-time

 

Are you passionate enough to be a crucial part of a highly analytical and scalable user engagement platform?

Are you ready to learn new technologies and willing to step out of your comfort zone to explore and learn new skills?

 

If so, this is an opportunity for you to join a high-functioning team and make your mark on our organisation!

 

The Impact you will create:

  • Build campaign generation services which can send app notifications at a speed of 10 million a minute
  • Dashboards to show real-time key performance indicators to clients
  • Develop complex user segmentation engines which create segments on terabytes of data within a few seconds
  • Building highly available & horizontally scalable platform services for ever-growing data
  • Use cloud-based services like AWS Lambda for blazing fast throughput & auto scalability
  • Work on complex analytics on terabytes of data like building Cohorts, Funnels, User path analysis, Recency Frequency & Monetary analysis at blazing speed
  • You will build backend services and APIs to create scalable engineering systems.
  • As an individual contributor, you will tackle some of our broadest technical challenges that requires deep technical knowledge, hands-on software development and seamless collaboration with all functions.
  • You will envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience.
  • Collaborating with various highly functional teams in the company to meet deliverables throughout the software development lifecycle.
  • Identifying areas of improvement through data insights and research.

 

What we look for?

  • 5-9 years of experience in backend development and must have worked on Java/shell/Perl/python scripting.
  • Solid understanding of engineering best practices, continuous integration, and incremental delivery.
  • Strong analytical skills, debugging and troubleshooting skills, product line analysis.
  • Follower of agile methodology (Sprint planning, working on JIRA, retrospective etc).
  • Proficiency with tools like Docker, Maven and Jenkins, and knowledge of Java frameworks like Spring, Spring Boot, Hibernate and JPA.
  • Ability to design application modules using concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
  • Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers, and NoSQL databases like MongoDB/Cassandra (a consumer sketch follows this list).
  • Knowledge of version control like Git and deployment processes like CI/CD.
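
A minimal, hedged sketch of the Kafka consumer side mentioned above, using the plain Apache Kafka Java client; the broker address, group id and topic name are assumptions made only for the example.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SimpleConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker
            props.put("group.id", "notification-workers");    // illustrative group id
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events")); // illustrative topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }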

What’s in it for you?

 

  • Immense growth, continuous learning and deliver the best to the top-notch brands
  • Work with some of the most innovative brains
  • Opportunity to explore your entrepreneurial mind-set
  • Open culture where your creative bug gets activated.

 

If this sounds like a company you would like to be a part of, and a role you would thrive in, please don’t hold back from applying! We need your unique perspective for our continued innovation and success!

So let’s converse! Our inquisitive nature is all keen to know more about you.

Skills

Java, MongoDB, Redis, Cassandra, Kafka, RabbitMQ


 

Read more
Neo Aid
Nandini Sharma
Posted by Nandini Sharma
Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Microservices
+9 more

B.Tech./ BE - Computer, IT, Electronics only

Requirements:

  • 3+ years of experience in development with Java technology.
  • Strong Java Basics
  • SpringBoot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving

Skills:

  • 3+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
  • Java
  • Agile and Kafka
  • Microservices
  • Spring Boot
  • NoSQL/MongoDB
  • Scrum
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Rajeshwari TS
Posted by Rajeshwari TS
Bengaluru (Bangalore)
6 - 11 yrs
₹1L - ₹14L / yr
skill iconJava
Apache Kafka
Multithreading
Spring
NOSQL Databases

Skill sets :

 

Core Java

Multithreading

Spring

Collections

Kafka

NoSQL experience / Mongo / Cassandra (nice to have)

 

Location: preferably Bangalore / open to Mumbai on review 

 

Experience: 7-10 yrs

Open positions: 2

Notice period: Immediate - 30 days 

Read more
Cornertree

at Cornertree

1 recruiter
Swapnil Biswas
Posted by Swapnil Biswas
Bengaluru (Bangalore)
3 - 9 yrs
₹4L - ₹8L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Data Structures
+25 more
We are looking for a strong Java Developer to join our team! As a Java Developer, you will need a strong understanding of Java and frameworks like Spring, and experience working with cloud and containers.

The Developer will perform duties and tasks to support complete life cycle management (for example: analysis, technical requirements, design, coding, testing and implementation of systems).

The Developer will work closely with the Product and Technical teams across different regions, primarily Europe, and will be part of an Agile team. The role includes research and continuous development of new products based on new technologies. This position collaborates with the operations team routinely, hence excellent English communication skills (both written and verbal) are essential.

• A clean coder who will always leave the code in better shape than they found it.
• A curious person who never stops learning and loves to try new things, even when they don't succeed on the first try.
• A team-oriented developer with the motivation to bring out the best in others.
• A person who shares our appreciation for transparency and is willing to share their experience and knowledge for the benefit of the team.
• Someone who is willing to take a stand for something they believe in.
• Somebody who takes pride in their work and knows that development is a craft.

Duties & Responsibilities

• Conducts systems and requirements analysis; creates and contributes to task lists, cost and time analysis.
• Performs assigned functions and tasks to meet project plan and quality review requirements.
• Raises issues as appropriate to support effective resolutions.
• Analyzes specifications and user requirements to perform assigned applications development work.
• Assists with system and component designs to meet requirements.
• Participates in and documents design and code reviews to improve quality.
• Analyzes, designs, codes, tests, and documents to develop application software.
• Develops unit tests and unit test plans to deliver quality code (see the test sketch at the end of this posting).
• Performs applications maintenance and support functions to support problem resolution.

Qualifications:
• Bachelor's degree in Computer Science or an IT-related field
• 4-7 years of experience working across different product domains in a product development/engineering role
• Good communication skills, necessary to manage business requests and work with different teams across different geographies and time zones; experience working with remote and distributed teams will be an added advantage
• Hands-on working knowledge and experience is required in:
a. Java (Spring, Spring Boot, etc.)
b. Experience working in GCP, AWS or Azure
c. Experience working in containers & Unix platforms
d. Relational databases (PostgreSQL, MySQL, SQL, etc.)
e. Messaging (RabbitMQ, ActiveMQ, Kafka, etc.)
f. Agile methodologies (Scrum, TDD, BDD, etc.)
g. Understanding of microservices architecture, domain-driven design, test-driven development and secure design patterns and architecture is a must
h. Data structures and algorithms using Java or other programming languages
i. Strong organizational skills
j. Agile methodologies (Scrum, TDD, BDD, etc.)
• Experience with several of the following tools/technologies is desirable:
a. Git (Bitbucket, GitLab, etc.), Jira, Gradle, Maven, Jenkins, SharePoint, Eclipse/IntelliJ
b. Multiple Java technologies around Spring, Spring Boot, etc.
c. Design patterns and implementing the design patterns
d. Development of complex application and system architectures
e. NoSQL databases (Redis, Mongo, etc.)
f. Experience working with CI/CD pipelines, for example with GitHub Actions
• Knowledge of the following technologies is a plus:
a. Other programming languages (Node.js, etc.)
b. Continuous integration and continuous delivery tools like Jenkins, Git, etc.
c. Application servers like Tomcat, etc.
d. HTML5, CSS, AJAX, React
e. Full-stack development
f. Secure development based on OWASP standards
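
As a hedged illustration of the unit-testing duty above, here is a minimal JUnit 5 + Mockito sketch that verifies a service delegates to a mocked repository. The OrderService/OrderRepository names are illustrative assumptions, not types from the actual codebase.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.jupiter.api.Test;

    class OrderServiceTest {

        // Hypothetical collaborators defined here only for the example.
        interface OrderRepository {
            String findStatus(String orderId);
        }

        static class OrderService {
            private final OrderRepository repository;
            OrderService(OrderRepository repository) { this.repository = repository; }
            String status(String orderId) { return repository.findStatus(orderId); }
        }

        @Test
        void returnsStatusFromRepository() {
            OrderRepository repository = mock(OrderRepository.class);
            when(repository.findStatus("42")).thenReturn("SHIPPED");

            OrderService service = new OrderService(repository);

            assertEquals("SHIPPED", service.status("42"));
            verify(repository).findStatus("42"); // the service should delegate exactly once
        }
    }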
Read more
Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
skill iconJenkins
skill iconGitHub
+16 more

DevOps Architect 

Experience: 10-12+ years of relevant experience in DevOps
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.

Qualification:
• Bachelors or advanced degree in Computer science, Software engineering or equivalent is required.
• Certifications in specific areas are desired

Technical Skillset: Skills Proficiency level

  • Build tools (Ant or Maven) - Expert
  • CI/CD tool (Jenkins or Github CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
  • Infrastructure As Code (Terraform, Helm charts etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (linux) - Expert
  • Cluster deployment (Kubernetes) & maintenance - Expert
  • Programming (Java) - Intermediate
  • Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)

Roles and Responsibilities

• DevOps architect should automate the process with proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operation and developers teams to solve their problems.
• Supervising, Examining and Handling technical operations.
• Providing a DevOps Process and Operations.
• Capacity to handle teams with leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, knowledge of the latest industry trends, and a highly innovative mindset
Read more
codersbrain

at codersbrain

1 recruiter
Tanuj Uppal
Posted by Tanuj Uppal
Bengaluru (Bangalore), Chennai, Delhi, Mumbai
5 - 10 yrs
₹1L - ₹10L / yr
Apache Kafka
skill iconSpring Boot
Microservices
skill iconKubernetes
Kafka
Job Position: KAFKA Developer
Relevant Experience: 5+ Years
Payroll Company:  Codersbrain Technology Pvt. Ltd.
Location: 
PAN India
Notice Period: Immediate to 15 Days.
Client: IBM
 
Description:
Total years of experience: 5+ yrs
Relevant years of experience: 5+ yrs
Mandatory skills for screening (limit to top 5 and include version): Kafka
Good to have (not mandatory):
Detailed job description - the Kafka Developer should have:
  • 4 to 5 years of development experience using Confluent Kafka
  • 4 to 5 years of experience developing microservices using Spring Boot and Kafka (a minimal listener sketch follows below)
  • Strong experience developing CI/CD for Spring Boot applications and deploying in a Kubernetes environment; experience with Kubernetes is a MUST
  • Experience using MQ and Oracle source and sink connectors
  • Experience in Kafka performance testing
  • Nice to have: experience with OpenShift
  • Nice to have: Kafka troubleshooting skills
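
As a hedged sketch of a Spring Boot + Kafka microservice consumer along the lines described above; the topic, group id and processing logic are illustrative assumptions, and spring-kafka wires the consumer from spring.kafka.* properties.

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @SpringBootApplication
    public class PaymentEventsApplication {
        public static void main(String[] args) {
            SpringApplication.run(PaymentEventsApplication.class, args);
        }
    }

    @Component
    class PaymentEventListener {

        // "payment-events" and the group id are example values.
        @KafkaListener(topics = "payment-events", groupId = "payment-service")
        public void onMessage(String payload) {
            // A real service would deserialize the payload and update its own state here.
            System.out.println("received: " + payload);
        }
    }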
Read more
Play Games24x7

at Play Games24x7

2 recruiters
Agency job
via Zyoin Web Private Limited by Vishali Vashnavi
Bengaluru (Bangalore)
8 - 12 yrs
₹40L - ₹50L / yr
skill iconJava
J2EE
skill iconPostgreSQL
MySQL
skill iconMongoDB
+19 more
Requirements:
• B.E./B.Tech. in Computer Science or MCA from a reputed university.
• 3.5+ years of experience in software development, with emphasis on Java/J2EE server-side programming.
• Hands-on experience in core Java, multithreading, RMI, socket programming, JDBC, NIO, web services and design patterns.
• Knowledge of distributed systems, distributed caching, messaging frameworks, ESB, etc.
• Experience with the Linux operating system and PostgreSQL/MySQL/MongoDB/Cassandra databases.
• Additionally, knowledge of HBase, Hadoop and Hive is desirable.
• Familiarity with message queue systems, AMQP and Kafka is desirable.
• Experience as a participant in agile methodologies.
• Excellent written and verbal communication skills and presentation skills.
• This is not a full-stack requirement; we are looking for a purely backend expert.
Read more
Tata Digital Pvt Ltd
Agency job
via Seven N Half by Priya Singh
Bengaluru (Bangalore)
8 - 13 yrs
₹10L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more

 

              Data Engineer

 

-          Highly skilled and proficient in Azure data engineering tech stacks (ADF, Databricks)

-          Should be well experienced in the design and development of big data integration platforms (Kafka, Hadoop).

-          Highly skilled and experienced in building medium to complex data integration pipelines for Data at Rest and streaming data using Spark.

-          Strong knowledge in R/Python.

-          Advanced proficiency in solution design and implementation through Azure Data Lake, SQL and NoSQL Databases.

-          Strong in Data Warehousing concepts

-          Expertise in SQL, SQL tuning, Data Management (Data Security), schema design, Python and ETL processes

-          Highly Motivated, Self-Starter and quick learner

-          Must have good knowledge of data modelling and an understanding of data analytics

-          Exposure to Statistical procedures, Experiments and Machine Learning techniques is an added advantage.

-          Experience in leading small team of 6/7 Data Engineers.

-          Excellent written and verbal communication skills

 

Read more