Apache Kafka Jobs in Pune


Apply to 46+ Apache Kafka Jobs in Pune on CutShort.io. Explore the latest Apache Kafka Job opportunities across top companies like Google, Amazon & Adobe.

JISA Softech Pvt
Posted by Aarti khatpe
Pune
3 - 5 yrs
₹14L - ₹18L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Design patterns

Job Location: Pune

Experience: 4-5 years

Functional Area: IT Software - Application Programming, Maintenance

Role Category: Programming & Design

 

Requirement / Job Description:

 

Core Skills:

Strong experience with Core Java (1.7 or higher), OOP concepts, and the Spring framework (Core, AOP, Batch, JMS)

Demonstrated design using Web Services (SOAP and REST)

Demonstrated microservices API design experience using Spring and Spring Boot

Demonstrable experience with databases such as MySQL, PostgreSQL, and Oracle PL/SQL development

Strong coding skills, good analytical and problem-solving skills

Excellent understanding of Authentication, Identity Management, REST APIs, security, and best practices

Good understanding of web servers such as Apache Tomcat, Nginx, Vert.x/Grizzly, JBoss, etc.

Experience with OAuth principles

Strong understanding of various design patterns
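Since the role calls out design patterns explicitly, a concrete illustration helps; a minimal Builder pattern in plain Java might look like the following (the class and field names are invented for the example, not taken from any real codebase):

```java
// Minimal Builder pattern: constructs an immutable config object step by step.
// All names here are illustrative only.
class ServerConfig {
    private final String host;
    private final int port;
    private final boolean tlsEnabled;

    private ServerConfig(Builder b) {
        this.host = b.host;
        this.port = b.port;
        this.tlsEnabled = b.tlsEnabled;
    }

    public String describe() {
        return (tlsEnabled ? "https://" : "http://") + host + ":" + port;
    }

    public static class Builder {
        private String host = "localhost"; // sensible defaults
        private int port = 8080;
        private boolean tlsEnabled = false;

        public Builder host(String host) { this.host = host; return this; }
        public Builder port(int port) { this.port = port; return this; }
        public Builder tls(boolean enabled) { this.tlsEnabled = enabled; return this; }

        public ServerConfig build() { return new ServerConfig(this); }
    }
}
```

The payoff of the pattern is readable call sites: `new ServerConfig.Builder().host("example.com").tls(true).build()` replaces a constructor with a long, error-prone positional argument list.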

 

Other Skills:

Familiarity with Java Cryptography Architecture (JCA)

Understanding of API gateway and service discovery components such as Zuul and Eureka Server

Familiarity with Apache Kafka, MQTT, etc.
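The Java Cryptography Architecture mentioned above is part of the JDK itself, so familiarity can be shown in a few lines; a stdlib-only sketch of hashing with `MessageDigest` (the helper class name is invented; `HexFormat` needs Java 17+):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

class DigestDemo {
    // Returns the SHA-256 digest of the input as a lowercase hex string.
    public static String sha256Hex(String input) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] hash = md.digest(input.getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(hash);
        } catch (NoSuchAlgorithmException e) {
            // Every compliant JCA implementation must provide SHA-256.
            throw new IllegalStateException("SHA-256 unavailable", e);
        }
    }
}
```

The same `getInstance`-factory style extends to `Cipher`, `Signature`, and `KeyFactory`, which is the heart of what JCA standardizes.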

 

Responsibilities:

Design, develop, test, and debug software modules for an enterprise security product

Find areas of optimization and produce high-quality code

Collaborate with product managers and other members of the project team on requirements specification and detailed engineering analysis

Collaborate with various stakeholders and help bring proactive closure to issues

Evaluate technology trends and bring in best practices

Innovate and come up with out-of-the-box solutions

Adapt, thrive, and deliver in a rapidly evolving and demanding product development team

Come up with ways to provide an improved customer experience


Technogise Private Limited
Posted by Parag Shinde
Pune
5 - 8 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Test driven development (TDD)

How do Technogisers function?

Value: Exploring technologies and implementing them on projects, provided they make business sense and deliver value.

Engagement: Be it offshore or onshore, we engage with our clients daily. This helps build a trustworthy relationship while collaborating on strategic solutions to business problems.

Solution: We make hands-on contributions to back-end and front-end design and development while nurturing our DevOps culture.

Thought Leadership: Attend or present at technical meet-ups/workshops/conferences to share knowledge and help build the Technogise brand.


How can you become a Technogiser?

 

Core Skills:

  • A thorough understanding of at least one technology stack; be the go-to person for any problem related to it
  • Experience: 5 to 8 years of Java experience
  • Knowledge of Kafka
  • Hands-on experience with Spring Boot and microservices
  • Able to write test cases
  • Influence technical decision-making and high-level design decisions: choice of frameworks and tech approach
  • Demonstrate the ability to understand different approaches for application and integration, and influence decisions by making appropriate trade-offs
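For the test-case expectation above, here is a framework-free sketch of the test-first habit (in a real project this would be a JUnit/TestNG test class; the class and numbers are invented for illustration):

```java
// A TDD-flavoured micro-example without any framework: the assertions in
// main() were written first, then the method was implemented until they
// passed (run with `java -ea` so assertions are enabled).
class PriceCalculator {
    // Applies a percentage discount; rejects out-of-range percentages.
    public static double applyDiscount(double price, double percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent must be in [0, 100]");
        }
        return price * (100 - percent) / 100.0;
    }

    public static void main(String[] args) {
        assert applyDiscount(200.0, 25) == 150.0;
        assert applyDiscount(99.0, 0) == 99.0;
        assert applyDiscount(50.0, 100) == 0.0;
        System.out.println("all checks passed");
    }
}
```

The point is the workflow, not the arithmetic: the expected behaviour is pinned down as executable checks before the implementation exists.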

 

Ways of working:

  • You communicate effectively with other roles in the project at the team and client levels.
  • You drive discussions effectively at the team and client levels. Encourage others to participate.

 

Going beyond

  • Establish credibility within the team through technical and leadership skills
  • Mentor fellow team members within the project team and provide technical guidance to others beyond project boundaries
  • Build and own the growth framework for people in the project team
  • Actively participate in organizational activities

Tech stack: We are polyglots, so we focus on varied technologies:

Java, Node.js, MongoDB, microservices, Go, Ruby, Ruby on Rails.

DeepIntent
Posted by Indrajeet Deshmukh
Pune
1 - 3 yrs
Best in industry
MongoDB
Big Data
Apache Kafka
Spring MVC
Spark

With a core belief that advertising technology can measurably improve the lives of patients, DeepIntent is leading the healthcare advertising industry into the future. Built purposefully for the healthcare industry, the DeepIntent Healthcare Advertising Platform is proven to drive higher audience quality and script performance with patented technology and the industry’s most comprehensive health data. DeepIntent is trusted by 600+ pharmaceutical brands and all the leading healthcare agencies to reach the most relevant healthcare provider and patient audiences across all channels and devices. For more information, visit DeepIntent.com or find us on LinkedIn.


What You’ll Do:

  • Ensure timely and top-quality product delivery
  • Ensure that the end product is fully and correctly defined and documented
  • Ensure implementation/continuous improvement of formal processes to support product development activities
  • Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
  • Conduct feasibility analysis, produce functional and design specifications of proposed new features.
  • Provide helpful and productive code reviews for peers and junior members of the team.
  • Troubleshoot complex issues discovered in-house as well as in customer environments.


Who You Are:

  • Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
  • Expertise in Java, Object Oriented Programming, Design Patterns
  • Experience in coding and implementing scalable solutions in a large-scale distributed environment
  • Working experience in a Linux/UNIX environment is good to have
  • Experience with relational databases and database concepts, preferably MySQL
  • Experience with SQL and Java optimization for real-time systems
  • Familiarity with version control systems Git and build tools like Maven
  • Excellent interpersonal, written, and verbal communication skills
  • BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent


The set of skills we are looking for:

  • MongoDB
  • Big Data
  • Apache Kafka 
  • Spring MVC 
  • Spark 
  • Java 


DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together.

DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance.

DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.

DeepIntent
Posted by Indrajeet Deshmukh
Pune
3 - 8 yrs
Best in industry
MongoDB
Big Data
Apache Kafka
Spring MVC
Spark



Thoughtworks
Posted by Diksha Kalucha
Pune, Hyderabad, Gurugram
2.5 - 5 yrs
Best in industry
Spark
PySpark
Data engineering
Big Data
Hadoop

DATA ENGINEER – CONSULTANT


Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.


Job Responsibilities

• You will partner with teammates to create complex data processing pipelines to solve our clients' most complex challenges

• You will collaborate with Data Scientists to design scalable implementations of their models

• You will pair to write clean and iterative code based on TDD

• Leverage various continuous delivery practices to deploy, support and operate data pipelines

• Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

• Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

• Create data models and speak to the tradeoffs of different modelling approaches

• Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

• Assure effective collaboration between Thoughtworks and the client's teams, encouraging open communication and advocating for shared outcomes


Job Qualifications


Technical skills

• You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop

• You have built large-scale data pipelines and data-centric applications using distributed storage platforms such as HDFS, S3, or NoSQL databases (HBase, Cassandra, etc.) and distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting

• Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)

• You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

• Working with data excites you: you can build and operate data pipelines and maintain data storage, all within distributed systems

• You're genuinely excited about data infrastructure and operations, with familiarity working in cloud environments


Professional skills

• You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

• An interest in coaching, sharing your experience and knowledge with teammates

• You enjoy influencing others and always advocate for technical excellence while being open to change when needed

• Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more


Other things to know


Learning & Development


There is no one-size-fits-all career path at Thoughtworks: however you want to develop your career is entirely up to you. But we also balance autonomy with the strength of our cultivation culture. This means your career is supported by interactive tools, numerous development programs and teammates who want to help you grow. We see value in helping each other be our best, and that extends to empowering our employees in their career journeys.


About Thoughtworks

Thoughtworks is a global technology consultancy that integrates strategy, design and engineering to drive digital innovation. For over 30 years, our clients have trusted our autonomous teams to build solutions that look past the obvious. Here, computer science grads come together with seasoned technologists, self-taught developers, midlife career changers and more to learn from and challenge each other. Career journeys flourish with the strength of our cultivation culture, which has won numerous awards around the world.

Wissen Technology
Posted by Vijayalakshmi Selvaraj
Pune, Mumbai, Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Messaging

Wissen Technology is hiring!


Java Developer with messaging experience (JMS/EMS/Kafka/RabbitMQ) and CI/CD.


Experience: 3 to 6 years

Location: Pune | Mumbai | Bangalore

Notice period: currently serving, or 15 days or less only.

Requirement:

Core Java 8

Mandatory experience in any of the messaging technologies like JMS/EMS/Kafka/RabbitMQ.

Extensive experience in developing enterprise-scale n-tier applications for the financial domain.

Should possess good architectural knowledge and be aware of enterprise application design patterns.

Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.

Mandatory development experience on a CI/CD platform.

Good knowledge of multi-threading and high-volume server-side development.

Experience in sales and trading platforms in investment banking/capital markets.

Basic working knowledge of Unix/Linux.

Strong written and oral communication skills. Should have the ability to express design ideas and thoughts.
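Messaging systems such as JMS, Kafka, and RabbitMQ all rest on the same produce/consume decoupling that the JDK's `BlockingQueue` provides in-process; here is a broker-free sketch of that pattern (the message names, queue size, and poison-pill convention are arbitrary choices for the example):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

class QueueDemo {
    private static final String POISON_PILL = "__STOP__"; // signals end of stream

    // Drains messages from the queue until the poison pill arrives.
    public static List<String> consumeAll(BlockingQueue<String> queue) throws InterruptedException {
        List<String> received = new ArrayList<>();
        while (true) {
            String msg = queue.take();           // blocks until a message is available
            if (POISON_PILL.equals(msg)) break;
            received.add(msg);
        }
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);
        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) queue.put("order-" + i); // blocks if full
                queue.put(POISON_PILL);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        List<String> got = consumeAll(queue);
        producer.join();
        System.out.println(got); // [order-1, order-2, order-3]
    }
}
```

A real broker adds durability, fan-out, and cross-process delivery on top of this, but the bounded-buffer backpressure (`put` blocks when full, `take` blocks when empty) is the same idea.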

Egen Solutions
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high value data driven applications and product packages.
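As a dependency-free illustration of the RESTful shape described above (the team's actual services use Spring Boot; this sketch instead uses the JDK's built-in `com.sun.net.httpserver`, and the endpoint path and payload are invented):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A minimal JSON "health" endpoint on the JDK's built-in HTTP server.
class HealthServer {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start(); // passing port 0 asks the OS for a free ephemeral port
        return server;
    }
}
```

Calling `start(0)` and issuing a GET to `/health` on the bound port returns the JSON body with a 200 status; a framework like Spring Boot layers routing, serialization, and validation over this same request/response cycle.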


Required Experience:

  • Minimum of a Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
  • Strong experience with unit and integration testing of Spring Boot APIs
  • Strong understanding and production experience of RESTful APIs and microservice architecture
  • Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them

Nice to have (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


Thoughtworks
Posted by Ramya S
Pune, Hyderabad, Chennai, Gurugram
3 - 5 yrs
Best in industry
Spark
PySpark
Data engineering
Big Data
Hadoop

DATA ENGINEER – CONSULTANT



Pune
6 - 10 yrs
₹5L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Amazon Web Services (AWS)

Description:


  • Understanding of cloud infrastructure, including containerization
  • Strong experience in microservices
  • Exposure to architectural patterns of large, high-scale applications
  • Comfortable with cloud-native and cloud-agnostic architecture patterns
  • Hands-on experience designing, building, and deploying secure, scalable services in Cloud and SaaS environments
  • Familiarity with Disaster Recovery setups, Multi-AZ, and active-active and active-passive implementations
  • Knowledge of load balancers, Route 53, subnets, NACLs, and other AWS components
  • Do a POC implementation to prove the architecture in case any new tool or technology is used, and leverage existing tools without reinventing the wheel
  • Must have an engineering mindset and readiness to code
  • Able to architect the solution keeping consistency, reliability, and maintenance in mind
  • Fluent, open communication style: speaking, writing, collaborating


Zycus
Posted by Nafis Kurne
Pune, Mumbai, Bangalore
14 - 26 yrs
₹25L - ₹55L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript

EXPERTISE AND QUALIFICATIONS

  • 14+ years of experience in Software Engineering, with at least 6+ years as a Lead Enterprise Architect, preferably in a software product company
  • High technical credibility: the ability to lead technical brainstorming, take decisions, and push for the best solution to a problem
  • Experience architecting microservices-based end-to-end enterprise applications
  • Experience with UI technologies such as Angular or Node.js, or full-stack experience, is desirable
  • Experience with NoSQL technologies (MongoDB, Neo4j, etc.)
  • Elasticsearch, Kibana, Logstash (the ELK stack)
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr, etc.
  • Exposure to SaaS cloud-based platforms
  • Experience with Docker, Kubernetes, etc.
  • Experience in planning, designing, developing, and delivering enterprise software using Agile methodology
  • Key programming skills: Java, J2EE with cutting-edge technologies
  • Hands-on technical leadership, with a proven ability to recruit and mentor high-performing talent, including architects, technical leads, and developers
  • Excellent team-building, mentoring, and coaching skills are a must-have
  • A proven track record of consistently setting and achieving high standards

Five Reasons Why You Should Join Zycus

1. Cloud Product Company: We are a Cloud SaaS company, and our products are created using the latest technologies like ML and AI. Our UI is in AngularJS, and we are developing our mobile apps using React.

2. A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.

3. Move between Roles: We believe that change leads to growth, and therefore we allow our employees to shift careers and move to different roles and functions within the organization.

4. Get a Global Exposure: You get to work and deal with our global customers.

5. Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.


About Us

Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.


Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization.


Start your #CognitiveProcurement journey with us, as you are #MeantforMore

Concinnity Media Technologies
Posted by Anirban Biswas
Pune
7 - 12 yrs
₹12L - ₹21L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Amazon Web Services (AWS)

Job Code: CSAW0004


Candidate Experience:

7+ years of relevant experience


Skills and Qualifications:

● Experience in the design and development of Java in IoT-based projects

● Exposure to AWS (or any other cloud platform)

● Assisting the software design team with application development and integration

● Ability to solve complex software system issues

● Experience in handling database queries and databases

● Programming languages and frameworks: Java, JavaBeans, Spring Boot, Spring MVC

● Experience with message queues or streaming frameworks such as Apache Kafka or ActiveMQ

● Experience with Docker and Kubernetes

● Passionate and enthusiastic about work

● Technical team leadership experience

● Proactive attitude

● Able to act as a bridge between the team and counterparts with regard to the technical aspects of the project

● Good communication skills

● Exposure to microservices
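For the Kafka item above, a producer is typically driven by a small set of well-known configuration keys; an illustrative fragment follows (the broker addresses and tuning values are placeholders, and the right settings depend entirely on the deployment):

```properties
# Illustrative Kafka producer settings (values are placeholders).
bootstrap.servers=broker1:9092,broker2:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Durability: wait for all in-sync replicas to acknowledge each write.
acks=all
# Retry transient failures without reordering or duplicating messages.
enable.idempotence=true
# Batch up to 16 KB or 10 ms, whichever comes first, trading latency for throughput.
batch.size=16384
linger.ms=10
```

The `acks`/`enable.idempotence` pair governs delivery guarantees, while `batch.size`/`linger.ms` govern the latency-versus-throughput trade-off, which is usually the first thing tuned in production.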


Education:

Bachelor of Engineering/Technology - BE/BTech

Samsan Technologies
Posted by HR Varsha
Pune
3 - 7 yrs
₹1L - ₹10L / yr
NodeJS (Node.js)
React.js
Angular (2+)
AngularJS (1.x)
MongoDB

Job Responsibilities

Responsibilities for this position include, but are not limited to, the following:

  • Development experience: 3-6 years

  • Experience working with Azure cloud-hosted web applications and technologies

  • Design and develop back-end microservices and REST APIs for connected devices, web applications, and mobile applications

  • Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities that ensure we are using the best techniques and tools

  • Meeting with the software development team to define the scope and scale of software projects.
  • Designing software system architecture.
  • Completing data structures and design patterns.
  • Designing and implementing scalable web services, applications, and APIs.
  • Developing and maintaining internal software tools.
  • Writing low-level and high-level code.
  • Troubleshooting and bug fixing.
  • Identifying bottlenecks and improving software efficiency.
  • Collaborating with the design team on developing micro-services.
  • Writing technical documents.
  • Be an active professional in continuous learning resulting in enhancement in organizational objectives.
  • Provide technical support to all internal teams and customers as it relates to the product.

Requirements:

  • Bachelor’s degree in computer engineering or computer science
  • Previous experience as a full-stack engineer and with IoT products
  • Advanced knowledge of front-end languages, including HTML5, CSS, JavaScript, Angular, and React
  • Proficiency in back-end languages, including Node.js, and basic knowledge of Java and C#
  • Experience with cloud computing APIs and cloud providers such as Azure or AWS
  • Working knowledge of database systems (Cassandra, Cosmos DB, Redis, PostgreSQL)
  • Messaging systems (RabbitMQ, MQTT, Kafka)
  • Cloud-based distributed application scaling and data processing in the cloud
  • Agile/Scrum methodology
  • Advanced troubleshooting skills
  • Familiarity with JavaScript frameworks
  • Good communication skills
  • High-level project management skills

Pune
7 - 11 yrs
₹25L - ₹33L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices

Hi,

We are hiring for the position of Java Tech Lead. Please find the details below.


A passionate developer with a strong working knowledge of OOP and functional programming principles. Standard definitions and abbreviations don't entice us much.

Key skills:

• Strong Java and J2EE background with 5-7 years of experience.

• Strong working experience in Multi-Threading, Exception Management and the Use of Collections.

• Sound knowledge of working with application aspects i.e., Caching, Asynchronous APIs, Logging etc.

• Experience with web application frameworks like Spring Boot or Dropwizard.

• Unit testing is an everyday affair, and hence demands very good unit-testing skills using tools like JUnit & TestNG.

• Understanding of relational databases, RESTful services, and build tools like Maven & Gradle

• Knows what and when to mock, and has used frameworks like Mockito/PowerMock.

• Understanding of message queues such as ActiveMQ, Kafka, and RabbitMQ.

•  Version control is treated as being as important as programming skills. Fluent with version-control tools like Git and Bitbucket.

• Exposure to Agile/Scrum and TDD, in practice and not just in theory.

•  Experience with Continuous Integration, Continuous Deployment, Static Code Analysis, Jenkins and SonarQube.

•  Willingness to take ownership of the technical solution and ensure technical expectations of deliverables are met.

• Strong communication skills along with the ability to articulate technical designs and concepts.

• Exposure to cloud and containerization would be a plus.

• Hands-on experience in application development in an enterprise setup.

• Have a good understanding of Distributed Application Architecture.
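The multi-threading and collections skills above typically meet in the classic producer-consumer pattern. A minimal sketch using a bounded BlockingQueue from java.util.concurrent:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Classic producer-consumer handoff using a bounded BlockingQueue:
// put() blocks when the queue is full, take() blocks when it is empty,
// so no explicit wait/notify is needed.
public class ProducerConsumer {

    public static List<Integer> run(int count) {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(4);
        List<Integer> consumed = new ArrayList<>();

        Thread producer = new Thread(() -> {
            for (int i = 0; i < count; i++) {
                try {
                    queue.put(i); // blocks if the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        Thread consumer = new Thread(() -> {
            for (int i = 0; i < count; i++) {
                try {
                    consumed.add(queue.take()); // blocks if the queue is empty
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });

        producer.start();
        consumer.start();
        try {
            producer.join();
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("interrupted", e);
        }
        return consumed;
    }

    public static void main(String[] args) {
        System.out.println(run(8)); // [0, 1, 2, 3, 4, 5, 6, 7]
    }
}
```

With a single producer and single consumer over a FIFO queue, consumption order matches production order; the bounded capacity provides natural back-pressure.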

Read more
Zycus

at Zycus

10 recruiters
Viren Bhuptani
Posted by Viren Bhuptani
Mumbai, Pune, Bengaluru (Bangalore)
15 - 25 yrs
Best in industry
Microservices
J2EE
Spring Boot
Java
Hibernate (Java)
+3 more

We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions.

  • As an owner of the product you will be required to plan and execute the product road map and provide technical leadership to the engineering team.
  • You will have to collaborate with Product Management and Implementation teams and build a commercially successful product.
  • You will be responsible to recruit & lead a team of highly skilled software engineers and provide strong hands on engineering leadership.
  • Deep technical knowledge of Software Product Engineering is a must, including: Amazon Web Services, Java 8, Java/J2EE, Node.js, React.js, full-stack development, NoSQL databases (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Docker, Kubernetes, architecture concepts, design patterns, data structures & algorithms, distributed computing, multi-threading, Apache, Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc.



  • 16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company.
  • Hands-on technical leadership with proven ability to recruit high performance talent
  • High technical credibility - ability to audit technical decisions and push for the best solution to a problem.
  • Experience building end-to-end applications, from the back-end database through to the presentation layer.
  • Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred.
  • Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, Dynamodb, etc.)
  • Elastic Search, Kibana, ELK, Logstash.
  • Experience in developing Enterprise Software using Agile Methodology.
  • Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
  • SaaS cloud-based platform exposure.
  • Experience on Docker, Kubernetes etc.
  • Ownership of end-to-end design and development, with exposure to delivering quality enterprise products/applications.
  • A track record of setting and achieving high standards
  • Strong understanding of modern technology architecture
  • Key Programming Skills: Java, J2EE with cutting edge technologies
  • Excellent team building, mentoring and coaching skills are a must-have


Read more
Recro

at Recro

1 video
32 recruiters
Amrita Singh
Posted by Amrita Singh
Bengaluru (Bangalore), Pune, Noida
3 - 6 yrs
₹6L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+4 more

Requirements

  • 3+ years of experience in Java development.
  • Strong Java Basics
  • Linux
  • SpringBoot or Spring MVC
  • Hands-on experience in Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Java 8
  • Any Caching Mechanism
  • Good at problem-solving
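For the "Any Caching Mechanism" item above, one minimal illustration is an LRU cache built on the JDK's LinkedHashMap in access-order mode; this is a sketch, not any specific product's cache:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A small LRU cache built on LinkedHashMap's access-order mode:
// the least recently used entry is evicted once capacity is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true); // true = access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        LruCache<String, Integer> cache = new LruCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a" so "b" becomes least recently used
        cache.put("c", 3); // evicts "b"
        System.out.println(cache.keySet()); // [a, c]
    }
}
```

Production caches (Redis, Caffeine, etc.) add TTLs, size-aware eviction, and concurrency; the eviction idea is the same.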


Good to Have Skills:

  • 3 years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.


Read more
Concentric AI

at Concentric AI

7 candid answers
1 product
Gopal Agarwal
Posted by Gopal Agarwal
Pune
3 - 12 yrs
₹10L - ₹60L / yr
Java
Spring Boot
Apache Kafka
NOSQL Databases
RESTful APIs
+1 more

Requirements:

  • Energetic self-starter, with a desire to work in a startup environment.
  • Proficient in advanced Java programming.
  • Expert in end-to-end application development, cloud or on-premise, across the middle and DB layers.
  • Understanding of MQ and DB is nice to have.
  • Good hands-on experience with Complex Event Processing systems.
  • Has solved scale and performance issues with huge data sets, using pre-compute or data-aggregation frameworks to achieve good response times.
  • Real-world experience working with large datasets and NoSQL database technologies.
  • Experience debugging applications running on Unix-like systems (e.g. Ubuntu, CentOS).
  • Experience developing RESTful APIs for complex data sets.
  • Knowledge of container-based development & deployment (e.g. Docker, rkt).
  • Expertise in the software security domain is a plus.
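The pre-compute/data-aggregation idea above can be sketched minimally: update running aggregates at ingest time so a query reads a counter instead of scanning raw events. A hedged Java illustration:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Pre-aggregation sketch: update a running count per key as each event
// is ingested, so a query never scans raw events and stays O(1).
public class EventCounter {
    private final Map<String, LongAdder> countsByKey = new ConcurrentHashMap<>();

    public void ingest(String key) {
        countsByKey.computeIfAbsent(key, k -> new LongAdder()).increment();
    }

    public long count(String key) {
        LongAdder adder = countsByKey.get(key);
        return adder == null ? 0 : adder.sum();
    }

    public static void main(String[] args) {
        EventCounter counter = new EventCounter();
        counter.ingest("login");
        counter.ingest("login");
        counter.ingest("logout");
        System.out.println(counter.count("login"));  // 2
        System.out.println(counter.count("logout")); // 1
    }
}
```

Real aggregation frameworks apply the same principle with windowing and persistence; ConcurrentHashMap plus LongAdder keeps the ingest path contention-friendly under many writer threads.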


Read more
Telstra

at Telstra

1 video
1 recruiter
Mahesh Balappa
Posted by Mahesh Balappa
Bengaluru (Bangalore), Hyderabad, Pune
3 - 7 yrs
Best in industry
Spark
Hadoop
NOSQL Databases
Apache Kafka

About Telstra

 

Telstra is Australia’s leading telecommunications and technology company, with operations in more than 20 countries, including In India where we’re building a new Innovation and Capability Centre (ICC) in Bangalore.

 

We’re growing, fast, and for you that means many exciting opportunities to develop your career at Telstra. Join us on this exciting journey, and together, we’ll reimagine the future.

 

Why Telstra?

 

  • We're an iconic Australian company with a rich heritage that's been built over 100 years. Telstra is Australia's leading Telecommunications and Technology Company. We've been operating internationally for more than 70 years.
  • International presence spanning over 20 countries.
  • We are one of the 20 largest telecommunications providers globally
  • At Telstra, the work is complex and stimulating, but with that comes a great sense of achievement. We are shaping tomorrow's modes of communication with our innovation-driven teams.

 

Telstra offers an opportunity to make a difference to lives of millions of people by providing the choice of flexibility in work and a rewarding career that you will be proud of!

 

About the team

Being part of Networks & IT means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy.

With us, you'll be working with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.

 

Focus of the role

A new engineer joining the data chapter will mostly be developing reusable data-processing and storage frameworks that can be used across the data platform.

 

About you

To be successful in the role, you'll bring skills and experience in:

 

Essential 

  • Hands-on experience in Spark Core, Spark SQL, SQL/Hive/Impala, Git/SVN/any other VCS, and data warehousing
  • Skilled in the Hadoop ecosystem (HDP/Cloudera/MapR/EMR, etc.)
  • Azure Data Factory/Airflow/Control-M/Luigi
  • PL/SQL
  • Exposure to NoSQL (HBase/Cassandra/GraphDB (Neo4j)/MongoDB)
  • File formats (Parquet/ORC/Avro/Delta/Hudi, etc.)
  • Kafka/Kinesis/Event Hubs
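Much of the essential skill set above centres on Spark SQL-style grouping and aggregation. As a rough illustration of that shape (SELECT word, COUNT(*) ... GROUP BY word), here is the same idea in plain Java streams; a single-machine stand-in, not Spark's API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Illustration of the group-by/aggregate shape that Spark SQL or Hive
// queries express, written in plain Java streams. This is a
// single-machine stand-in, not a Spark API.
public class WordCount {
    public static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("to be or not", "to be"));
        System.out.println(counts.get("to")); // 2
        System.out.println(counts.get("or")); // 1
    }
}
```

Spark distributes exactly this map/group/aggregate pipeline across a cluster, with shuffles standing in for the in-memory grouping here.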

 

Highly Desirable

Experience and knowledgeable on the following:

  • Spark Streaming
  • Cloud exposure (Azure/AWS/GCP)
  • Azure data offerings - ADF, ADLS2, Azure Databricks, Azure Synapse, Eventhubs, CosmosDB etc.
  • Presto/Athena
  • Azure DevOps
  • Jenkins/ Bamboo/Any similar build tools
  • Power BI
  • Prior experience building, or working in a team that built, reusable frameworks.
  • Data modelling.
  • Data Architecture and design principles. (Delta/Kappa/Lambda architecture)
  • Exposure to CI/CD
  • Code Quality - Static and Dynamic code scans
  • Agile SDLC      

 

If you've got a passion to innovate, want to succeed as part of a great team, and are looking for the next step in your career, we'd welcome you to apply!

___________________________

 

We’re committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural and linguistic backgrounds and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.

To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit tel.st/thrive.

Read more
EnterpriseMinds

at EnterpriseMinds

2 recruiters
phani kalyan
Posted by phani kalyan
Pune
5 - 8 yrs
₹9L - ₹17L / yr
Splunk
Python
Visual Studio
Bitbucket
Apache Kafka
+2 more

Enterprise Minds, with a core focus on engineering products, automation and intelligence, partners with customers on the trajectory towards increasing outcomes, relevance, and growth.

Harnessing the power of Data and the forces that define AI, Machine Learning and Data Science, we believe in institutionalizing go-to-market models and not just explore possibilities.

We believe in a customer-centric ethic without and people-centric paradigm within. With a strong sense of community, ownership, and collaboration our people work in a spirit of co-creation, co-innovation and co-development to engineer next-generation software products with the help of accelerators.

Through Communities we connect and attract talent that shares skills and expertise. Through Innovation Labs and global design studios we deliver creative solutions.
We create vertically isolated pods which have a narrow but deep focus. We also create horizontal pods to collaborate and deliver sustainable outcomes.

We follow Agile methodologies to fail fast and deliver scalable and modular solutions. We constantly self-assess and realign to work with each customer in the most impactful manner.

Pre-requisites for the Role

1. Job ID: EMSP0120PS
2. Primary skill: Splunk Development and Administration
3. Secondary skills: Python, Splunk DB Connect, Visual Studio (C#), Bitbucket, Kafka, DevOps tools
4. Years of experience: 5-8 years
5. Location: Pune (hybrid model)
6. Positions: 1
7. Budget: max up to 17 LPA for 5-6 years of experience; max up to 22 LPA for 6-8 years
8. Notice period: Immediate

 

Primary Role & Responsibility:

As a software engineer, your daily work involves technically challenging applications and projects where your code makes a direct contribution to the further development and upkeep of our software suite and to its application in projects.

You should be able to create Splunk dashboards and apps, and should have a good understanding of Splunk source interfaces.

You should know how to onboard data from different source formats such as JSON, XML, syslog, and error-log files.
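Onboarding syslog-style data usually begins with extracting fields from each raw line. The sketch below is purely illustrative; the line format and field names are assumptions for the example, not Splunk's ingestion API:

```java
import java.util.Map;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of field extraction from a syslog-style line before onboarding
// it into an indexer. The line format and field names here are
// illustrative assumptions, not Splunk's actual ingestion API.
public class SyslogParser {
    // e.g. "Jan 12 06:25:43 web01 sshd[4321]: Failed password for root"
    private static final Pattern SYSLOG = Pattern.compile(
            "^(\\w{3} +\\d+ \\d{2}:\\d{2}:\\d{2}) (\\S+) ([^\\[]+)\\[(\\d+)\\]: (.*)$");

    public static Optional<Map<String, String>> parse(String line) {
        Matcher m = SYSLOG.matcher(line);
        if (!m.matches()) {
            return Optional.empty(); // let unparsed lines fall through for review
        }
        return Optional.of(Map.of(
                "timestamp", m.group(1),
                "host", m.group(2),
                "process", m.group(3),
                "pid", m.group(4),
                "message", m.group(5)));
    }

    public static void main(String[] args) {
        String line = "Jan 12 06:25:43 web01 sshd[4321]: Failed password for root";
        parse(line).ifPresent(fields -> System.out.println(fields.get("host"))); // web01
    }
}
```

In practice Splunk's own search-time field extraction can do much of this, but the same structure (pattern, named fields, a path for unparsed lines) applies to custom onboarding tooling.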

As a software engineer, we expect much more from you than just the ability to design and develop good software. We find it important that you possess an inherent drive to get the best out of yourself every day, that you are inquisitive and that you are not intimidated by situations which require you to branch off from the beaten track. You work together with colleagues in a SCRUM team. In addition, you have regular contact with other software teams, software architects, testers and end users. Good communication skills are therefore extremely important, as well as the ability to think pro-actively and suggest possible improvements. This gives you every opportunity to contribute your personal input and grow and develop within the department.

The often complex functionality of the software includes business logic, controls for logistical transport, communication with external computer systems, reporting, data analysis and simulation. This functionality is spread across various components. You design, program and test the software based on a design concept and a set of requirements. In some cases, you will have to personally formulate these requirements together with the (end) users and/or internal stakeholders.

  

Desired Profile & Experience: Knowledge of Kafka and experience with Java

  • Splunk Architecture, on-premise and cloud based deployment.
  • IoT edge.
  • Analytical skills and capabilities to understand how raw (unstructured) data needs to be transformed into processed information.
Read more
Pune
0 - 1 yrs
₹10L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
SQL
+6 more
1. Work closely with senior engineers to design, implement and deploy applications that impact the business with an emphasis on mobile, payments, and product website development
2. Design software and make technology choices across the stack (from data storage to application to front-end)
3. Understand a range of tier-1 systems/services that power our product to make scalable changes to critical path code
4. Own the design and delivery of an integral piece of a tier-1 system or application
5. Work closely with product managers, UX designers, and end users and integrate software components into a fully functional system
6. Work on the management and execution of project plans and delivery commitments
7. Take ownership of the product/feature end-to-end, for all phases from development to production
8. Ensure the developed features are scalable and highly available with no quality concerns
9. Work closely with senior engineers for refining and implementation
10. Manage and execute project plans and delivery commitments
11. Create and execute appropriate quality plans, project plans, test strategies, and processes for development activities in concert with business and project management efforts
Read more
Pune
5 - 9 yrs
₹5L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more
This role is for a developer with strong core application or system programming skills in Scala and Java, and
good exposure to concepts and/or technology across the broader spectrum. Enterprise Risk Technology
covers a variety of existing systems and green-field projects.
Full-stack Hadoop development experience with Scala, and full-stack Java development experience covering
Core Java (including JDK 1.8) with a good understanding of design patterns, are required.
Requirements:-
• Strong hands-on development in Java technologies.
• Strong hands-on development in Hadoop technologies like Spark, Scala and experience on Avro.
• Participation in product feature design and documentation
• Requirement break-up, ownership and implementation.
• Product BAU deliveries and Level 3 production defects fixes.
Qualifications & Experience
• Degree holder in numerate subject
• Hands on Experience on Hadoop, Spark, Scala, Impala, Avro and messaging like Kafka
• Experience across a core compiled language – Java
• Proficiency in Java-related frameworks like Spring, Hibernate, JPA
• Hands-on experience in JDK 1.8 and a strong skillset covering Collections and Multithreading, with
experience working on distributed applications.
• Strong hands-on development track record with end-to-end development cycle involvement
• Good exposure to computational concepts
• Good communication and interpersonal skills
• Working knowledge of risk and derivatives pricing (optional)
• Proficiency in SQL (PL/SQL), data modelling.
• Understanding of Hadoop architecture and the Scala programming language is good to have.
Read more
Accion Labs

at Accion Labs

14 recruiters
Jayasri Palanivelu
Posted by Jayasri Palanivelu
Bengaluru (Bangalore), Hyderabad, Pune, Mumbai
6 - 10 yrs
₹15L - ₹30L / yr
Java
Spring Boot
Hibernate (Java)
Microservices
NOSQL Databases
+3 more

Desired Candidate Profile


  • A team focus with strong collaboration and communication skills
  • Exceptional ability to quickly grasp high-level business goals, derive requirements, and translate them into effective technical solutions
  • Exceptional object-oriented thinking, design and programming skills (Java 8 or 11)
  • Expertise with the following technologies: Data Structures, Design Patterns, code-versioning tools (GitHub/Bitbucket), XML, JSON, Spring Batch, RESTful services, Spring Cloud, Grafana (knowledge/experience), Kafka, Spring Boot, Microservices, DB/NoSQL, Docker, Kubernetes, AWS/GCP, architecture design (patterns), Agile, JIRA.
  • Penchant toward self-motivation and continuous improvement; these words should describe you: dedicated, energetic, curious, conscientious, and flexible.
Read more
Paytm

at Paytm

41 recruiters
Anuj Kanojia
Posted by Anuj Kanojia
Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Mumbai, Bengaluru (Bangalore), Pune
9 - 15 yrs
Best in industry
J2EE
Spring Boot
Java
Microservices
Apache Kafka
About Us: 
Paytm is India’s leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks’ financial instruments. To further enhance merchants’ business, Paytm offers merchants commerce services through advertising and Paytm Mini app store. Operating on this platform leverage, the company then offers credit services such as merchant loans, personal loans and BNPL, sourced by its financial partners.  
 
About the role:
As a Principal Engineer, you will help define the technical design and implementation roadmap across multiple solutions and will work with engineering leadership to ensure we resource and equip our squads with the right expertise to deliver those solutions.  
 
Requirements: 
10 to 14 years of strong software design/development experience in building massively large-scale distributed internet systems and products
Hands-on experience in Advanced Java, Spring Boot, AWS, Node
Experience and knowledge of open source tools & frameworks, broader cutting edge technologies around server-side development
Should be an active contributor to developer communities like Stack Overflow, Top coder, Git Hub, and Google Developer Groups (GDGs). 
Superior organization, communication, interpersonal and leadership skills.
Must be a self-starter who can work well with minimal guidance and in a fluid environment. 
 
Preferred Qualifications: Bachelor's/Master's Degree in Computer Science or equivalent 
 
Skills that will help you succeed in this role: 
Expertise in Java, DB: RDBMS, Messaging: Kafka/RabbitMQ, Caching: Redis/Aerospike, Microservices, AWS
Strong experience in scaling, performance tuning & optimization at both API and storage layers
Problem Solver with a passion for excellence.  
 
Why join us:
Because you get an opportunity to make a difference, and have a great time doing that
You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve
You should work with us if you think seriously about what technology can do for people
We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be. 
Learn more about the exciting work we do in Tech by reading our Engineering blogs: https://paytm.com/blog/engineering/
 
Compensation:
If you are the right fit, we believe in creating wealth for you.
With enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story! 
Read more
Rishabh Software
Pune, Ahmedabad, Vadodara
9 - 12 yrs
Best in industry
Microservices
Spring Boot
Amazon Web Services (AWS)
Windows Azure
Google Cloud Platform (GCP)
+8 more

An excellent opportunity to develop products SDE 3.

Rishabh Software, an India based IT service provider, focuses on cost-effective, qualitative, and timely delivered Offshore Software Development, Business Process Outsourcing (BPO) and Engineering Services.

Our Core competency lies in developing customized software solutions using web-based and client/server technology. With over 20 years of Software Development Experience working together with various domestic and international companies, we, at Rishabh Software, provide specific solutions as per the client requirements that help industries of different domains to change business problems into strategic advantages.

The Product Development division is relatively new and comes with a start-up culture, where a long path has been, and is being, constructed for developing reliable & scalable products.

Through our offices in the US (Silicon Valley), UK (London) and India (Vadodara & Bangalore) we service our global clients with qualitative and well-executed software development, BPO and Engineering services.


Please find the below JD.


Key Responsibilities

  • Responsible to interpret & map business, functional & non-functional requirements to technical specifications
  • Will be interacting with diverse stakeholders like Product Manager/Scrum master, Business Analysts, testing and other cross-functional teams as part of product  development
  • Develop solutions following established technical design, application development standards and quality processes in projects to deliver efficient, reusable, and reliable code
  • Write unit test cases for developed code as required followed by developing solutions for established technical design, application development standards and quality processes in projects to deliver efficient, reusable and reliable code
  • Perform code reviews and mentor fellow team members
  • Follow best practices to ensure the best possible performance, quality, and responsiveness of the applications
  • Assess the impacts on technical design because of the changes in functional requirements
  • Support the Technical Lead/Architect in developing artifacts such as high-level design, technical design, etc.
  • Proactively identify and communicate technical risks, issues, and challenges with mitigations
  • Manage and lead a team proactively providing guidance and mentoring as required


Technical Skills


Mandatory (Minimum 9 years of working experience)

 

  • Well-versed with Architecture and Design patterns.
  • Practice the industry's leading best guidelines/processes in building enterprise products
  • Strong experience in core Java, Spring, Spring boot, Spring Cloud, HTML, CSS, Bootstrap, Javascript, Jquery, JSON, JWT, Multi-Threading, Messaging Frameworks (Kafka, Rabbit MQ, etc.), Microservices, REST, SOAP, gRPC
  • Excellent knowledge of Relational Databases (MySQL, POSTGRES), NoSQL(Cassandra, MongoDB)), and ORM frameworks (JPA, Hibernate)
  • Knowledge of Docker, Kubernetes and containerization. Experience with cloud providers like AWS, and Azure.
  • Hands-on experience in designing and developing products using Java EE platforms, Microservices architecture
  • Experience with RESTful services as well as SOAP-based web services
  • Good knowledge of Java 8 and above with core areas like Streams, Lambdas, Functional Interfaces, Concurrency, Generics, threads, networking, IO, collections
  • Excellent knowledge & experience in microservices
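For the Java 8 core areas listed above (lambdas, concurrency), a small sketch of composing asynchronous work with CompletableFuture; the two "service calls" here are stand-ins that simply return values:

```java
import java.util.concurrent.CompletableFuture;

// Sketch of composing asynchronous calls with CompletableFuture. The
// two "service calls" are stand-ins that just return values; real code
// would call remote services here.
public class AsyncCompose {

    static CompletableFuture<Integer> fetchPrice() {
        return CompletableFuture.supplyAsync(() -> 100);
    }

    static CompletableFuture<Integer> fetchDiscount() {
        return CompletableFuture.supplyAsync(() -> 20);
    }

    public static int finalPrice() {
        // Run both lookups concurrently, then combine their results.
        return fetchPrice()
                .thenCombine(fetchDiscount(), (price, discount) -> price - discount)
                .join();
    }

    public static void main(String[] args) {
        System.out.println(finalPrice()); // 80
    }
}
```

thenCombine lets the two lookups overlap instead of running sequentially, which is the usual motivation for this style in service back-ends.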

Preferred 

 

  • Experience in reactive programming- Webflux, Hibernate Reactive. Knowledge of GraphQL
  • Java testing frameworks (JUnit, Mockito, TestNG etc.)
  • Knowledge of CI/CD tools (Jenkins, CruiseControl, Bamboo, etc.) and DevOps
  • Knowledge of build tools (Ant, Maven, Gradle, etc.)
  • Knowledge of BPMN, Rule-based Engine, Search Engine
  • Knowledge of JS framework like Angular

 

You would be part of

  • Exciting journey in building next generation enterprise products
  • Flat organisation structure
  • Enriches both domain and technical skills

 

Soft Skills

  • Good verbal and written communication skills
  • Ability to collaborate and work effectively in a team
  • Excellent analytical and logical skills

Education

  • Preferred: Graduate or Post Graduate with specialization related to Computer Science or IT
Read more
PL

at PL

Agency job
Navi Mumbai, Bengaluru (Bangalore), Pune
4 - 10 yrs
₹1L - ₹15L / yr
Apache Kafka
Kafka
Java
Python
  • 3-8+ years of experience programming in a backend language (Java / Python), with a good understanding of troubleshooting errors. 
  • 5+ years of experience with Kafka, or 3+ years of experience with Confluent Kafka
  • Cloud Kafka, Control Central, Rest Proxy, HA Proxy, Confluent Kafka Connect, Confluent Kafka Security features 
Read more
Netcore Cloud
Mumbai, Navi Mumbai, Bengaluru (Bangalore), Pune
5 - 9 yrs
₹10L - ₹35L / yr
Java
Spring Boot
Apache Kafka
RabbitMQ
Cassandra
+3 more

Job Title -Senior Java Developers

Job Description - Backend Engineer - Lead (Java)

Mumbai, India | Engineering Team | Full-time

 

Are you passionate enough to be a crucial part of a highly analytical and scalable user engagement platform?

Are you ready learn new technologies and willing to step out of your comfort zone to explore and learn new skills?

 

If so, this is an opportunity for you to join a high-functioning team and make your mark on our organisation!

 

The Impact you will create:

  • Build campaign generation services which can send app notifications at a speed of 10 million a minute
  • Dashboards to show Real time key performance indicators to clients
  • Develop complex user-segmentation engines which create segments on terabytes of data within a few seconds
  • Building highly available & horizontally scalable platform services for ever-growing data
  • Use cloud based services like AWS Lambda for blazing fast throughput & auto scalability
  • Work on complex analytics on terabytes of data like building Cohorts, Funnels, User path analysis, Recency Frequency & Monetary analysis at blazing speed
  • You will build backend services and APIs to create scalable engineering systems.
  • As an individual contributor, you will tackle some of our broadest technical challenges that requires deep technical knowledge, hands-on software development and seamless collaboration with all functions.
  • You will envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience.
  • Collaborating with various highly functional teams in the company to meet deliverables throughout the software development lifecycle.
  • Identify areas of improvement through data insights and research, and act on them.

 

What we look for?

  • 5-9 years of experience in backend development; must have worked with Java, shell, Perl, or Python scripting.
  • Solid understanding of engineering best practices, continuous integration, and incremental delivery.
  • Strong analytical skills, debugging and troubleshooting skills, product line analysis.
  • Follower of agile methodology (Sprint planning, working on JIRA, retrospective etc).
  • Proficiency with tools like Docker, Maven, and Jenkins, and knowledge of Java frameworks like Spring, Spring Boot, Hibernate, JPA.
  • Ability to design application modules using concepts like object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPCs, database interfaces, etc.
  • Hands-on experience with Redis, MySQL, streaming technologies like Kafka producers/consumers, and NoSQL databases like MongoDB/Cassandra.
  • Knowledge of version control (Git) and deployment processes (CI/CD).
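Fault tolerance, mentioned in the design concepts above, often starts with a retry-with-backoff helper around calls to flaky downstream services. A minimal sketch, where the delays and limits are illustrative assumptions:

```java
import java.util.concurrent.Callable;

// Retry-with-exponential-backoff sketch: a common fault-tolerance
// building block for calls to flaky downstream services. Delays are
// illustrative; real systems usually add jitter and a retry budget.
public class Retry {

    public static <T> T withBackoff(Callable<T> call, int maxAttempts, long baseDelayMs)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    // 1x, 2x, 4x, ... the base delay between attempts
                    Thread.sleep(baseDelayMs << (attempt - 1));
                }
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        String result = withBackoff(() -> {
            if (++calls[0] < 3) {
                throw new IllegalStateException("transient failure");
            }
            return "ok";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " calls"); // ok after 3 calls
    }
}
```

Libraries such as Resilience4j package this pattern with circuit breakers and bulkheads; the core loop is the same.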

What’s in it for you?

 

  • Immense growth, continuous learning and deliver the best to the top-notch brands
  • Work with some of the most innovative brains
  • Opportunity to explore your entrepreneurial mind-set
  • Open culture where your creative bug gets activated.

 

If this sounds like a company you would like to be a part of, and a role you would thrive in, please don’t hold back from applying! We need your unique perspective for our continued innovation and success!

So let’s converse! Our inquisitive nature is all keen to know more about you.

Skills

Java, MongoDB, Redis, Cassandra, Kafka, RabbitMQ


 

Read more
Neo Aid
Nandini Sharma
Posted by Nandini Sharma
Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹15L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Microservices
+9 more

B.Tech./ BE - Computer, IT, Electronics only

Requirements:

  • 3+ years of experience in Java development.
  • Strong Java Basics
  • SpringBoot or Spring MVC
  • Hands on experience on Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
  • Proficient in REST API development
  • Messaging queues (RabbitMQ or Kafka)
  • Microservices
  • Any Caching Mechanism
  • Good at problem solving

Skills:

  • 3+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem solving skills.
  • Ability to work in a fast paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
  • Java
  • Agile and Kafka
  • Microservices
  • Spring Boot
  • NoSQL/MongoDB
  • Scrum
Read more
Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
Jenkins
GitHub
+16 more

DevOps Architect 

Experience: 10-12+ years of relevant experience in DevOps
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.

Qualification:
• Bachelors or advanced degree in Computer science, Software engineering or equivalent is required.
• Certifications in specific areas are desired

Technical Skillset: Skills Proficiency level

  • Build tools (Ant or Maven) - Expert
  • CI/CD tool (Jenkins or Github CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
  • Infrastructure As Code (Terraform, Helm charts etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (linux) - Expert
  • Cluster deployment (Kubernetes) & maintenance - Expert
  • Programming (Java) - Intermediate
  • Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, SVN, VSTS, etc.)
• Continuous Integration (like Jenkins, Bamboo, VSTS)
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)

Roles and Responsibilities

• Automate the delivery process end to end with appropriate tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining, and handling technical operations.
• Defining and running DevOps processes and operations.
• Ability to lead and manage teams.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems is a must (Git an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
CricStox Private Limited
Posted by Ishwar Sharma
Pune
2 - 3 yrs
₹8L - ₹10L / yr
NodeJS (Node.js)
Microservices
Kubernetes
Docker
Amazon Web Services (AWS)
+1 more
Backend Cloud Engineer @ CricStox
CricStox is a Pune startup building a trading solution in the realm of gametech x fintech.
We intend to build a sport-agnostic platform to allow trading in stocks of sportspersons under any sport
through our mobile & web-based applications.
We’re currently hiring a Backend Cloud Engineer who will gather, refine specifications and requirements
based on technical needs and implement the same by using best software development practices.
Responsibilities?
● Mainly, but not limited to, maintaining, expanding, and scaling our microservices/app/site.
● Integrate data from various back-end services and databases.
● Always be plugged into emerging technologies/industry trends and apply them into operations and
activities.
● Comfortably work and thrive in a fast-paced environment, learn rapidly and master diverse web
technologies and techniques.
● Juggle multiple tasks within the constraints of timelines and budgets with business acumen.
What skills do I need?
● Excellent programming skills in Javascript or Typescript.
● Excellent programming skills in Nodejs with Nestjs framework or equivalent.
● A solid understanding of how web applications work including security, session management, and
best development practices.
● Good working knowledge and experience of how AWS cloud infrastructure works, including services
like API Gateway, Cognito, S3, EC2, RDS, SNS, MSK, and EKS, is a MUST.
● Solid understanding of distributed event streaming technologies like Kafka is a MUST.
● Solid understanding of microservices communication using Saga Design pattern is a MUST.
● Adequate knowledge of database systems, OOPs and web application development.
● Adequate knowledge to create well-designed, testable, efficient APIs using tools like Swagger (or
equivalent).
● Good functional understanding of ORMs like Prisma (or equivalent).
● Good functional understanding of containerising applications using Docker.
● Good functional understanding of how a distributed microservice architecture works.
● Basic understanding of setting up a GitHub CI/CD pipeline to automate building Docker images,
pushing them to AWS ECR, and deploying them to the cluster.
● Proficient understanding of code versioning tools, such as Git (or equivalent).
● Hands-on experience with network diagnostics, monitoring and network analytics tools.
● Aggressive problem diagnosis and creative problem-solving skills.
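The Saga design pattern called out above coordinates a multi-service transaction as a sequence of local steps, each paired with a compensating action that undoes it if a later step fails. A minimal, illustrative sketch (written in Java for brevity; all step names are hypothetical, and a real Nest.js/Kafka saga would be event-driven rather than in-process):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;
import java.util.function.Supplier;

public class SagaSketch {
    // A saga step: an action that reports success/failure, plus its compensation.
    record Step(String name, Supplier<Boolean> action, Runnable compensation) {}

    // Run steps in order; on the first failure, undo completed steps in reverse.
    static List<String> run(List<Step> steps) {
        List<String> log = new ArrayList<>();
        Deque<Step> completed = new ArrayDeque<>();
        for (Step s : steps) {
            if (s.action().get()) {
                log.add(s.name() + ":ok");
                completed.push(s);
            } else {
                log.add(s.name() + ":failed");
                while (!completed.isEmpty()) {
                    Step undo = completed.pop();
                    undo.compensation().run();          // compensating transaction
                    log.add(undo.name() + ":compensated");
                }
                return log;
            }
        }
        return log;
    }

    public static void main(String[] args) {
        Runnable noop = () -> {};
        System.out.println(run(List.of(
            new Step("reserveStock", () -> true, noop),
            new Step("chargeCard", () -> true, noop),
            new Step("shipOrder", () -> false, noop))));
        // prints [reserveStock:ok, chargeCard:ok, shipOrder:failed, chargeCard:compensated, reserveStock:compensated]
    }
}
```

In a distributed setup each step and compensation would be a message on a broker such as Kafka, but the ordering and rollback logic is the same.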
EASEBUZZ
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions company (a fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including payment infrastructure, that solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based in Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd parties – Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
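As a toy illustration of one stage in the pipelines described above (the event shape is hypothetical; a real stage would read from Kafka/Kinesis and write to the warehouse), a micro-batch aggregation step might look like this:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class PipelineStageSketch {
    // Hypothetical event record for illustration only.
    record Event(String userId, String type) {}

    // One transform step: count events per type within a micro-batch.
    // TreeMap keeps the output deterministically key-ordered.
    static Map<String, Long> countByType(List<Event> batch) {
        return batch.stream()
                .collect(Collectors.groupingBy(Event::type, TreeMap::new, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Event> batch = List.of(
            new Event("u1", "payment"),
            new Event("u2", "payment"),
            new Event("u1", "refund"));
        System.out.println(countByType(batch)); // prints {payment=2, refund=1}
    }
}
```

The same group-and-count shape is what a Spark or Kinesis Analytics job would express over a much larger, distributed batch or window.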

 

Employment Type

Full-time

 

Symblai
Posted by Vaishali M
Anywhere, Pune, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹15L - ₹35L / yr
Java
NodeJS (Node.js)
AngularJS (1.x)
Python
MongoDB
+8 more

Software Architect

Symbl is hiring a Software Architect who is passionate about leading cross-functional R&D teams. This role will serve as the Architect across the global organization driving product architecture, reducing information silos across the org to improve decision making, and coordinating with other engineering teams to ensure seamless integration with other Symbl services.

Symbl is seeking a leader with a demonstrated track record of leading cross-functional dev teams. You are fit for the role if:

  • You have a track record of designing and building large-scale, cloud-based, highly available software platforms.
  • You have 8+ years of experience in software development with 2+ years in an architect role.
  • You have experience working on customer-facing machine learning implementations (predictions, recommendations, anomaly detection)
  • You are an API first developer who understands the power of platforms.
  • You are passionate about enabling other developers through your leadership and driving teams to efficient decisions.
  • You have the ability to balance long-term objectives with urgent short-term needs
  • You can successfully lead teams through very challenging engineering problems.
  • You have domain expertise in one or more of: data pipelines and workflows, telephony systems, real-time audio and video streaming, machine learning.
  • You have at least a bachelor's degree in a computer science-related field.
  • You’ll bring your deep experience with distributed systems and platform engineering principles to the table.
  • You are passionate about operational excellence and know-how to build software that delivers it.
  • You are able to think at scale, and can define and meet stringent availability and performance SLAs while ensuring quality and resiliency challenges are addressed across our diverse product and tech stacks: Node.js (mandatory), Java, Python, JavaScript, and React, intersecting with an ML platform and open-source DBs.
  • You understand end-user use cases and are driven to design optimal software that meets business needs.

Your day would look like:

  • Work with your team, providing engineering leadership and ensuring your resources are solving the most critical engineering problems while ensuring your products are scalable, performant, and highly available.
  • Focus on delivering the highest quality of service, and support your team as they push production code that impacts hundreds of Symbl customers.
  • Spend time with engineering managers and developers to create and deliver critical new products and/or features that empower them to introduce change with quality and speed.
  • Connect with your team, both local and remote, to ensure they are delivering on engineering and operational excellence.

Job Location: Anywhere – currently WFH due to COVID

Compensation, Perks, and Differentiators:

  • Healthcare
  • Unlimited PTO
  • Paid sick days
  • Paid holidays
  • Flexi working
  • Continuing education
  • Equity and performance-based pay options
  • Rewards & Recognition
  • As our company evolves, so do our benefits. We’re actively innovating how we support our employees.
Remote, Bengaluru (Bangalore), Pune
7 - 10 yrs
₹40L - ₹55L / yr
Java
J2EE
Spring Boot
Microservices
Algorithms
+4 more
Required qualifications and must-have skills

  • BE/BTech/BS or equivalent
  • 7+ years of experience in Java and Spring Boot
  • Strong fundamentals in data structures, algorithms, and object-oriented programming
  • 4+ years of hands-on experience in designing, developing, and delivering large-scale (distributed) system architecture with complex software design, high scalability, and availability.
  • Extensive experience with technical leadership, defining visions/solutions and collaborating/driving to see them to completion.
  • Excellent analytical and problem-solving skills
  • Experience with any RDBMS and strong SQL knowledge
  • Comfortable with the Unix/Linux command line

Nice to have skills

  • Experience with Big Data platforms like Hadoop / Hive / Presto
  • Experience with ML/AI frameworks like TensorFlow, H2O, etc.
  • Experience using key-value stores or NoSQL databases
  • Good understanding of Docker and container platforms like Mesos and Kubernetes
  • Security-first architecture approach
  • Application benchmarking and optimization
PayU
Posted by Md Amim
Bengaluru (Bangalore), Gurugram, Pune
4 - 10 yrs
₹10L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
NOSQL Databases
+9 more

 


Role: Senior Software Engineer – Backend                                           
Location: Bangalore / Gurgaon / Pune

About the Role

The successful backend engineer will work closely and collaboratively with cross-functional teams during all phases of the software development lifecycle.

 

The incumbent should be competent to provide quick solutions to problems, take a product or a product's component through the entire life cycle, optimize space/time complexity, and improve usability and reliability.

 

What you’ll be doing:

  • Bachelor's degree in Computer Science or a related field from top-notch colleges
  • 4+ years of software development engineering.
  • Understanding of fundamental design principles (including MVC).
  • Good hands-on experience in a scalable AWS environment.
  • Experience with different RDBMS and NoSQL databases like MySQL, MongoDB, etc.
  • Experience in designing scalable microservices required.
  • Strong knowledge of CS fundamentals, including data structures, algorithm design, and complexity analysis.
  • Proficiency in one language that emphasizes class abstractions (e.g., Java), with at least 4 years of coding in it.
  • Excellent communication, analytical and problem solving skills.
  • Strong organizational skills and the ability to prioritize and work with clients with great efficiency.
  • Excellent written and oral communication and presentation skills and the ability to express thoughts logically and succinctly.
  • Open minded, Team builder, Good communicator and ability to lead and inspire teams.
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment.
  • Experience in dealing with ambiguous/undefined problems; ability to think abstractly

 


What are we looking for?

 

  • 4 to 10 years of hands on design / development experience. 
  • B.Tech / M.Tech in Computer Science or an equivalent field from a premier institute. 
  • Proficient in Java OR C/C++, data structures, algorithms and OO design / design patterns. 
  • Strong understanding of Computer Science fundamentals. 
  • Technical depth in OS, computer architecture and OS internals. 
  • Ability to write scalable and maintainable code. 
  • Self-starter and goal-oriented with strong analytical and problem-solving skills. 
  • Must be able to work cooperatively within a strong diverse technical community to expedite development tasks.
  • Experience in Machine Learning is a plus

PS: Code review and team-leading experience is a plus for the Tech Lead role

 
What we offer

  • Competitive salary and excellent benefits, in a diverse working environment with inspiring and hardworking colleagues
  • A positive, get-things-done workplace.
  • An inclusive environment that ensures we listen to a diverse range of voices when making decisions.
  • Ability to learn cutting edge concepts and innovation in an agile start-up environment with a global scale.
  • A flexible working environment where you can drive your outcomes.

 

About us

At PayU, we are a global fintech investor and our vision is to build a world without financial borders where everyone can prosper. We give people in high growth markets the financial services and products they need to thrive. Our expertise in 18+ high-growth markets enables us to extend the reach of financial services. This drives everything we do, from investing in technology entrepreneurs to offering credit to underserved individuals, to helping merchants buy, sell, and operate online. Being part of Prosus, one of the largest technology investors in the world, gives us the presence and expertise to make a real impact. Find out more at www.payu.com

Our Commitment to Building A Diverse and Inclusive Workforce

As a global and multi-cultural organization with varied ethnicities thriving across locations, we realize that our responsibility towards fulfilling the D&I commitment is huge. Therefore, we continuously strive to create a diverse, inclusive, and safe environment, for all our people, communities, and customers. Our leaders are committed to create an inclusive work culture which enables transparency, flexibility, and unbiased attention to every PayUneer so they can succeed, irrespective of gender, color, or personal faith. An environment where every person feels they belong, that they are listened to, and where they are empowered to speak up. At PayU we have zero tolerance towards any form of prejudice whether a specific race, ethnicity, or of persons with disabilities, or the LGBTQ communities.

 

DataMetica
Posted by Nikita Aher
Pune, Hyderabad
3 - 12 yrs
₹5L - ₹25L / yr
Apache Kafka
Big Data
Hadoop
Apache Hive
Java
+1 more

Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and an enterprise data warehouse (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.

 

Must Have Skills

  • Participate in the development, enhancement, and maintenance of data applications both as an individual contributor and as a lead.
  • Lead in the identification, isolation, resolution, and communication of problems within the production environment.
  • Lead development, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
  • Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
  • Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
  • Communicate and work with IT partners and the user community at various levels, from senior management to developers to business SMEs, for project definition.
  • Work on multiple platforms and multiple projects concurrently.
  • Perform code and unit testing for complex-scope modules and projects.
  • Provide expertise and hands-on experience with Kafka Connect using the schema registry in a very high-volume environment (~900 million messages).
  • Provide expertise in Kafka brokers, ZooKeeper, KSQL, KStreams, and Kafka Control Center.
  • Provide expertise and hands-on experience with AvroConverter, JsonConverter, and StringConverter.
  • Provide expertise and hands-on experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, file stream connectors, and JMS source connectors, as well as tasks, workers, converters, and transforms.
  • Provide expertise and hands-on experience building custom connectors using Kafka core concepts and APIs.
  • Working knowledge of the Kafka REST proxy.
  • Ensure optimum performance, high availability, and stability of solutions.
  • Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
  • Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
  • Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver solutions using Spark, Scala, Python, Hive, Kafka, and other tools in the Hadoop ecosystem.
  • Use automation tools for provisioning, such as Jenkins or uDeploy.
  • Ability to perform data-related benchmarking, performance analysis, and tuning.
  • Strong skills in in-memory applications, database design, and data integration.
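As a reference point for the producer/schema-registry skills listed above, a typical Kafka producer configuration can be assembled like this. Broker addresses and the registry URL are placeholders; the serializer class names are the standard Apache Kafka and Confluent ones, and the actual `KafkaProducer` construction is omitted so the sketch stays dependency-free:

```java
import java.util.Properties;

public class KafkaProducerConfigSketch {
    // Assemble typical producer settings for an Avro + Schema Registry setup.
    static Properties producerProps(String bootstrapServers, String schemaRegistryUrl) {
        Properties p = new Properties();
        p.put("bootstrap.servers", bootstrapServers);    // comma-separated broker list
        p.put("acks", "all");                            // wait for all in-sync replicas
        p.put("enable.idempotence", "true");             // de-duplicate broker-side on retries
        p.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        p.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer"); // Avro payloads
        p.put("schema.registry.url", schemaRegistryUrl); // Confluent Schema Registry endpoint
        return p;
    }

    public static void main(String[] args) {
        Properties p = producerProps("broker1:9092,broker2:9092", "http://schema-registry:8081");
        System.out.println(p.getProperty("acks")); // prints all
    }
}
```

With the `kafka-clients` and Confluent serializer jars on the classpath, these same properties would be passed straight to `new KafkaProducer<>(p)`.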
Service based company
Pune
6 - 12 yrs
₹6L - ₹28L / yr
Big Data
Apache Kafka
Data engineering
Cassandra
Java
+1 more

Primary responsibilities:

  • Architect, Design and Build high performance Search systems for personalization, optimization, and targeting
  • Designing systems with Solr, Akka, Cassandra, Kafka
  • Algorithmic development with primary focus Machine Learning
  • Working with rapid and innovative development methodologies like Kanban, Continuous Integration, and daily deployments
  • Participation in design and code reviews and recommend improvements
  • Unit testing with JUnit, Performance testing and tuning
  • Coordination with internal and external teams
  • Mentoring junior engineers
  • Participate in Product roadmap and Prioritization discussions and decisions
  • Evangelize the solution with Professional services and Customer Success teams

 

DelaPlex Software
Posted by Sunil Kandukuri
Pune, Nagpur, Bengaluru (Bangalore), Hyderabad
4 - 7 yrs
₹4L - ₹8L / yr
Java
Spring
Spring Boot
NOSQL Databases
DynamoDB
+4 more

Role: Java developer
Experience: 4+ years

Job description

○ Working experience with Java and Spring Boot (building web services)

○ NoSQL (DynamoDB) knowledge is a plus

○ Working experience in building microservices and distributed systems

○ Working experience using messaging queues (RabbitMQ/Kafka) is a plus

Cognologix Technologies
Posted by Priyal Wagh
Remote, Pune
4 - 9 yrs
₹10L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more

You will work on: 

 

We help many of our clients make sense of their large investments in data – be it building analytics solutions or machine learning applications. You will work on cutting-edge cloud-native technologies to crunch terabytes of data into meaningful insights. 

 

What you will do (Responsibilities):

 

Collaborate with product management & engineering to build highly efficient data pipelines. 

You will be responsible for:

 

  • Dealing with large customer data and building highly efficient pipelines
  • Building insights dashboards
  • Troubleshooting data loss, data inconsistency, and other data-related issues
  • Product development environment delivering stories in a scaled agile delivery methodology.

 

What you bring (Skills):

 

5+ years of experience in hands-on data engineering & large-scale distributed applications

 

  • Extensive experience in object-oriented programming languages such as Java or Scala
  • Extensive experience in RDBMS such as MySQL, Oracle, SQLServer, etc.
  • Experience in functional programming languages such as JavaScript, Scala, or Python
  • Experience in developing and deploying applications in Linux OS
  • Experience in big data processing technologies such as Hadoop, Spark, Kafka, Databricks, etc.
  • Experience in Cloud-based services such as Amazon AWS, Microsoft Azure, or Google Cloud Platform
  • Experience with Scrum and/or other Agile development processes
  • Strong analytical and problem-solving skills

 

Great if you know (Skills):

 

  • Some exposure to containerization technologies such as Docker, Kubernetes, or Amazon ECS/EKS
  • Some exposure to microservices frameworks such as Spring Boot, Eclipse Vert.x, etc.
  • Some exposure to NoSQL data stores such as Couchbase, Solr, etc.
  • Some exposure to Perl or shell scripting.
  • Ability to lead R&D and POC efforts
  • Ability to learn new technologies on his/her own
  • Team player with self-drive to work independently
  • Strong communication and interpersonal skills

Advantage Cognologix:

  •  A higher degree of autonomy, startup culture & small teams
  •  Opportunities to become an expert in emerging technologies
  •  Remote working options for the right maturity level
  •  Competitive salary & family benefits
  •  Performance-based career advancement


About Cognologix: 

 

Cognologix helps companies disrupt by reimagining their business models and innovate like a Startup. We are at the forefront of digital disruption and take a business-first approach to help meet our client’s strategic goals.

We are a Data focused organization helping our clients to deliver their next generation of products in the most efficient, modern, and cloud-native way.

Benefits Working With Us:

  • Health & Wellbeing
  • Learn & Grow
  • Evangelize 
  • Celebrate Achievements
  • Financial Wellbeing
  • Medical and Accidental cover.
  • Flexible Working Hours.
  • Sports Club & much more.
Media.net
Agency job via Volks Consulting by SHUBHAM MAGDUM
Remote, Bengaluru (Bangalore), Pune, Mumbai
2 - 6 yrs
₹20L - ₹45L / yr
Java
Spring
Data Structures
Algorithms
Apache Kafka
+4 more
Software Development Engineer

  • 2 - 6 years of software development experience
  • Good grasp of programming fundamentals, including OOP, design patterns, and data structures
  • Excellent analytical, logical, and problem-solving skills
  • Good understanding of complexities involved in designing/developing large scale systems
  • Strong system design skills
  •  Experience in technologies like Elasticsearch, Redis, Kafka etc
  • Good knowledge of relational and NoSQL databases
  • Familiarity with common machine learning algorithms. In-depth knowledge is a plus
  • Experience of working with big data technologies like Hadoop, Spark, Hive is a big plus
  • Ability to understand business requirements and take ownership of the work
  • Exhibit passion and enthusiasm for building and maintaining large scale platforms
Remote, Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Pune
3 - 8 yrs
₹5L - ₹17L / yr
Java
Spring Boot
Apache Kafka
MySQL
+1 more

Software Development Engineer:

Major Responsibilities:

  • Translation of complex functional requirements into technical requirements, implementing and maintaining a coherent and progressive development strategy for our product line
  • Design, develop and maintain complex systems using best of the breed development practices and technology.
  • Responsible for the over-all software development life cycle.
  • Delivery of High Quality, Scalable and Extensible systems and applications on-time and on-budget.
  • Adoption and Evolution of the software engineering practices and tools within the organization
  • Keep in sync with the latest technology developments and open source offerings. Evaluate and adopt them for solving business problem of organization.
  • Collaborate with other technology and business teams within the organization to provide efficient robust solutions to the problems.
  • Drive and manage the bug triage process
  • Report on status of product delivery and quality to management, customer support and product teams.

Desired Skills

  • Strong programming, debugging, and problem-solving skills
  • Strong understanding of data structures and algorithms
  • Sound understanding of object-oriented programming and excellent software design skills.
  • Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
  • Strong understanding of database design and SQL (MySQL/MariaDB) development
  • Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra, or any other NoSQL database
  • Knowledge of design patterns; experience with large-scale applications is good to have
  • Should have experience with Apache Kafka, RabbitMQ, or other queueing systems.
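Queueing systems like Kafka and RabbitMQ typically provide at-least-once delivery, so consumers must tolerate redelivered messages. A minimal, broker-free sketch of the idempotent-consumer idea (the message shape is hypothetical, and real code would persist the processed-ID set rather than keep it in memory):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class IdempotentConsumerSketch {
    // Hypothetical message shape: a stable ID plus a payload.
    record Message(String id, String payload) {}

    // Apply each message exactly once, skipping redeliveries by tracking processed IDs.
    static List<String> process(List<Message> delivered) {
        Set<String> seen = new HashSet<>();
        List<String> applied = new ArrayList<>();
        for (Message m : delivered) {
            if (seen.add(m.id())) {        // add() returns false for an already-seen ID
                applied.add(m.payload());
            }
        }
        return applied;
    }

    public static void main(String[] args) {
        List<Message> delivered = List.of(
            new Message("m1", "debit"),
            new Message("m2", "credit"),
            new Message("m1", "debit"));   // broker redelivery of m1
        System.out.println(process(delivered)); // prints [debit, credit]
    }
}
```

The same guard, backed by a database unique constraint or a Kafka consumer-offset commit, is what makes a real consumer safe to retry.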

Ideal Experience

  • 3 to 8 years of industry experience.
  • Bachelors or Master’s Degree in Computer Science/ IT
  • Drive discussions to create/improve product, process and technology
  • Provide end to end solution and design details
  • Lead development of formalized solution methodologies
  • Passion to work in startup like environment

Personal Characteristics

  • Passion and commitment
  • Strong and excellent software design intellect
  • High integrity
  • Self-starter
Maveric Systems
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytics platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing, and challenging work environment, and the role will play an important part in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience, with 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL and Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning, and organising skills, matched with the ability to manage stress, time, and people effectively
  • Proven experience in co-ordination of many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile Deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of SCRUM discussions and gather requirements
  • Adhere to the SCRUM timeline and deliver accordingly
  • Participate in a team environment for design, development, and implementation
  • Should take on L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide the necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g., BA team, testing and business teams) and any other relevant stakeholders
Clairvoyant India Private Limited
Taruna Roy
Posted by Taruna Roy
Remote, Pune
3 - 8 yrs
₹4L - ₹15L / yr
Big Data
Hadoop
Java
Spark
Hibernate (Java)
+5 more
Job Title/Designation:
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of Big Data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
  • 4-10 years of experience in software development.
  • At least 2 years of relevant work experience on large scale Data applications.
  • Strong coding experience in Java is mandatory
  • Good aptitude, strong problem-solving abilities and analytical skills; ability to take ownership as appropriate
  • Should be able to code, debug, performance-tune and deploy apps to production
  • Should have good working experience with:
      o Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
      o Kafka
      o J2EE frameworks (Spring/Hibernate/REST)
      o Spark Streaming or any other streaming technology
  • Ability to work sprint stories to completion, along with unit test case coverage
  • Experience working in Agile methodology
  • Excellent communication and coordination skills
  • Knowledgeable in (and preferably hands-on with) UNIX environments and different continuous integration tools
  • Must be able to integrate quickly into the team and work independently towards team goals
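The Hadoop/Spark skills listed above revolve around map/reduce-style transformations. A single-JVM sketch of the word count that Spark distributes across a cluster (plain java.util.stream here, purely illustrative of the flatMap-then-group-by-key shape):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

// Single-process sketch of the map/reduce word count that frameworks like
// Spark run in parallel across a cluster: flatMap lines to tokens, then
// group and count by key.
public class WordCountSketch {
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(Function.identity(),
                                               Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts =
                wordCount(List.of("big data big wins", "data pipelines"));
        System.out.println(counts.get("big"));  // 2
        System.out.println(counts.get("data")); // 2
    }
}
```

In Spark the same pipeline would run per-partition with a shuffle before the count; the per-record logic is unchanged.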
Role & Responsibilities:
  • Take complete responsibility for the execution of sprint stories
  • Be accountable for delivering tasks within the defined timelines with good quality
  • Follow the processes for project execution and delivery
  • Follow Agile methodology
  • Work closely with the team lead and contribute to the smooth delivery of the project
  • Understand/define the architecture and discuss its pros and cons with the team
  • Participate in brainstorming sessions and suggest improvements to the architecture/design
  • Work with other team leads to get the architecture/design reviewed
  • Work with the clients and counterparts (in the US) on the project
  • Keep all stakeholders updated about project/task status and any risks or issues
Education: BE/B.Tech from a reputed institute.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
Sapper.ai

at Sapper.ai

2 recruiters
Amol K
Posted by Amol K
Pune
3 - 7 yrs
Best in industry
Spring Boot
Java
J2EE
Spring
Spring Batch
+3 more
Sapper.AI is building the next generation of intelligent automation software. We are a young startup; if you are looking for exciting work, long hours and a lot of learning, and you have a passion for innovation and a go-getter attitude, this is the place to be.

Built on a foundation of AI, we are automating enterprise application integration, data integration, data preparation for analytics, and bot automation. We are looking to build our engineering development center in Pune with passionate and entrepreneurial developers at all levels (interns, fresh graduates, senior software engineers and architects).

Expectations -
  • Have at least 3 years' work experience in Java 8 or higher / J2EE Java development.
  • Have experience of agile development methodologies such as Scrum.
  • Experience in designing solutions and implementing them.
  • Is a communicative, positive, outgoing and driven team player.
  • Solution-oriented: sees opportunities and proactively proposes new solutions; speaks and writes fluently in English.
  • Good to have certifications in Java, Spring etc.
  • Experience in Java 8.
  • Experience in Spring Boot and other Spring framework modules like Spring Data, AOP etc.
  • Experience in MongoDB, Kafka, RabbitMQ etc.
  • Experience in REST APIs.
  • Has worked on microservices.
  • Should have worked on a minimum of 2-3 projects.
  • Experience in writing effective unit test cases for better coverage.
  • Experience in writing good-quality code, guided by code-quality tools like SonarQube.
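The REST API experience above boils down to mapping HTTP routes to handlers that return JSON. A JDK-only sketch of that idea (a real Sapper.AI service would use Spring Boot as the posting requires; the JDK's built-in com.sun.net.httpserver is used here only so the example carries no dependencies, and the /health route is a hypothetical example):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Dependency-free sketch of a REST endpoint: route -> handler -> JSON body.
// Spring Boot's @RestController expresses the same mapping declaratively.
public class HealthEndpoint {
    public static HttpServer start() throws IOException {
        // Port 0 asks the OS for any free port.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = start();
        int port = server.getAddress().getPort();
        String body = new String(
                new java.net.URL("http://localhost:" + port + "/health")
                        .openStream().readAllBytes(),
                StandardCharsets.UTF_8);
        System.out.println(body); // {"status":"UP"}
        server.stop(0);
    }
}
```

A unit test for such a handler, the kind of coverage the posting asks for, would start the server on an ephemeral port, issue a GET, and assert on the status and body.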
Sapper.ai

at Sapper.ai

2 recruiters
Amol K
Posted by Amol K
Pune
8 - 16 yrs
Best in industry
Java
Apache Kafka
Spring Boot
Technical Architecture
Apache Camel
+6 more
Sapper.AI is building the next generation of intelligent automation software. We are a young startup; if you are looking for exciting work, long hours and a lot of learning, and you have a passion for innovation and a go-getter attitude, this is the place to be.

Built on a foundation of AI, we are automating enterprise application integration, data integration, data preparation for analytics, and bot automation. We are looking to build our engineering development center in Pune with passionate and entrepreneurial developers at all levels (interns, fresh graduates, senior software engineers and architects).


As an Architect/Technology Lead you will be involved in the design and development of enterprise automation. Knowledge of building workflow engines and microservices design patterns, plus experience with large-scale enterprise architectures, Spring Boot, Kafka, data management and caching, is needed. At a startup you will wear multiple hats: engineering, presales, talking to customers, and setting up operational processes.
Aikon Labs Private Limited
Pune
2 - 8 yrs
₹3L - ₹12L / yr
Java
Product development
RESTful APIs
Spring
Hibernate (Java)
+4 more
Do you have a passion to be a part of an innovative startup? Here’s an opportunity for you - become an active member of our core platform development team.
Main Duties
Contribute in all phases of the development lifecycle
Write well designed, testable, efficient code
Ensure designs are in compliance with specifications
Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
Prepare and produce releases of software components

Role & Relationships
We consider ourselves a team, and you will be a valuable part of it. You could be reporting to a senior member or directly to our Founder & CEO.
Educational Qualifications
We don't discriminate, as long as you have the required skill set and the right attitude.
Experience
Up to seven years of experience, preferably working with Java.
Skills
Good
Strong understanding of Core Java, Servlets, JSP
Knowledge of RDBMS (MySQL, Oracle, SQL Server) and NoSQL databases
Knowledge of RESTful Web Services, XML, JSON, Spring
Good team player

Even better
Familiarity with the software development lifecycle
Strong full-stack development background covering both frontend and backend web applications
Competencies
An aptitude to solve problems & learn something new
Highly self-motivated
Analytical frame of mind
Ability to work in fast-paced, dynamic environment

Location
Currently in Pune
Remuneration
Once we meet, we shall make an offer depending on how good a fit you are & the experience you already have
About us
Aikon Labs Pvt Ltd is a start-up focused on Realizing Ideas. One such idea is iEngage.io, our Intelligent Engagement Platform. We leverage Augmented Intelligence, a combination of machine-driven insights & human understanding, to serve a timely response to every interaction from the people you care about.
3409 Tech Ventures Pvt. Ltd.
Pune
4 - 7 yrs
₹8L - ₹14L / yr
Java
Spring
Microservices
Apache Kafka
Message Queuing Telemetry Transport (MQTT)
+2 more
We are looking for an experienced Java developer who will help us build a scalable REST API based backend using microservices.

Key skills:
  • Own the product functionality and work with the technical and product leadership to convert ideas into a great product
  • Stay abreast of the latest back-end technologies and patterns and proactively find ways to apply them to the business problem
  • Thorough understanding of core Java and the Spring framework
  • Experience with Spring Boot to bootstrap applications
  • Good understanding of and working experience with RESTful web services
  • Knowledge of distributed systems and how they differ from traditional monolith applications

You get additional brownie points if you have:
  • Knowledge of modern authorization mechanisms, such as JSON Web Tokens and OAuth2
  • Familiarity with code versioning tools such as Git
  • A self-starter mindset: someone who can think outside the box and come up with solutions to resolve and mitigate complex problems
  • Experience working in an Agile development environment using methodologies like Scrum and tools like JIRA, Confluence etc.

Experience:
  • 4-7 years of work experience developing Java based backend applications
  • Around 1 year of work experience using Spring Boot, Spring Cloud and microservices
  • BE/B.Tech or higher, preferably in Computer Science

About Us: QUp is a leading healthcare product that is excited to offer a "Painless OPD" experience to patients and health care providers like doctors, hospitals etc. We are a fast-growing startup that is using innovation and cutting-edge technologies to solve the OPD management problem. We offer competitive salary, freedom to explore cutting-edge tools & technologies, a flat hierarchy and open communication channels to our people so that they continue to be growth drivers for the company.
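The JSON Web Tokens mentioned above are three Base64url-encoded segments (header.payload.signature), so inspecting the claims needs no library. A decode-only sketch (the token and claim names here are invented for illustration; note that decoding skips signature verification, which is the security-critical step this sketch deliberately omits):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Decodes the payload (claims) segment of a JWT. This does NOT verify the
// signature; production code must verify it before trusting any claim.
public class JwtPayloadSketch {
    static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length < 2) throw new IllegalArgumentException("not a JWT");
        // The payload is the second of the three dot-separated segments.
        byte[] json = Base64.getUrlDecoder().decode(parts[1]);
        return new String(json, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Hypothetical token assembled inline purely for demonstration.
        String payload = Base64.getUrlEncoder().withoutPadding()
                .encodeToString("{\"sub\":\"user-42\"}".getBytes(StandardCharsets.UTF_8));
        String jwt = "eyJhbGciOiJub25lIn0." + payload + ".";
        System.out.println(decodePayload(jwt)); // {"sub":"user-42"}
    }
}
```

OAuth2 resource servers do this decode after validating the signature against the issuer's published keys; the decode itself is the easy part.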
Mooshak

at Mooshak

1 recruiter
Anurag Gaur
Posted by Anurag Gaur
Pune
1 - 5 yrs
₹6L - ₹10L / yr
NodeJS (Node.js)
NOSQL Databases
Java
Apache Storm
Apache Kafka
+1 more
ABOUT MOOSHAK
We're at a point where the urban English-speaking Indian population is almost all online. The next billion Indians online all communicate via Indian languages. Mooshak was created with the singular aim of making the Internet fun and relevant for this large, untapped segment. At Mooshak, we want to connect and engage Indians in their own language. That presents problems in various domains, from creativity in content creation, to creating a highly scalable platform, to applying AI and NLP techniques in Indian languages to understand what people are saying and react to what they want. Mooshak is architected to scale: irrespective of the number of followers, the read time for a feed remains constant. We achieve this by using distributed message queues, a distributed computing engine and some nifty caching!

TECHNICAL RESPONSIBILITIES
Mooshak's tech stack:
  • Java
  • Node.js
  • MongoDB
  • Redis
  • Apache Kafka & Apache Storm
  • Nginx / Jenkins

Server developer's roles and responsibilities:
You are expected to know at least 4 of these technologies, with the ability to quickly learn the others. You will play the leading role in all stages of server development: architecture, coding, final testing and shipping. The APIs are written and the product works fine; you are expected to understand the architecture and enhance product functionality. Sometimes you may be required to double up as the DevOps person should the servers fail or the product not work as expected. The core APIs are written in Node.js. The distributed message queue (Kafka) and compute engine (Storm) are implemented in Java. Understanding of Angular 2 is a big plus, as our web app is built on the same.

NON-TECHNICAL RESPONSIBILITIES
We are a startup. This means that:
  • You will be expected to be someone who comes up with solutions instead of problems.
  • You will be expected to work non-stop, including weekends, if the servers crash. But otherwise we are quite chill!
  • You will be expected to talk to multiple stakeholders (customers, designers, client-side developers) to achieve user and business needs.
  • A high aptitude and a positive attitude are a must.
  • You should be comfortable working independently as well as in a team. We are a lean team right now, with you as the only server developer (assisted by the folks who built the platform).

JOB LOCATION
You would be working out of our office in Pune. You may be required to travel occasionally to Mumbai or Bangalore to interact with some other team members.
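The constant feed-read time described above is typically achieved by fan-out-on-write: each new post is pushed into every follower's precomputed feed at write time, so a read is just fetching a cached list. A minimal in-memory sketch of that pattern (Mooshak's real pipeline runs the fan-out through Kafka and Storm with Redis as the cache; all class and variable names here are illustrative):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Fan-out-on-write feed: the expensive work (delivering a post to every
// follower) happens once at write time, so reading a feed is proportional
// only to the feed's size, regardless of how many people the reader follows.
public class FeedSketch {
    static final int FEED_LIMIT = 100; // keep only the newest posts per user

    final Map<String, Set<String>> followers = new HashMap<>(); // author -> followers
    final Map<String, Deque<String>> feeds = new HashMap<>();   // user -> cached feed

    void follow(String follower, String author) {
        followers.computeIfAbsent(author, a -> new HashSet<>()).add(follower);
    }

    void post(String author, String text) {
        // Deliver the post to every follower's cached feed, newest first.
        for (String f : followers.getOrDefault(author, Set.of())) {
            Deque<String> feed = feeds.computeIfAbsent(f, u -> new ArrayDeque<>());
            feed.addFirst(author + ": " + text);
            if (feed.size() > FEED_LIMIT) feed.removeLast(); // bound memory
        }
    }

    List<String> readFeed(String user) { // no joins, no per-followee lookups
        return List.copyOf(feeds.getOrDefault(user, new ArrayDeque<>()));
    }

    public static void main(String[] args) {
        FeedSketch app = new FeedSketch();
        app.follow("asha", "mooshak");
        app.post("mooshak", "namaste!");
        System.out.println(app.readFeed("asha")); // [mooshak: namaste!]
    }
}
```

In production the post() fan-out runs asynchronously (here, via the Kafka/Storm pipeline) so a celebrity post does not block the write path.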
Securonix

at Securonix

1 recruiter
Ramakrishna Murthy
Posted by Ramakrishna Murthy
Pune
3 - 7 yrs
₹10L - ₹15L / yr
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
+3 more
Securonix is a Big Data security analytics product company, and ours is the only product that delivers real-time behavior analytics (UEBA) on Big Data.