Apache Kafka Jobs in Pune

Explore top Apache Kafka Job opportunities in Pune from Top Companies & Startups. All jobs are added by verified employees who can be contacted directly below.
Posted by Shifat S
Bengaluru (Bangalore), Pune, Chennai, Gurugram
3 - 6 yrs
₹10L - ₹20L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Spring MVC
+7 more

Required Education:

 

B.Tech/BE in Computer, IT, or Electronics only

Required Skills:

 

  • 3+ years of experience in Java development
  • Strong Java basics
  • Spring Boot or Spring MVC
  • Hands-on experience with relational databases (SQL queries or Hibernate) and MongoDB (JSON parsing)
  • Proficient in REST API development
  • Message queues (RabbitMQ or Kafka)
  • Microservices
  • Any caching mechanism
  • Good at problem-solving

 

Good to Have Skills:

 

  • 4+ years of experience in using Java/J2EE tech stacks
  • Good understanding of data structures and algorithms.
  • Excellent analytical and problem-solving skills.
  • Ability to work in a fast-paced internet start-up environment.
  • Experience in technical mentorship/coaching is highly desirable.
  • Understanding of AI/ML algorithms is a plus.
Posted by Indrajeet Deshmukh
Pune
2 - 5 yrs
₹15L - ₹20L / yr
MongoDB
Big Data
Apache Kafka
Spring MVC
Spark
+3 more

What You’ll Do:

  • Ensure timely and top-quality product delivery
  • Ensure that the end product is fully and correctly defined and documented
  • Ensure implementation/continuous improvement of formal processes to support product development activities
  • Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
  • Conduct feasibility analysis, produce functional and design specifications of proposed new features.
  • Provide helpful and productive code reviews for peers and junior members of the team.
  • Troubleshoot complex issues discovered in-house as well as in customer environments.

Who You Are:

  • Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
  • Expertise in Java, Object Oriented Programming, Design Patterns
  • Experience in coding and implementing scalable solutions in a large-scale distributed environment
  • Working experience in a Linux/UNIX environment is good to have
  • Experience with relational databases and database concepts, preferably MySQL
  • Experience with SQL and Java optimization for real-time systems
  • Familiarity with version control systems Git and build tools like Maven
  • Excellent interpersonal, written, and verbal communication skills
  • BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent

The set of skills we are looking for:

  • MongoDB
  • Big Data
  • Apache Kafka 
  • Spring MVC 
  • Spark 
  • Java 
Posted by Indrajeet Deshmukh
Pune
2 - 10 yrs
₹22L - ₹28L / yr
Java
MySQL
MongoDB
Big Data
Apache Kafka
+2 more

What You’ll Do:

  • Ensure timely and top-quality product delivery
  • Ensure that the end product is fully and correctly defined and documented
  • Ensure implementation/continuous improvement of formal processes to support product development activities
  • Drive the architecture/design decisions needed to achieve cost-effective and high-performance results
  • Conduct feasibility analysis, produce functional and design specifications of proposed new features.
  • Provide helpful and productive code reviews for peers and junior members of the team.
  • Troubleshoot complex issues discovered in-house as well as in customer environments.

Who You Are:

  • Strong computer science fundamentals in algorithms, data structures, databases, operating systems, etc.
  • Expertise in Java, Object Oriented Programming, Design Patterns
  • Experience in coding and implementing scalable solutions in a large-scale distributed environment
  • Working experience in a Linux/UNIX environment is good to have
  • Experience with relational databases and database concepts, preferably MySQL
  • Experience with SQL and Java optimization for real-time systems
  • Familiarity with version control systems Git and build tools like Maven
  • Excellent interpersonal, written, and verbal communication skills
  • BE/B.Tech./M.Sc./MCS/MCA in Computers or equivalent

The set of skills we are looking for:

  • MongoDB
  • Big Data
  • Apache Kafka 
  • Spring MVC 
  • Spark 
  • Java 
Pune
2 - 3 yrs
₹8L - ₹10L / yr
NodeJS (Node.js)
Microservices
Kubernetes
Docker
Amazon Web Services (AWS)
+1 more
Backend Cloud Engineer @ CricStox
CricStox is a Pune startup building a trading solution in the realm of gametech x fintech.
We intend to build a sport-agnostic platform to allow trading in stocks of sportspersons under any sport
through our mobile & web-based applications.
We’re currently hiring a Backend Cloud Engineer who will gather and refine specifications and requirements based on technical needs, and implement them using best software development practices.
Responsibilities?
● Mainly, but not limited to, maintaining, expanding, and scaling our microservices, app, and site.
● Integrate data from various back-end services and databases.
● Always be plugged into emerging technologies and industry trends, and apply them to operations and activities.
● Comfortably work and thrive in a fast-paced environment, learn rapidly and master diverse web
technologies and techniques.
● Juggle multiple tasks within the constraints of timelines and budgets with business acumen.
What skills do I need?
● Excellent programming skills in JavaScript or TypeScript.
● Excellent programming skills in Node.js with the NestJS framework or equivalent.
● A solid understanding of how web applications work including security, session management, and
best development practices.
● Good working knowledge and experience of how AWS cloud infrastructure works, including services like API Gateway, Cognito, S3, EC2, RDS, SNS, MSK, and EKS, is a MUST.
● Solid understanding of distributed event streaming technologies like Kafka is a MUST.
● Solid understanding of microservices communication using Saga Design pattern is a MUST.
● Adequate knowledge of database systems, OOPs and web application development.
● Adequate knowledge to create well-designed, testable, efficient APIs using tools like Swagger (or
equivalent).
● Good functional understanding of ORMs like Prisma (or equivalent).
● Good functional understanding of containerising applications using Docker.
● Good functional understanding of how a distributed microservice architecture works.
● Basic understanding of setting up a GitHub CI/CD pipeline to automate building Docker images, pushing them to AWS ECR, and deploying to the cluster.
● Proficient understanding of code versioning tools, such as Git (or equivalent).
● Hands-on experience with network diagnostics, monitoring and network analytics tools.
● Aggressive problem diagnosis and creative problem-solving skills.
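The Saga design pattern called out above coordinates a distributed transaction as a sequence of local steps, each paired with a compensating action that undoes it if a later step fails. A minimal orchestration sketch (the step names are hypothetical examples, not any CricStox API):

```python
# Orchestration-based saga sketch: run steps in order; on failure,
# run the compensations of completed steps in reverse order.

class SagaStep:
    def __init__(self, name, action, compensation):
        self.name = name
        self.action = action              # forward local transaction
        self.compensation = compensation  # undo for that transaction

def run_saga(steps, log):
    """Return True if all steps succeed; otherwise compensate and return False."""
    done = []
    for step in steps:
        try:
            step.action()
            log.append(f"done:{step.name}")
            done.append(step)
        except Exception:
            log.append(f"failed:{step.name}")
            for completed in reversed(done):
                completed.compensation()
                log.append(f"undo:{completed.name}")
            return False
    return True

def declined():
    raise RuntimeError("payment declined")

log = []
steps = [
    SagaStep("reserve_stock", lambda: None, lambda: None),
    SagaStep("charge_card", declined, lambda: None),
]
# The second step fails, so the first step's compensation runs.
assert run_saga(steps, log) is False
assert log == ["done:reserve_stock", "failed:charge_card", "undo:reserve_stock"]
```

In a microservices deployment the "actions" would be messages to other services (e.g. over Kafka topics) rather than local calls, but the forward/compensate bookkeeping is the same.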
Pune, Bengaluru (Bangalore), Coimbatore, Hyderabad, Gurugram
3 - 10 yrs
₹18L - ₹40L / yr
Apache Kafka
Spark
Hadoop
Apache Hive
Big Data
+5 more

Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.



You’ll spend time on the following:

  • You will partner with teammates to create complex data processing pipelines in order to solve our clients’ most ambitious challenges
  • You will collaborate with Data Scientists in order to design scalable implementations of their models
  • You will pair to write clean and iterative code based on TDD
  • Leverage various continuous delivery practices to deploy data pipelines
  • Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
  • Develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches

Here’s what we’re looking for:

 

  • You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
  • You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting
  • Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)
  • You are comfortable taking data-driven approaches and applying data security strategy to solve business problems 
  • Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
  • Strong communication and client-facing skills with the ability to work in a consulting environment
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions company (fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC happens at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd parties: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB, etc.

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
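The extraction, transformation, and loading responsibilities above boil down to composable pipeline stages. A minimal in-memory sketch (the record shape and field names are hypothetical; a real pipeline would read from Kafka/Kinesis and write to Redshift or S3 rather than Python lists):

```python
# Extract -> transform -> load sketch built from generator stages, so the
# pipeline streams records instead of materializing them all in memory.

def extract(source):
    """Yield raw records from the source (here, an in-memory list)."""
    yield from source

def transform(records):
    """Normalise rupee amounts to paise and skip malformed records."""
    for rec in records:
        if "amount_inr" not in rec:
            continue  # would go to a dead-letter queue in a real pipeline
        yield {"merchant": rec["merchant"],
               "amount_paise": int(rec["amount_inr"] * 100)}

def load(records, sink):
    """Append transformed records to the sink and return how many were loaded."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count

raw = [{"merchant": "m1", "amount_inr": 10.5},
       {"merchant": "m2"}]  # second record is malformed
warehouse = []
assert load(transform(extract(raw)), warehouse) == 1
assert warehouse == [{"merchant": "m1", "amount_paise": 1050}]
```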

 

Employment Type

Full-time

 

Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune
9 - 13 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
NOSQL Databases
+8 more

Engineering Manager - Backend


About Us:

Paytm is India’s leading digital payments and financial services company, which is focused on driving consumers and merchants to its platform by offering them a variety of payment use cases. Paytm provides consumers with services like utility payments and money transfers, while empowering them to pay via Paytm Payment Instruments (PPI) like Paytm Wallet, Paytm UPI, Paytm Payments Bank Netbanking, Paytm FASTag and Paytm Postpaid - Buy Now, Pay Later. To merchants, Paytm offers acquiring devices like Soundbox, EDC, QR and Payment Gateway where payment aggregation is done through PPI and also other banks’ financial instruments. To further enhance merchants’ business, Paytm offers merchants commerce services through advertising and Paytm Mini app store. 


About the role:

As an Engineering Manager, you will be developing the detailed design structure, implementing the best practices and coding standards, leading a team of developers for successful delivery of the project. You will be working on design, architecture and hands-on coding.

Requirements:

  • 9 to 13 years in technical development, with 4+ years providing technical leadership for high-performance teams
  • Work closely with business and product teams to understand the requirements, drive design, architecture and influence the choice of technology to deliver solutions working closely with architects and leadership team.
  • Build robust, scalable, highly available and reliable systems using a microservices architecture based on Java and Spring Boot
  • Improve Engineering and Operational Excellence by identifying and building the right solutions for observability and manageability
  • Keep the tech stack current with the goal to optimize for scale, cost and performance
  • Migrate workloads to public cloud
  • Attitude to thrive in a fun, fast-paced environment
  • Serve as a thought leader and mentor on technical, architectural, design and related issues.
  • Proactively identify architectural weaknesses and recommend appropriate solutions.

Preferred Qualification : Bachelor's/Master's Degree in Computer Science or equivalent


Skills that will help you succeed in this role:

  • Tech Stack: Lang: Java, DB: RDBMS, Messaging: Kafka/RabbitMQ, Caching: Redis/Aerospike, Micro services, AWS
  • Strong experience in scaling, performance tuning & optimization at both API and storage layers, system designs-HLD AND LLD
  • Hands-on leader, and problem solver with a passion for excellence.

Why join us:

  • Because you get an opportunity to make a difference, and have a great time doing that.
  • You are challenged and encouraged here to do stuff that is meaningful for you and for those we serve.
  • You should work with us if you think seriously about what technology can do for people.
  • We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.

Compensation:

If you are the right fit, we believe in creating wealth for you. With enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants – and we are committed to it. India’s largest digital lending story is brewing here. It’s your opportunity to be a part of the story!


Posted by Vaishali M
anywhere, Pune, Hyderabad, Bengaluru (Bangalore)
8 - 15 yrs
₹15L - ₹35L / yr
Java
NodeJS (Node.js)
AngularJS (1.x)
Python
MongoDB
+8 more

Software Architect

Symbl is hiring a Software Architect who is passionate about leading cross-functional R&D teams. This role will serve as the Architect across the global organization driving product architecture, reducing information silos across the org to improve decision making, and coordinating with other engineering teams to ensure seamless integration with other Symbl services.

Symbl is seeking a leader with a demonstrated track record of leading cross-functional dev teams. You are a fit for the role if:

  • You have a track record of designing and building large-scale, cloud-based, highly available software platforms.
  • You have 8+ years of experience in software development with 2+ years in an architect role.
  • You have experience working on customer-facing machine learning implementations (predictions, recommendations, anomaly detection)
  • You are an API first developer who understands the power of platforms.
  • You are passionate about enabling other developers through your leadership and driving teams to efficient decisions.
  • You have the ability to balance long-term objectives with urgent short-term needs
  • You can successfully lead teams through very challenging engineering problems.
  • You have domain expertise in one or more of: data pipelines and workflows, telephony systems, real-time audio and video streaming, machine learning.
  • You have a bachelor's degree in a computer science-related field (minimum requirement).
  • You’ll bring your deep experience with distributed systems and platform engineering principles to the table.
  • You are passionate about operational excellence and know-how to build software that delivers it.
  • You are able to think at scale, and define and meet stringent availability and performance SLAs while ensuring quality and resiliency challenges across our diverse product and tech stacks are addressed. Node.js is mandatory; Java, Python, JavaScript, and ReactJS, intersecting with the ML platform and open-source DBs, round out the stack.
  • You understand end-user use cases and are driven to design optimal software that meets business needs.

Your day would look like:

  • Work with your team, providing engineering leadership and ensuring your resources are solving the most critical engineering problems while ensuring your products are scalable, performant, and highly available.
  • Focus on delivering the highest quality of services, and support your team as they push production code that impacts hundreds of Symbl customers.
  • Spend time with engineering managers and developers to create and deliver critical new products and/or features that empower them to introduce change with quality and speed.
  • Make sure to connect with your team, both local and remote, to ensure they are delivering on engineering and operational excellence.

Job Location: Anywhere (currently WFH due to COVID)

Compensation, Perks, and Differentiators:

  • Healthcare
  • Unlimited PTO
  • Paid sick days
  • Paid holidays
  • Flexi working
  • Continuing education
  • Equity and performance-based pay options
  • Rewards & Recognition
  • As our company evolves, so do our benefits. We’re actively innovating how we support our employees.

Leading Sales Enabler

Agency job
via Qrata by Blessy Fernandes
Remote, Bengaluru (Bangalore), Pune
7 - 10 yrs
₹40L - ₹55L / yr
Java
J2EE
Spring Boot
Microservices
Algorithms
+4 more
Required qualifications and must-have skills
  • BE/BTech/BS or equivalent
  • 7+ years of experience in Java and Spring Boot
  • Strong fundamentals in data structures, algorithms, and object-oriented programming
  • 4+ years of hands-on experience in designing, developing, and delivering large-scale (distributed) system architecture with complex software design, high scalability, and availability
  • Extensive experience with technical leadership, defining visions/solutions, and collaborating/driving to see them to completion
  • Excellent analytical and problem-solving skills
  • Experience with any RDBMS and strong SQL knowledge
  • Comfortable with the Unix/Linux command line

Nice-to-have skills
  • Experience with Big Data platforms like Hadoop / Hive / Presto
  • Experience with ML/AI frameworks like TensorFlow, H2O, etc.
  • Experience using key-value stores or NoSQL databases
  • Good understanding of Docker and container platforms like Mesos and Kubernetes
  • Security-first architecture approach
  • Application benchmarking and optimization
Bengaluru (Bangalore), Gurugram, Pune
4 - 10 yrs
₹10L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
NOSQL Databases
+9 more

 


Role: Senior Software Engineer – Backend                                           
Location: Bangalore / Gurgaon / Pune

About the Role

The successful backend engineer will work closely and collaboratively with cross functional teams during all phases of the software development lifecycle.

 

The incumbent should be competent in providing quick solutions to problems, taking a product or a product’s component through the entire life cycle, optimizing space/time complexity, and improving usability and reliability.

 

What you’ll be doing:

  • Bachelor's degree in Computer Science or a related field from top-notch colleges
  • 4+ years of software development engineering.
  • Understanding of fundamental design principles (including MVC).
  • Good hands-on experience in scalable AWS environments.
  • Experience with different RDBMS and NoSQL databases like MySQL, MongoDB, etc.
  • Experience in designing scalable microservices required.
  • Strong knowledge of CS fundamentals including data structures, algorithm design, and complexity analysis.
  • Proficiency in one language that emphasizes class abstractions (e.g. Java), having coded in it for at least 4 years.
  • Excellent communication, analytical, and problem-solving skills.
  • Strong organizational skills and the ability to prioritize and work with clients with great efficiency.
  • Excellent written and oral communication and presentation skills, and the ability to express thoughts logically and succinctly.
  • Open-minded team builder and good communicator, with the ability to lead and inspire teams.
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast-paced environment.
  • Experience in dealing with ambiguous/undefined problems; ability to think abstractly.

 


What are we looking for?

 

  • 4 to 10 years of hands on design / development experience. 
  • B.Tech/M.Tech in Computer Science or an equivalent field from a premier institute.
  • Proficient in Java OR C/C++, data structures, algorithms and OO design / design patterns. 
  • Strong understanding of Computer Science fundamentals. 
  • Technical depth in OS, computer architecture and OS internals. 
  • Ability to write scalable and maintainable code. 
  • Self-starter and goal-oriented with strong analytical and problem-solving skills. 
  • Must be able to work cooperatively within a strong diverse technical community to expedite development tasks.
  • Experience in Machine Learning is a plus

PS: Code review and team-leading experience is a plus for the Tech Lead role

 
What we offer

  • Competitive salary and excellent benefits, in a diverse working environment with inspiring and hardworking colleagues
  • A positive, get-things-done workplace.
  • An inclusive environment that ensures we listen to a diverse range of voices when making decisions.
  • Ability to learn cutting edge concepts and innovation in an agile start-up environment with a global scale.
  • A flexible working environment where you can drive your outcomes.

 

About us

At PayU, we are a global fintech investor and our vision is to build a world without financial borders where everyone can prosper. We give people in high growth markets the financial services and products they need to thrive. Our expertise in 18+ high-growth markets enables us to extend the reach of financial services. This drives everything we do, from investing in technology entrepreneurs to offering credit to underserved individuals, to helping merchants buy, sell, and operate online. Being part of Prosus, one of the largest technology investors in the world, gives us the presence and expertise to make a real impact. Find out more at www.payu.com

Our Commitment to Building A Diverse and Inclusive Workforce

As a global and multi-cultural organization with varied ethnicities thriving across locations, we realize that our responsibility towards fulfilling the D&I commitment is huge. Therefore, we continuously strive to create a diverse, inclusive, and safe environment, for all our people, communities, and customers. Our leaders are committed to create an inclusive work culture which enables transparency, flexibility, and unbiased attention to every PayUneer so they can succeed, irrespective of gender, color, or personal faith. An environment where every person feels they belong, that they are listened to, and where they are empowered to speak up. At PayU we have zero tolerance towards any form of prejudice whether a specific race, ethnicity, or of persons with disabilities, or the LGBTQ communities.

 

Agency job
via Response Informatics by Swagatika Sahoo
Chennai, Bengaluru (Bangalore), Pune, Mumbai, Hyderabad
3 - 10 yrs
₹10L - ₹24L / yr
PySpark
Python
Amazon Web Services (AWS)
Apache Spark
Glue semantics
+3 more
  • Minimum 1 year of relevant experience in PySpark (mandatory)
  • Hands-on experience in developing, testing, deploying, maintaining, and improving data integration pipelines in an AWS cloud environment is an added plus
  • Ability to play a lead role and independently manage a 3-5 member PySpark development team
  • EMR, Python, and PySpark are mandatory
  • Knowledge of and experience working with AWS cloud technologies like Apache Spark, Glue, Kafka, Kinesis, and Lambda with S3, Redshift, and RDS

a software product company working on petabyte scale data

Agency job
via RS Consultants by Rahul Inamdar
Pune
7 - 15 yrs
₹30L - ₹50L / yr
Data Science
Data Scientist
Python
Java
Apache Kafka
+6 more

We are looking for an exceptional Data Scientist who is passionate about data and motivated to build large-scale machine learning solutions. This person will contribute to the analytics of data for insight discovery and the development of a machine learning pipeline to support modelling of terabytes of daily data for various use cases.

 

Typical persona: Data Science Manager / Architect

 

Experience: 8+ years programming/engineering experience (with at least last 4 years in big data, Data science)

 

Must have:

  • Hands-on Python: Pandas, Scikit-Learn
  • Working knowledge of Kafka
  • Able to carry out own tasks and help the team in resolving problems - logical or technical (25% of job)
  • Good analytical and debugging skills
  • Strong communication skills

Desired (in order of priorities):

  • Go (Strong advantage)
  • Airflow (Strong advantage)
  • Familiarity and working experience with more than one type of database: relational, object, columnar, graph, and other unstructured databases
  • Data structures, Algorithms
  • Experience with multi-threaded and thread sync concepts
  • AWS Sagemaker
  • Keras
  • Should have strong experience in Python programming (minimum 4 years)

A large software MNC with over 20k employees in India

Agency job
via RS Consultants by Rahul Inamdar
Pune
5 - 12 yrs
₹15L - ₹22L / yr
Spark
Data engineering
Data Engineer
Apache Kafka
Apache Spark
+6 more

As a Senior Engineer - Big Data Analytics, you will help the architectural design and development for Healthcare Platforms, Products, Services, and Tools to deliver the vision of the Company. You will significantly contribute to engineering, technology, and platform architecture. This will be done through innovation and collaboration with engineering teams and related business functions. This is a critical, highly visible role within the company that has the potential to drive significant business impact. 


The scope of this role will include strong technical contribution in the development and delivery of Big Data Analytics Cloud Platform, Products and Services in collaboration with execution and strategic partners. 

 

Responsibilities:

  • Design & develop, operate, and drive scalable, resilient, and cloud native Big Data Analytics platform to address the business requirements
  • Help drive technology transformation to achieve business transformation, through the creation of the Healthcare Analytics Data Cloud that will help Change establish a leadership position in healthcare data & analytics in the industry
  • Help in successful implementation of Analytics as a Service 
  • Ensure Platforms and Services meet SLA requirements
  • Be a significant contributor and partner in the development and execution of the Enterprise Technology Strategy

 

Qualifications:

  • At least 2 years of experience in software development for big data analytics and cloud, and at least 5 years of experience in software development overall
  • Experience working with high-performance distributed computing systems in public and private cloud environments
  • Understands big data open-source ecosystems and their players. Contribution to open source is a strong plus
  • Experience with Spark, Spark Streaming, Hadoop, AWS/Azure, NoSQL Databases, In-Memory caches, distributed computing, Kafka, OLAP stores, etc.
  • Successful track record of creating a working Big Data stack that aligned with business needs and delivered timely, enterprise-class products
  • Experience with delivering and managing operating environments at scale
  • Experience with Big Data/Micro Service based Systems, SaaS, PaaS, and Architectures
  • Experience Developing Systems in Java, Python, Unix
  • BSCS, BSEE or equivalent, MSCS preferred
Pune, Hyderabad
3 - 12 yrs
₹5L - ₹25L / yr
Apache Kafka
Big Data
Hadoop
Apache Hive
Java
+1 more

Summary
Our Kafka developer has a combination of technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium to large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing (preferably GCP BigQuery or an equivalent cloud EDW), and will be able to take oral and written business requirements and develop efficient code to meet set deliverables.

 

Must Have Skills

  • Participate in the development, enhancement and maintenance of data applications, both as an individual contributor and as a lead.
  • Lead the identification, isolation, resolution and communication of problems within the production environment.
  • Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and in a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift or Snowflake (optional).
  • Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
  • Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
  • Communicate and work with IT partners and the user community at various levels, from senior management to detailed developers to business SMEs, for project definition.
  • Work on multiple platforms and multiple projects concurrently.
  • Perform code and unit testing for complex-scope modules and projects.
  • Provide expertise and hands-on experience working with Kafka Connect using Schema Registry in a very high-volume environment (~900 million messages).
  • Provide expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams and Kafka Control Center.
  • Provide expertise and hands-on experience working with AvroConverters, JsonConverters, and StringConverters.
  • Provide expertise and hands-on experience working with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors, FileStream connectors and JMS source connectors, as well as tasks, workers, converters and transforms.
  • Provide expertise and hands-on experience building custom connectors using the Kafka core concepts and API.
  • Working knowledge of the Kafka REST Proxy.
  • Ensure optimum performance, high availability and stability of solutions.
  • Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply knowledge of best practices.
  • Create stubs for producers, consumers and consumer groups to help onboard applications from different languages/platforms.
  • Leverage Hadoop-ecosystem knowledge to design and develop capabilities to deliver our solutions using Spark, Scala, Python, Hive, Kafka and other tools in the Hadoop ecosystem.
  • Use automation tools for provisioning, such as Jenkins, uDeploy or relevant technologies.
  • Ability to perform data-related benchmarking, performance analysis and tuning.
  • Strong skills in in-memory applications, database design and data integration.
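As a rough illustration of the Kafka Connect work described above: a minimal, hypothetical JDBC source connector definition. The connector name, connection details and topic prefix are placeholders; the property names follow the standard Kafka Connect and Confluent JDBC source connector configuration.

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "2",
    "connection.url": "jdbc:mysql://db-host:3306/sales",
    "connection.user": "kafka_connect",
    "connection.password": "********",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```

A definition like this would typically be submitted to the Kafka Connect REST API (POST /connectors); the exact properties available depend on the connector and its version.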
Read more

Service based company

Agency job
via Tech - Soul Technologies by Rohini Shinde
icon
Pune
icon
6 - 12 yrs
icon
₹6L - ₹28L / yr
Big Data
Apache Kafka
Data engineering
Cassandra
Java
+1 more

Primary responsibilities:

  • Architect, Design and Build high performance Search systems for personalization, optimization, and targeting
  • Designing systems with Solr, Akka, Cassandra, Kafka
  • Algorithmic development with primary focus Machine Learning
  • Working with rapid and innovative development methodologies like: Kanban, Continuous Integration and Daily deployments
  • Participation in design and code reviews and recommend improvements
  • Unit testing with JUnit, Performance testing and tuning
  • Coordination with internal and external teams
  • Mentoring junior engineers
  • Participate in Product roadmap and Prioritization discussions and decisions
  • Evangelize the solution with Professional services and Customer Success teams

 

Read more
Agency job
via Nu-Pie by Sanjay Biswakarma
icon
Pune, Gandhinagar, Hyderabad
icon
4 - 5 yrs
icon
₹6L - ₹18L / yr
Java
JIRA
Hibernate (Java)
Spring MVC
Mockito
+13 more
Work from home is applicable.
Candidates should have at least 4 years of experience and be well versed in full-stack development.
Locations: Bengaluru and Pune.
Relevant skills: Java, Angular, Spring Boot, React.
Read more
DP
Posted by Sunil Kandukuri
icon
Pune, Nagpur, Bengaluru (Bangalore), Hyderabad
icon
4 - 7 yrs
icon
₹4L - ₹8L / yr
Java
Spring
Spring Boot
NOSQL Databases
DynamoDB
+4 more

Role: Java developer
Experience: 4+ years

Job description

○ Working experience with Java and Spring Boot, preferably building web services

○ NoSQL (DynamoDB) knowledge is a plus

○ Working experience building microservices and distributed systems

○ Working experience using messaging queues (RabbitMQ/Kafka) is a plus
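Although the posting only asks for familiarity, the core idea behind message queues like RabbitMQ or Kafka - producers and consumers decoupled by a buffering channel - can be sketched in plain Java with a `BlockingQueue` standing in for the broker (illustrative only; no real messaging client is involved, and all names are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueSketch {
    // Sentinel value signalling that the producer is done.
    static final String POISON = "__END__";

    // Producer: publishes messages, then a poison pill to signal completion.
    static void produce(BlockingQueue<String> queue, List<String> messages) throws InterruptedException {
        for (String m : messages) {
            queue.put(m);          // blocks if the queue is full (back-pressure)
        }
        queue.put(POISON);
    }

    // Consumer: drains messages until it sees the poison pill.
    static List<String> consume(BlockingQueue<String> queue) throws InterruptedException {
        List<String> received = new ArrayList<>();
        while (true) {
            String m = queue.take(); // blocks until a message is available
            if (POISON.equals(m)) break;
            received.add(m);
        }
        return received;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);
        Thread producer = new Thread(() -> {
            try { produce(queue, List.of("order-1", "order-2", "order-3")); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        producer.start();
        List<String> got = consume(queue);
        producer.join();
        System.out.println(got); // [order-1, order-2, order-3]
    }
}
```

A real broker adds durability, partitioning and consumer groups on top of this basic hand-off, but the blocking `put`/`take` behaviour is a useful mental model for back-pressure.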

Read more
icon
Remote, Pune
icon
4 - 9 yrs
icon
₹10L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more

You will work on: 

 

We help many of our clients make sense of their large investments in data – be it building analytics solutions or machine learning applications. You will work on cutting-edge cloud-native technologies to crunch terabytes of data into meaningful insights. 

 

What you will do (Responsibilities):

 

Collaborate with product management & engineering to build highly efficient data pipelines. 

You will be responsible for:

 

  • Dealing with large customer data and building highly efficient pipelines
  • Building insights dashboards
  • Troubleshooting data loss, data inconsistency, and other data-related issues
  • Working in a product development environment, delivering stories in a scaled agile delivery methodology.

 

What you bring (Skills):

 

5+ years of experience in hands-on data engineering & large-scale distributed applications

 

  • Extensive experience in object-oriented programming languages such as Java or Scala
  • Extensive experience in RDBMS such as MySQL, Oracle, SQL Server, etc.
  • Experience in functional programming languages such as JavaScript, Scala, or Python
  • Experience in developing and deploying applications in Linux OS
  • Experience in big data processing technologies such as Hadoop, Spark, Kafka, Databricks, etc.
  • Experience in Cloud-based services such as Amazon AWS, Microsoft Azure, or Google Cloud Platform
  • Experience with Scrum and/or other Agile development processes
  • Strong analytical and problem-solving skills

 

Great if you know (Skills):

 

  • Some exposure to containerization technologies such as Docker, Kubernetes, or Amazon ECS/EKS
  • Some exposure to microservices frameworks such as Spring Boot, Eclipse Vert.x, etc.
  • Some exposure to NoSQL data stores such as Couchbase, Solr, etc.
  • Some exposure to Perl, or shell scripting.
  • Ability to lead R&D and POC efforts
  • Ability to learn new technologies on his/her own
  • Team player with self-drive to work independently
  • Strong communication and interpersonal skills

Advantage Cognologix:

  •  A higher degree of autonomy, startup culture & small teams
  •  Opportunities to become an expert in emerging technologies
  •  Remote working options for the right maturity level
  •  Competitive salary & family benefits
  •  Performance-based career advancement


About Cognologix: 

 

Cognologix helps companies disrupt by reimagining their business models and innovate like a Startup. We are at the forefront of digital disruption and take a business-first approach to help meet our client’s strategic goals.

We are a Data focused organization helping our clients to deliver their next generation of products in the most efficient, modern, and cloud-native way.

Benefits Working With Us:

  • Health & Wellbeing
  • Learn & Grow
  • Evangelize 
  • Celebrate Achievements
  • Financial Wellbeing
  • Medical and Accidental cover.
  • Flexible Working Hours.
  • Sports Club & much more.
Read more
Agency job
via Volks Consulting by SHUBHAM MAGDUM
icon
Remote, Bengaluru (Bangalore), Pune, Mumbai
icon
2 - 6 yrs
icon
₹20L - ₹45L / yr
Java
Spring
Data Structures
Algorithms
Apache Kafka
+4 more
  •  2 - 6 years of software development experience
  •  Good grasp on programming fundamentals including OOP, Design Patterns and Data Structures
  •  Excellent analytical, logical and problem-solving skills
  • Good understanding of complexities involved in designing/developing large scale systems
  • Strong system design skills
  •  Experience in technologies like Elasticsearch, Redis, Kafka etc
  • Good knowledge of relational and NoSQL databases
  • Familiarity with common machine learning algorithms. In-depth knowledge is a plus
  • Experience of working with big data technologies like Hadoop, Spark, Hive is a big plus
  • Ability to understand business requirements and take ownership of the work
  • Exhibit passion and enthusiasm for building and maintaining large scale platforms
Read more

Online ENT Healthcare giant in India

Agency job
via The Hub by Sridevi Viswanathan
icon
Remote, Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Pune
icon
3 - 8 yrs
icon
₹5L - ₹17L / yr
Java
Spring Boot
Apache Kafka
MySQL
java
+1 more

Software Development Engineer:

Major Responsibilities:

  • Translation of complex functional requirements into technical requirements, implementing and maintaining a coherent and progressive development strategy for our product line
  • Design, develop and maintain complex systems using best-of-breed development practices and technology.
  • Responsible for the overall software development life cycle.
  • Delivery of High Quality, Scalable and Extensible systems and applications on-time and on-budget.
  • Adoption and Evolution of the software engineering practices and tools within the organization
  • Keep in sync with the latest technology developments and open-source offerings. Evaluate and adopt them to solve business problems of the organization.
  • Collaborate with other technology and business teams within the organization to provide efficient robust solutions to the problems.
  • Drive and manage the bug triage process
  • Report on status of product delivery and quality to management, customer support and product teams.

Desired Skills

  • Strong programming, debugging, and problem-solving skills
  • Strong understanding of data structures and algorithms
  • Sound understanding of object-oriented programming and excellent software design skills.
  • Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
  • Strong understanding of database design and SQL (MySQL/MariaDB) development
  • Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra or any other NoSQL database
  • Knowledge of design patterns; experience with large-scale applications is good to have
  • Should have experience with Apache Kafka, RabbitMQ or other queueing systems.
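To make the queueing-systems requirement concrete: a minimal, hypothetical Kafka producer configuration. Broker addresses and tuning values are placeholders; the keys follow the standard Kafka producer properties.

```properties
# Cluster entry points (placeholder hosts)
bootstrap.servers=broker1:9092,broker2:9092
# How record keys and values are serialized on the wire
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Durability: wait for all in-sync replicas to acknowledge each write
acks=all
# Retry transient failures instead of failing the send immediately
retries=3
# Batch records for up to 10 ms to improve throughput
linger.ms=10
```

The same property names are used whether the configuration is loaded from a file or built as a `Properties` object in Java; appropriate values depend on the throughput and durability requirements of the application.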

Ideal Experience

  • 3 to 8 years of industry experience.
  • Bachelors or Master’s Degree in Computer Science/ IT
  • Drive discussions to create/improve product, process and technology
  • Provide end to end solution and design details
  • Lead development of formalized solution methodologies
  • Passion to work in startup like environment

Personal Characteristics

  • Passion and commitment
  • Strong and excellent software design intellect
  • High integrity
  • Self-starter
Read more
DP
Posted by Rashmi Poovaiah
icon
Bengaluru (Bangalore), Chennai, Pune
icon
4 - 10 yrs
icon
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and the role will play an important part in resolving issues and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience, with 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL & Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning and organizing skills, matched with the ability to manage stress, time and people effectively
  • Proven experience in coordination of many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of SCRUM discussions and take requirements
  • Adhere to the SCRUM timeline and deliver accordingly
  • Participate in a team environment for design, development and implementation
  • Should take up L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate changes and deployments in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g. BA team, testing and business teams) and any other relevant stakeholders
Read more
icon
Remote, Pune
icon
3 - 8 yrs
icon
₹4L - ₹15L / yr
Big Data
Hadoop
Java
Spark
Hibernate (Java)
+5 more
Job Title/Designation:
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of Big Data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
  • 4-10 years of experience in software development.
  • At least 2 years of relevant work experience on large scale Data applications.
  • Strong coding experience in Java is mandatory
  • Good aptitude, strong problem-solving abilities and analytical skills; ability to take ownership as appropriate
  • Should be able to do coding, debugging, performance tuning and deploying apps to production.
  • Should have good working experience with:
    o Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
    o Kafka
    o J2EE frameworks (Spring/Hibernate/REST)
    o Spark Streaming or any other streaming technology.
  • Ability to work on sprint stories to completion, along with unit test case coverage.
  • Experience working in Agile methodology
  • Excellent communication and coordination skills
  • Knowledgeable in (and preferably hands-on with) UNIX environments and different continuous integration tools.
  • Must be able to integrate quickly into the team and work independently towards team goals
Role & Responsibilities:
  • Take the complete responsibility of the sprint stories' execution
  • Be accountable for the delivery of the tasks in the defined timelines with good quality.
  • Follow the processes for project execution and delivery.
  • Follow agile methodology
  • Work with the team lead closely and contribute to the smooth delivery of the project.
  • Understand/define the architecture and discuss its pros and cons with the team
  • Involve in brainstorming sessions and suggest improvements in the architecture/design.
  • Work with other team leads to get the architecture/design reviewed.
  • Work with the clients and counterparts (in the US) of the project.
  • Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Education: BE/B.Tech from reputed institute.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
Read more
DP
Posted by Amol K
icon
Pune
icon
3 - 7 yrs
icon
Best in industry
Spring Boot
Java
J2EE
Spring
Spring Batch
+3 more
Sapper.AI is building the next generation of intelligent automation software. We are a young startup; if you are looking for exciting work, long hours and a lot of learning, and have a passion for creating new innovations and a go-getter attitude, this is the place to be.

Built on a foundation of AI we are automating enterprise application integration, data integration, data preparation for Analytics and bot automation. We are looking to build our engineering development center in Pune with passionate and entrepreneurial developers at all levels (Interns, Fresh Graduates, Senior Software Engineers and Architects).

Expectations -
  • Have at least 3 years of work experience in Java 8 or higher / J2EE Java development.
  • Have experience with agile systems development methodologies such as SCRUM
  • Experience in designing solutions and implementation.
  • Is a communicative, positive, outgoing and driven team player.
  • Solution-oriented: sees opportunities and proactively proposes new solutions; speaks and writes fluently in English.
  • Good to have certifications in Java, Spring, etc.
  • Experience in Java 8.
  • Experience in Spring Boot and other Spring framework projects like Spring Data, AOP, etc.
  • Experience in MongoDB / Kafka / RabbitMQ, etc.
  • Experience in REST APIs
  • Have worked on microservices
  • Should have worked on a minimum of 2-3 projects
  • Experience in writing effective unit test cases for better coverage.
  • Experience in writing good-quality code, following code quality tools like SonarQube, etc.
Read more
DP
Posted by Amol K
icon
Pune
icon
8 - 16 yrs
icon
Best in industry
Java
Apache Kafka
Spring Boot
Technical Architecture
Apache Camel
+6 more
Sapper.AI is building the next generation of intelligent automation software. We are a young startup; if you are looking for exciting work, long hours and a lot of learning, and have a passion for creating new innovations and a go-getter attitude, this is the place to be.

Built on a foundation of AI we are automating enterprise application integration, data integration, data preparation for Analytics and bot automation. We are looking to build our engineering development center in Pune with passionate and entrepreneurial developers at all levels (Interns, Fresh Graduates, Senior Software Engineers and Architects).


As an Architect/Technology Lead, you will be involved in the design and development of enterprise automation. Knowledge of building workflow engines and microservices design patterns, along with experience with large-scale enterprise architectures, Spring Boot, Kafka, data management and caching, is needed. At a startup you will be wearing multiple hats: engineering, presales, talking to customers, and setting up operational processes.
Read more
icon
Pune
icon
2 - 8 yrs
icon
₹3L - ₹12L / yr
Java
Product development
RESTful APIs
Spring
Hibernate (Java)
+4 more
Do you have a passion to be a part of an innovative startup? Here’s an opportunity for you - become an active member of our core platform development team.
Main Duties
Contribute in all phases of the development lifecycle
Write well designed, testable, efficient code
Ensure designs are in compliance with specifications
Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
Prepare and produce releases of software components

Role & Relationships
We consider ourselves a team & you will be a valuable part of it. You could be reporting to a Senior member or directly to our Founder, CEO
Educational Qualifications
We don’t discriminate. As long as you have the required skill set & the right attitude
Experience
Up to seven years of experience, preferably working on Java.
Skills
Good
Strong understanding of Core Java, Servlets, JSP
Knowledge of RDBMS (MySQL, Oracle, SQL Server), NoSQL
Knowledge of RESTful Web Services, XML, JSON, Spring
Good team player

Even better
Familiarity with the software development lifecycle
Strong full-stack development background covering frontend and backend web applications
Competencies
An aptitude to solve problems & learn something new
Highly self-motivated
Analytical frame of mind
Ability to work in fast-paced, dynamic environment

Location
Currently in Pune
Remuneration
Once we meet, we shall make an offer depending on how good a fit you are & the experience you already have
About us
Aikon Labs Pvt Ltd is a start-up focused on Realizing Ideas. One such idea is iEngage.io, our Intelligent Engagement Platform. We leverage Augmented Intelligence, a combination of machine-driven insights & human understanding, to serve a timely response to every interaction from the people you care about.
Read more
icon
Pune
icon
4 - 7 yrs
icon
₹8L - ₹14L / yr
Java
Spring
Microservices
Apache Kafka
Message Queuing Telemetry Transport (MQTT)
+2 more
We are looking for an experienced Java developer who will help us build a scalable REST API based backend using microservices.

Key skills
  • Own the product functionality and work with the technical and product leadership to convert ideas into a great product
  • Stay abreast of the latest back-end technologies and patterns and proactively find ways to apply them to the business problem
  • Thorough understanding of core Java and the Spring framework
  • Experience with Spring Boot to bootstrap applications
  • Good understanding and working experience with RESTful web services
  • Knowledge of distributed systems and how they differ from traditional monolith applications

You get additional brownie points if you have
  • Knowledge of modern authorization mechanisms, such as JSON Web Token and OAuth2
  • Familiarity with code versioning tools such as Git
  • Self-starter who can think outside of the box and come up with solutions to resolve and mitigate complex problems
  • Experience working in an Agile development environment using methodologies like Scrum and tools like JIRA, Confluence, etc.

Experience
  • 4-7 years of work experience developing Java based backend applications
  • Around 1 year of work experience using Spring Boot, Spring Cloud and microservices
  • BE/B Tech or higher, preferably in Computer Science

About Us
QUp is a leading healthcare product that is excited to offer a “Painless OPD” experience to patients and health care providers like doctors, hospitals etc. We are a fast growing startup that is using innovation and cutting edge technologies to solve the OPD management problem. We offer competitive salary, freedom to explore cutting edge tools & technologies, and flat hierarchy & open communication channels to our people so that they continue to be growth drivers for the company.
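Since the posting calls out JSON Web Tokens: a JWT is three Base64URL-encoded segments (header.payload.signature), so its claims can be inspected with the JDK alone. A minimal sketch follows (class and method names are made up for illustration; it only decodes and does not verify the signature, which real code must do before trusting any claim):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class JwtPeek {
    // Decodes the payload (claims) segment of a JWT. Inspection only:
    // real code must verify the signature before trusting any claim.
    static String payloadJson(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length < 2) {
            throw new IllegalArgumentException("not a JWT");
        }
        byte[] decoded = Base64.getUrlDecoder().decode(parts[1]);
        return new String(decoded, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Build a toy unsigned token: {"alg":"none"} header, {"sub":"42"} payload.
        String header = Base64.getUrlEncoder().withoutPadding()
                .encodeToString("{\"alg\":\"none\"}".getBytes(StandardCharsets.UTF_8));
        String payload = Base64.getUrlEncoder().withoutPadding()
                .encodeToString("{\"sub\":\"42\"}".getBytes(StandardCharsets.UTF_8));
        String jwt = header + "." + payload + ".";
        System.out.println(payloadJson(jwt)); // {"sub":"42"}
    }
}
```

In a Spring Boot service this decoding and signature verification would normally be delegated to a library (for example a JWT support module of Spring Security) rather than done by hand.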
Read more
DP
Posted by Anurag Gaur
icon
Pune
icon
1 - 5 yrs
icon
₹6L - ₹10L / yr
NodeJS (Node.js)
NOSQL Databases
Java
Apache Storm
Apache Kafka
+1 more
ABOUT MOOSHAK
We're at a point where the urban English-speaking Indian population is almost all online. The next billion Indians online all communicate via Indian languages. Mooshak was created with the singular aim of making the Internet fun and relevant for this large, untapped segment. At Mooshak, we want to connect and engage Indians in their own language. And that presents problems in various domains, from creativity in content creation, to creating a highly scalable platform, to applying techniques in AI and NLP in Indian languages to understand what people are saying and react to what they want.

Mooshak is architected to scale. Irrespective of the number of followers, the read time for a feed remains constant. We achieve this by using distributed message queues, a distributed computing engine, and some nifty caching!

TECHNICAL RESPONSIBILITIES
Mooshak’s Tech Stack
  • Java
  • Node.js
  • MongoDB
  • Redis
  • Apache Kafka & Apache Storm
  • Nginx / Jenkins

Server Developer’s Roles and Responsibilities
You are expected to know at least 4 of these technologies, with the ability to quickly learn the others. You will play the leading role in all stages of server development: architecture, coding, final testing and shipping.

The APIs are written and the product works fine. You are expected to understand the architecture and enhance product functionality. Sometimes you may be required to double up as the DevOps guy should the servers fail or the product not be working as expected. The core APIs are written in Node.js. The distributed message queue (Kafka) and compute engine (Storm) are implemented in Java. Understanding of Angular 2 is a big plus as our Web app is built on the same.

NON TECHNICAL RESPONSIBILITIES
We are a startup. This means that:
  • You will be expected to be someone who comes up with solutions instead of problems.
  • You will be expected to work non-stop, including weekends, if the servers crash. But otherwise we are quite chill!
  • You will be expected to talk to multiple stakeholders - customers, designers, client-side developers - to achieve user and business needs.
  • A high aptitude and a positive attitude are a must.
  • You should be comfortable working independently as well as in a team. We are a lean team right now, with you as the only server developer (assisted by the folks who built the platform).

JOB LOCATION
You would be working out of our office in Pune. You may be required to travel occasionally to Mumbai or Bangalore to interact with some other team members.
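The constant feed-read time described above is characteristic of the fan-out-on-write pattern: the expensive work happens once at post time (pushing the new item onto each follower's precomputed feed, typically via a message queue), so reading a feed is a single bounded lookup. A toy, in-memory sketch of the idea (all names are hypothetical; the real system would distribute this work over Kafka/Storm with a store like Redis):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class FeedSketch {
    static final int FEED_CAP = 100; // keep only the newest 100 items per feed

    // Per-user precomputed feeds: user -> newest-first list of post ids.
    final Map<String, Deque<String>> feeds = new HashMap<>();

    // Fan-out-on-write: push the post to every follower's feed at post time.
    void publish(String postId, Set<String> followers) {
        for (String f : followers) {
            Deque<String> feed = feeds.computeIfAbsent(f, k -> new ArrayDeque<>());
            feed.addFirst(postId);
            if (feed.size() > FEED_CAP) {
                feed.removeLast(); // evict the oldest item, keeping reads bounded
            }
        }
    }

    // Read cost depends only on the (capped) feed size, not on follow counts.
    List<String> readFeed(String user) {
        return new ArrayList<>(feeds.getOrDefault(user, new ArrayDeque<>()));
    }

    public static void main(String[] args) {
        FeedSketch s = new FeedSketch();
        s.publish("post-1", Set.of("alice", "bob"));
        s.publish("post-2", Set.of("alice"));
        System.out.println(s.readFeed("alice")); // [post-2, post-1]
        System.out.println(s.readFeed("bob"));   // [post-1]
    }
}
```

In production the `publish` loop is exactly the work one would hand to a queue consumer (a Storm bolt reading from Kafka), which is why the write path scales out while the read path stays flat.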
Read more
DP
Posted by Ramakrishna Murthy
icon
Pune
icon
3 - 7 yrs
icon
₹10L - ₹15L / yr
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
+3 more
Securonix is a Big Data Security Analytics product company. The only product which delivers real-time behavior analytics (UEBA) on Big Data.
Read more