
50+ Apache Kafka Jobs in India

Apply to 50+ Apache Kafka Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Kafka Jobs and apply today!

Masters India Private Limited

Posted by Reshika Mendiratta
Noida
7+ yrs
Up to ₹45L / yr (varies)
Python
Django
FastAPI
PostgreSQL
MongoDB
+9 more

We are looking for a customer-obsessed, analytical Sr. Staff Engineer to lead the development and growth of our Tax Compliance product suite. In this role, you’ll shape innovative digital solutions that simplify and automate tax filing, reconciliation, and compliance workflows for businesses of all sizes. You will join a fast-growing company where you’ll work in a dynamic and competitive market, impacting how businesses meet their statutory obligations with speed, accuracy, and confidence.


As the Sr. Staff Engineer, you’ll work closely with product, DevOps, and data teams to architect reliable systems, drive engineering excellence, and ensure high availability across our platform. We’re looking for a technical leader who’s not just an expert in building scalable systems, but also passionate about mentoring engineers and shaping the future of fintech.


Responsibilities

  • Lead, mentor, and inspire a high-performing engineering team (or operate as a hands-on technical lead).
  • Drive the design and development of scalable backend services using Python.
  • Build services with Django, FastAPI, and task orchestration systems.
  • Own and evolve our CI/CD pipelines with Jenkins, ensuring fast, safe, and reliable deployments.
  • Architect and manage infrastructure using AWS and Terraform with a DevOps-first mindset.
  • Collaborate cross-functionally with product managers, designers, and compliance experts to deliver features that make tax compliance seamless for our users.
  • Set and enforce engineering best practices, code quality standards, and operational excellence.
  • Stay up-to-date with industry trends and advocate for continuous improvement in engineering processes.
Requirements

  • 7+ years of software engineering experience, with at least 2 years in a leadership or principal-level role.
  • Deep expertise in Python, including API development, performance optimization, and testing.
  • Experience with event-driven architectures and messaging systems such as Kafka or RabbitMQ.
  • Strong experience with AWS services (e.g., ECS, Lambda, S3, RDS, CloudWatch).
  • Solid understanding of Terraform for infrastructure as code.
  • Proficiency with Jenkins or similar CI/CD tooling.
  • Comfortable balancing technical leadership with hands-on coding and problem-solving.
  • Strong communication skills and a collaborative mindset.

Nice to have

  • Experience in fintech, tax, or compliance industries.
  • Familiarity with containerization tools like Docker and orchestration with Kubernetes.
  • Background in security, observability, or compliance automation.
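To give a flavor of the Python API work the requirements above describe, here is a minimal sketch using FastAPI; the route, parameter, and response fields are hypothetical, and serving it would additionally need an ASGI server such as uvicorn:

    from fastapi import FastAPI

    app = FastAPI()

    # Hypothetical reconciliation-status endpoint; illustrative only.
    @app.get("/invoices/{gstin}/status")
    def invoice_status(gstin: str) -> dict:
        # A real service would query PostgreSQL/MongoDB here.
        return {"gstin": gstin, "reconciled": True}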
Moative

Posted by Eman Khan
Chennai
3 - 5 yrs
₹10L - ₹25L / yr
Python
PySpark
Scala
Data engineering
ETL
+12 more

About Moative

Moative, an Applied AI company, designs and builds transformative AI solutions for traditional industries in energy, utilities, healthcare & life sciences, and more. Through Moative Labs, we build AI micro-products and launch AI startups with partners in vertical markets that align with our theses.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Our Team: Our team of 20+ employees consists of data scientists, AI/ML engineers, and mathematicians from top engineering and research institutes such as the IITs, CERN, IISc, and UZH, including Ph.D.s, academicians, IBM Research Fellows, and former founders.


Work you’ll do

As a Data Engineer, you will work on data architecture, large-scale processing systems, and data flow management. You will build and maintain optimal data architecture and data pipelines, assemble large, complex data sets, and ensure that data is readily available to data scientists, analysts, and other users. In close collaboration with ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade solutions that directly impact business outcomes. Ultimately, you will be responsible for developing and implementing systems that optimize the organization’s data use and data quality.


Responsibilities

  • Create and maintain optimal data architecture and data pipelines on cloud infrastructure (such as AWS/ Azure/ GCP)
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements
  • Build the pipeline infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Support development of analytics that utilize the data pipeline to provide actionable insights into key business metrics
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs


Who you are

You are a passionate and results-oriented engineer who understands the importance of data architecture and data quality to impact solution development, enhance products, and ultimately improve business applications. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration. 


You have experience in developing and deploying data pipelines to support real-world applications. You have a good understanding of data structures and are excellent at writing clean, efficient code to extract, create and manage large data sets for analytical uses. You have the ability to conduct regular testing and debugging to ensure optimal data pipeline performance. You are excited at the possibility of contributing to intelligent applications that can directly impact business services and make a positive difference to users.


Skills & Requirements

  • 3+ years of hands-on experience as a data engineer, data architect or similar role, with a good understanding of data structures and data engineering.
  • Solid knowledge of cloud infra and data-related services on AWS (EC2, EMR, RDS, Redshift) and/ or Azure.
  • Advanced knowledge of SQL, including writing complex queries, stored procedures, views, etc.
  • Strong experience with data pipeline and workflow management tools (such as Luigi, Airflow).
  • Experience with common relational SQL, NoSQL and Graph databases.
  • Strong experience with scripting languages: Python, PySpark, Scala, etc.
  • Practical experience with basic DevOps concepts: CI/CD, containerization (Docker, Kubernetes), etc
  • Experience with big data tools (Spark, Kafka, etc) and stream processing.
  • Excellent communication skills to collaborate with colleagues from both technical and business backgrounds, discuss and convey ideas and findings effectively.
  • Ability to analyze complex problems, think critically for troubleshooting and develop robust data solutions.
  • Ability to identify and tackle issues efficiently and proactively, conduct thorough research and collaborate to find long-term, scalable solutions.
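To make the workflow-management requirement above concrete, here is a minimal pipeline-definition sketch, assuming Apache Airflow 2.x; the DAG id, task, and schedule are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_orders() -> None:
        # A real task would pull from an API or object store into staging.
        print("extracting orders")

    # Hypothetical daily ETL DAG; illustrative only.
    with DAG(
        dag_id="orders_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_orders", python_callable=extract_orders)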


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps in on purpose unless we constantly question it. We are deliberate about the rituals we commit to, because they take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don’t need the person with the highest title to say that out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes rather than hiring someone who has less to do. We don’t like the squatting and hoarding that comes with hiring for growth. High revenue per person comes from high-quality work from everyone. We demand it.


If this role and our work are of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers. 


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Wissen Technology

Posted by Robin Silverster
Bengaluru (Bangalore)
4 - 9 yrs
Best in industry
Java
Spring Boot
Apache Kafka
Data Structures
Algorithms
+2 more

Java Developer – Job Description

Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission as part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 4 to 7 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems, with good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multithreading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is part of Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from Ivy League universities like Wharton and MIT, as well as from IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Deqode

Posted by Alisha Das
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹10L - ₹25L / yr
Microsoft Windows Azure
Data engineering
Python
Apache Kafka

Role Overview:

We are seeking a Senior Software Engineer (SSE) with strong expertise in Kafka, Python, and Azure Databricks to lead and contribute to our healthcare data engineering initiatives. This role is pivotal in building scalable, real-time data pipelines and processing large-scale healthcare datasets in a secure and compliant cloud environment.

The ideal candidate will have a solid background in real-time streaming, big data processing, and cloud platforms, along with strong leadership and stakeholder engagement capabilities.

Key Responsibilities:

  • Design and develop scalable real-time data streaming solutions using Apache Kafka and Python.
  • Architect and implement ETL/ELT pipelines using Azure Databricks for both structured and unstructured healthcare data.
  • Optimize and maintain Kafka applications, Python scripts, and Databricks workflows to ensure performance and reliability.
  • Ensure data integrity, security, and compliance with healthcare standards such as HIPAA and HITRUST.
  • Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into robust data solutions.
  • Mentor junior engineers, perform code reviews, and promote engineering best practices.
  • Stay current with evolving technologies in cloud, big data, and healthcare data standards.
  • Contribute to the development of CI/CD pipelines and containerized environments (Docker, Kubernetes).

Required Skills & Qualifications:

  • 4+ years of hands-on experience in data engineering roles.
  • Strong proficiency in Kafka (including Kafka Streams, Kafka Connect, Schema Registry).
  • Proficient in Python for data processing and automation.
  • Experience with Azure Databricks (or readiness to ramp up quickly).
  • Solid understanding of cloud platforms, with a preference for Azure (AWS/GCP is a plus).
  • Strong knowledge of SQL and NoSQL databases; data modeling for large-scale systems.
  • Familiarity with containerization tools like Docker and orchestration using Kubernetes.
  • Exposure to CI/CD pipelines for data applications.
  • Prior experience with healthcare datasets (EHR, HL7, FHIR, claims data) is highly desirable.
  • Excellent problem-solving abilities and a proactive mindset.
  • Strong communication and interpersonal skills to work in cross-functional teams.
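As an illustration of the Kafka work described above, here is a minimal produce/consume sketch using the kafka-python client; the broker address, topic, group id, and payload fields are hypothetical:

    import json

    from kafka import KafkaConsumer, KafkaProducer

    # Hypothetical broker and topic; illustrative only.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("encounter-events", {"patient_id": "p-42", "event": "admit"})
    producer.flush()

    consumer = KafkaConsumer(
        "encounter-events",
        bootstrap_servers="localhost:9092",
        group_id="claims-pipeline",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        print(message.topic, message.partition, message.offset, message.value)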


Zenius IT Services Pvt Ltd

Posted by Sunita Pradhan
Hyderabad
3 - 4 yrs
₹4L - ₹8L / yr
ASP.NET
C#
React.js
JavaScript
TypeScript
+14 more

Job Overview:

We are looking for a highly skilled Full-Stack Developer with expertise in .NET Core to develop and maintain scalable web applications and microservices. The ideal candidate will have strong problem-solving skills, experience in modern software development, and a passion for creating robust, high-performance applications.


Key Responsibilities:


Backend Development:

  • Design, develop, and maintain microservices and APIs using .NET Core, with a good understanding of the .NET Framework.
  • Implement RESTful APIs, ensuring high performance and security.
  • Optimize database queries and design schemas for SQL Server / Snowflake / MongoDB.

Software Architecture & DevOps:

  • Design and implement scalable microservices architecture.
  • Work with Docker, Kubernetes, and CI/CD pipelines for deployment and automation.
  • Ensure best practices in security, scalability, and performance.

Collaboration & Agile Development:

  • Work closely with UI/UX designers, backend engineers, and product managers.
  • Participate in Agile/Scrum ceremonies, code reviews, and knowledge-sharing sessions.
  • Write clean, maintainable, and well-documented code.


Required Skills & Qualifications:

  • 7+ years of experience as a Full-Stack Developer.
  • Strong experience in .NET Core, C#.
  • Proficiency in React.js, JavaScript (ES6+), TypeScript.
  • Experience with RESTful APIs, Microservices architecture.
  • Knowledge of SQL / NoSQL databases (SQL Server, Snowflake, MongoDB).
  • Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
  • Familiarity with Cloud services (Azure, AWS, or GCP) is a plus.
  • Strong debugging and troubleshooting skills.


Nice-to-Have:

  • Experience with GraphQL, gRPC, WebSockets.
  • Exposure to serverless architecture and cloud-based solutions.
  • Knowledge of authentication/authorization frameworks (OAuth, JWT, Identity Server).
  • Experience with unit testing and integration testing.
Wissen Technology

Posted by Robin Silverster
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
Java
Spring Boot
Apache Kafka
Microservices
Multithreading
+2 more

Java Developer – Job Description

Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission as part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 4 to 7 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems, with good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multithreading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is part of Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from Ivy League universities like Wharton and MIT, as well as from IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

QAgile Services

Posted by Radhika Chotai
Bengaluru (Bangalore)
3 - 8 yrs
₹17L - ₹25L / yr
PySpark
Microsoft Windows Azure
Amazon Web Services (AWS)
SQL
+3 more

Employment type: Contract


Key Responsibilities

  • Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks.
  • Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
  • Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
  • Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
  • Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
  • Maintain documentation and implement best practices for data architecture, governance, and security.

⚙️ Required Skills

  • Programming: Proficient in PySpark, Python, and SQL.
  • Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
  • Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
  • Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
  • Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
  • CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
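For a sense of the PySpark pipeline work listed above, a minimal batch-transform sketch follows; the input and output paths and column names are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Hypothetical source and destination paths; illustrative only.
    orders = spark.read.option("header", True).csv("s3a://raw/orders.csv")

    daily_revenue = (
        orders
        .withColumn("amount", F.col("amount").cast("double"))
        .groupBy("order_date")
        .agg(F.sum("amount").alias("revenue"))
    )

    daily_revenue.write.mode("overwrite").parquet("s3a://curated/daily_revenue")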

🧰 Preferred Qualifications

  • Bachelor's or Master's in Computer Science, Engineering, or related field.
  • Certifications in Azure/AWS are highly desirable.
  • Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.



Deqode

Posted by Apoorva Jain
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Bengaluru (Bangalore), Mumbai, Nagpur, Ahmedabad, Kochi (Cochin), Chennai
6 - 11 yrs
₹4L - ₹15L / yr
Java
Spring Boot
Microservices
Apache Kafka
Spring

Job Description:

We are looking for a Senior Java Developer with strong expertise in Apache Kafka and backend systems. The ideal candidate will have hands-on experience in Java (8/11+), Spring Boot, and building scalable, real-time data pipelines using Kafka.

Key Responsibilities:

  • Develop and maintain backend services using Java and Spring Boot
  • Design and implement Kafka-based messaging and streaming solutions
  • Optimize Kafka performance (topics, partitions, consumers)
  • Collaborate with cross-functional teams to deliver scalable microservices
  • Ensure code quality and maintain best practices in a distributed environment

Required Skills:

  • 6+ years in Java development
  • 3+ years of hands-on Kafka experience (producers, consumers, streams)
  • Strong knowledge of Spring Boot, REST APIs, and microservices
  • Familiarity with Kafka Connect, Schema Registry, and stream processing
  • Experience with containerization (Docker), CI/CD, and cloud platforms (AWS/GCP/Azure)
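The Kafka tuning bullet above (topics, partitions, consumers) is the kind of work where message keying and batching matter; a small sketch follows, shown in Python for brevity even though this role is Java-focused, with broker and topic names hypothetical:

    from kafka import KafkaProducer

    # Keyed sends route all events for one account to the same partition,
    # preserving per-account ordering. Hypothetical names; illustrative only.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        acks="all",       # wait for all in-sync replicas before acking
        linger_ms=10,     # small batching window to improve throughput
        key_serializer=lambda k: k.encode("utf-8"),
        value_serializer=lambda v: v.encode("utf-8"),
    )
    for event_id in range(3):
        producer.send("payments", key="account-7", value=f"event-{event_id}")
    producer.flush()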


Chennai
4 - 8 yrs
₹6L - ₹18L / yr
skill iconJava
Spring Boot
Apache Kafka
skill iconPostgreSQL
RESTful APIs

We are seeking a highly skilled and experienced Full Stack Developer with a strong background in product engineering. The ideal candidate will have hands-on experience in Java Spring Boot, Kafka, PostgreSQL, and REST API development. You will be working in a collaborative, cross-cultural environment, contributing to the design, development, and deployment of scalable software products.


Key Responsibilities

  • Design, develop, test, and maintain high-quality software solutions across the full stack.
  • Build and implement RESTful APIs to support frontend and backend services.
  • Collaborate with cross-functional teams across different cultures and time zones.
  • Work closely with product and engineering teams to understand requirements and deliver features.
  • Ensure code quality, performance, and scalability.
  • Participate in code reviews and knowledge-sharing activities.

Required Skills & Qualifications

  • 5–6 years of experience as a Full Stack Developer.
  • Proven experience in Java Spring Boot development.
  • Proficiency in Kafka for real-time data streaming and messaging.
  • Strong knowledge of PostgreSQL and database design.
  • Experience in designing and implementing REST APIs.
  • Background in product engineering and working on scalable software platforms.
  • Excellent verbal and written communication skills.
  • Experience working in multi-cultural and global team environments.

Nice to Have (Preferred)

  • Experience with frontend frameworks (e.g., React, Angular, Vue).
  • Familiarity with CI/CD pipelines and DevOps practices.
  • Cloud platform experience (AWS).

KJBN labs

Posted by sakthi ganesh
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Hadoop
Apache Kafka
Spark
Python
Java
+8 more

Senior Data Engineer Job Description

Overview

The Senior Data Engineer will design, develop, and maintain scalable data pipelines and infrastructure to support data-driven decision-making and advanced analytics. This role requires deep expertise in data engineering, strong problem-solving skills, and the ability to collaborate with cross-functional teams to deliver robust data solutions.

Key Responsibilities


  • Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data pipelines to ingest, process, and transform large volumes of structured and unstructured data.
  • Data Architecture: Architect and maintain data storage solutions, including data lakes, data warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
  • Data Integration: Integrate data from diverse sources, including APIs, third-party systems, and streaming platforms, ensuring data quality and consistency.
  • Performance Optimization: Monitor and optimize data systems for performance, scalability, and cost, implementing best practices for partitioning, indexing, and caching.
  • Collaboration: Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable advanced analytics, machine learning, and reporting.
  • Data Governance: Implement data governance policies, ensuring compliance with data security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
  • Automation: Develop automated processes for data ingestion, transformation, and validation to improve efficiency and reduce manual intervention.
  • Mentorship: Guide and mentor junior data engineers, fostering a culture of technical excellence and continuous learning.
  • Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high availability and reliability of data systems.

Required Qualifications

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience: 5+ years of experience in data engineering or a related role, with a proven track record of building scalable data pipelines and infrastructure.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
    • Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., Redshift, BigQuery, Snowflake).
    • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and data integration frameworks.
    • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed systems.
    • Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and collaboration abilities.
    • Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
  • Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics, Google Professional Data Engineer) or relevant data engineering certifications.

Preferred Qualifications

  • Experience with real-time data processing and streaming architectures.
  • Familiarity with machine learning pipelines and MLOps practices.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data pipelines.
  • Experience in industries with high data complexity, such as finance, healthcare, or e-commerce.

Work Environment

  • Location: Hybrid/Remote/On-site (depending on company policy).
  • Team: Collaborative, cross-functional team environment with data scientists, analysts, and business stakeholders.
  • Hours: Full-time, with occasional on-call responsibilities for critical data systems.
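As a rough illustration of the streaming architectures this role mentions, here is a minimal Spark Structured Streaming sketch that reads from Kafka; the broker, topic, and paths are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("clickstream").getOrCreate()

    # Read a Kafka topic as an unbounded stream; hypothetical names.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")
        .option("subscribe", "clickstream")
        .load()
    )

    # Kafka rows carry key/value as binary; cast before processing.
    decoded = events.selectExpr("CAST(value AS STRING) AS json_payload")

    query = (
        decoded.writeStream.format("parquet")
        .option("path", "/data/clickstream")
        .option("checkpointLocation", "/data/checkpoints/clickstream")
        .start()
    )
    query.awaitTermination()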

Zenius IT Services Pvt Ltd
Bengaluru (Bangalore), Chennai, Hyderabad
4 - 9 yrs
₹6L - ₹18L / yr
Apache Kafka
DynamoDB
Redis
Amazon Web Services (AWS)
Windows Azure
+7 more

About the Role

We are looking for a skilled Backend Engineer with strong experience in building scalable microservices, integrating with distributed data systems, and deploying web APIs that serve UI applications in the cloud. You’ll work on high-performance systems involving Kafka, DynamoDB, Redis, and other modern backend technologies.


Responsibilities

  • Design, develop, and deploy backend microservices and APIs that power UI applications.
  • Implement event-driven architectures using Apache Kafka or similar messaging platforms.
  • Build scalable and highly available systems using NoSQL databases (e.g., DynamoDB, MongoDB).
  • Optimize backend systems using caching layers like Redis to enhance performance.
  • Ensure seamless deployment and operation of services in cloud environments (AWS, GCP, or Azure).
  • Write clean, maintainable, and well-tested code; contribute to code reviews and architecture discussions.
  • Collaborate closely with frontend, DevOps, and product teams to deliver integrated solutions.
  • Monitor and troubleshoot production issues and participate in on-call rotations as needed.


Required Qualifications

  • 3–7 years of professional experience in backend development.
  • Strong programming skills in one or more languages: Java, Python, Go, Node.js.
  • Hands-on experience with microservices architecture and API design (REST/gRPC).
  • Practical experience with Kafka, RabbitMQ, or other event streaming/message queue systems.
  • Solid knowledge of NoSQL databases, especially DynamoDB or equivalents.
  • Experience using Redis or Memcached for caching or pub/sub mechanisms.
  • Proficiency with cloud platforms (preferably AWS – e.g., Lambda, ECS, EKS, API Gateway).
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines.
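To illustrate the caching-layer bullet above, a minimal cache-aside sketch with the redis-py client follows; the key naming, TTL, and stand-in database read are hypothetical:

    import json

    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    def get_profile(user_id: str) -> dict:
        # Cache-aside read: try Redis first, fall back to the primary store.
        cached = r.get(f"profile:{user_id}")
        if cached is not None:
            return json.loads(cached)
        profile = {"id": user_id, "tier": "gold"}  # stand-in for a DynamoDB read
        r.setex(f"profile:{user_id}", 300, json.dumps(profile))  # 5-minute TTL
        return profile

    print(get_profile("u-123"))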


strektech
Posted by Venitha N
Chennai
5 - 10 yrs
₹15L - ₹30L / yr
Java
Apache Kafka
Messaging
Multithreading
Microservices

Hiring for Java Developer


Experience : 5 to 10 yrs

Notice Period : 0 to 15 days

Location : Pune

Work Mode : WFO (5 days)


As a Java developer, you will perform duties throughout the application development lifecycle, from concept and design through testing. Responsibilities include:

  • Develop high-level designs and define software architecture.
  • Implement and maintain quality systems within the group.
  • Proficiently estimate and design approaches, nimbly moving to alternate approaches if needed; develop and execute unit test strategies.
  • Monitor and track tasks, and report status.
  • Assist project heads to conceptualize, design, develop, test, and implement technology solutions.
  • Collaborate effectively with stakeholders and users to ensure customer satisfaction.

Skill Set:

Java 7 / Java 8 with microservices, multithreading, Spring Boot, JUnit, Kafka, Splunk (good to have), OpenShift (good to have), authentication / Spring Security (good to have)


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Hyderabad, Bengaluru (Bangalore), Mumbai, Pune, Gurugram, Chennai
4 - 10 yrs
₹5L - ₹20L / yr
Red Hat PAM Developer
Red Hat Process Automation Manager (PAM)
jBPM (Java Business Process Management)
BPMN 2.0 (Business Process Model and Notation)
+6 more

Job Title : Red Hat PAM Developer

Experience Required :

  • Relevant Experience : 4+ Years
  • Total Experience : Up to 8 Years

No. of Positions : 4

Work Locations : Hyderabad / Bangalore / Mumbai / Pune / Gurgaon / Chennai

Work Mode : Hybrid

Work Timings : 1:00 PM to 10:00 PM IST

Interview Mode : Virtual

Interview Rounds : 2


Mandatory Skills :

  • Excellent communication skills – must be comfortable in client-facing roles
  • Red Hat Process Automation Manager (PAM)
  • JBPM (Java Business Process Management)
  • BPMN 2.0 (Business Process Model and Notation) – low-code platform
  • DMN (Decision Model and Notation) – business processes and rules
  • Spring Boot
  • JavaScript

Good-to-Have Skills :

  • Red Hat Fuse
  • Apache Kafka
  • Apache Camel
Wissen Technology

Posted by Robin Silverster
Bengaluru (Bangalore)
11 - 18 yrs
Best in industry
Java
Spring Boot
Microservices
Multithreading
Data Structures
+3 more

Java Developer – Job Description


Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission as part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 5 to 12 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems, with good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multithreading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is part of Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from Ivy League universities like Wharton and MIT, as well as from IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 9 yrs
₹40L - ₹60L / yr
Python
SQL
Data engineering
Snowflake
ETL
+5 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. In this role, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization’s analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Lead and manage the data engineering team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 6 yrs
₹2L - ₹12L / yr
Python
Django
PostgreSQL
MySQL
SQL
+17 more

Job Title : Python Django Developer

Experience : 3+ Years

Location : Gurgaon

Working Days : 6 Days (Monday to Saturday)


Job Summary :

We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.

The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.


Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.


Key Responsibilities :

  • Write efficient, reusable, testable, and scalable code using the Django framework.
  • Develop backend components, server-side logic, and statistical models.
  • Design and implement high-availability, low-latency applications with robust data protection and security.
  • Contribute to the development of highly responsive web applications.
  • Collaborate with cross-functional teams on system design and integration.

Mandatory Skills :

  • Strong programming skills in Python and Django (or similar frameworks like Flask).
  • Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
  • Strong understanding of SQL and NoSQL ORM.
  • Solid grasp of data structures, multithreading, and operating system concepts.
  • Experience with RESTful API development and implementation of API security.
  • Knowledge of JSON/XML and their use in data exchange.
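As a small illustration of the REST/JSON skills listed above, here is a minimal Django view sketch using plain Django (no DRF); the route and response fields are hypothetical:

    from django.http import JsonResponse
    from django.urls import path

    def order_detail(request, order_id: int):
        # A real view would fetch the record from PostgreSQL via the ORM.
        return JsonResponse({"id": order_id, "status": "shipped"})

    # Hypothetical URL configuration; illustrative only.
    urlpatterns = [
        path("api/orders/<int:order_id>/", order_detail),
    ]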

Good-to-Have Skills :

  • Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
  • Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
  • Familiarity with MongoDB and other NoSQL databases
  • Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
  • Knowledge in building and integrating statistical learning models.
iris software

Posted by Parveen Kaur
Pune, Noida
5 - 8 yrs
₹20L - ₹25L / yr
Java
Apache Kafka
Spring Boot
Multithreading
Microservices
+1 more

Job Role: We are seeking a skilled Java Developer to contribute to the development and enhancement of a renowned banking application that supports automatic reconciliation and unified data reporting for its clients. This role involves working on high-impact enhancements, data pipeline integration, and platform modernization. The ideal candidate will be a self-motivated quick learner, able to ramp up in a fast-paced environment.


Key Responsibilities:

  • Design, develop, and maintain Java-based applications using Java 17 and Spring Boot.
  • Implement and manage message routing using Apache Camel.
  • Develop and monitor data pipelines using Kafka.
  • Support and enhance existing cloud-native applications.
  • Work with OpenShift Container Platform (OCP 4) for container orchestration and deployments.
  • Utilize Jenkins for CI/CD pipeline automation and management.
  • Collaborate with cross-functional teams to integrate multiple data sources into a unified reporting platform.
  • Participate in code reviews, unit testing, and performance tuning.
  • Troubleshoot and resolve production issues in collaboration with operations teams.
  • Document development processes and system configurations.



Required Skills:

  • Strong proficiency in Java 17 and Spring Boot frameworks.
  • Hands-on experience with Apache Camel for message routing and transformation.
  • Solid experience with Kafka development and monitoring tools.
  • Good understanding of cloud pipeline architectures and deployment strategies.
  • Experience working with OpenShift (OCP 4).
  • Familiarity with Jenkins for CI/CD and automated deployments.
  • Understanding of cloud deployment platforms (AWS, Azure, or GCP preferred).
  • Strong analytical and debugging skills.
  • Ability to learn quickly and adapt to evolving project requirements.



Nice to Have:

  • Experience in financial services or transaction reporting platforms.
  • Familiarity with microservices architecture and containerization best practices.
  • Knowledge of monitoring tools (e.g., Prometheus, Grafana).

Hyderabad
5 - 8 yrs
₹24L - ₹30L / yr
Apache Kafka
Elasticsearch
Node.js
ETL
Python
+2 more

Company Overview

We are a dynamic startup dedicated to empowering small businesses through innovative technology solutions. Our mission is to level the playing field for small businesses by providing them with powerful tools to compete effectively in the digital marketplace. Join us as we revolutionize the way small businesses operate online, bringing innovation and growth to local communities.


Job Description

We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will develop systems on cloud platforms capable of processing millions of interactions daily, leveraging the latest cloud computing and machine learning technologies while creating custom in-house data solutions. The ideal candidate should have hands-on experience with SQL, PL/SQL, and any standard ETL tools. You must be able to thrive in a fast-paced environment and possess a strong passion for coding and problem-solving.


Required Skills and Experience

  • Minimum 5 years of experience in software development.
  • 3+ years of experience in data management and SQL expertise – PL/SQL, Teradata, and Snowflake experience strongly preferred.
  • Expertise in big data technologies such as Hadoop, HiveQL, and Spark (Scala/Python).
  • Expertise in cloud technologies – AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).
  • Experience with queuing systems (e.g., SQS, Kafka) and caching systems (e.g., Ehcache, Memcached).
  • Experience with container management tools (e.g., Docker Swarm, Kubernetes).
  • Familiarity with data stores, including at least one of the following: Postgres, MongoDB, Cassandra, or Redis.
  • Ability to create advanced visualizations and dashboards to communicate complex findings (e.g., Looker Studio, Power BI, Tableau).
  • Strong skills in manipulating and transforming complex datasets for in-depth analysis.
  • Technical proficiency in writing code in Python and advanced SQL queries.
  • Knowledge of AI/ML infrastructure, best practices, and tools is a plus.
  • Experience in analyzing and resolving code issues.
  • Hands-on experience with software architecture concepts such as Separation of Concerns (SoC) and micro frontends with theme packages.
  • Proficiency with the Git version control system.
  • Experience with Agile development methodologies.
  • Strong problem-solving skills and the ability to learn quickly.
  • Exposure to Docker and Kubernetes.
  • Familiarity with AWS or other cloud platforms.
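For a flavor of the queuing-systems requirement above, here is a minimal SQS send/receive sketch with boto3; the queue URL is hypothetical and valid AWS credentials are assumed:

    import boto3

    sqs = boto3.client("sqs", region_name="us-east-1")
    # Hypothetical queue URL; illustrative only.
    queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"

    sqs.send_message(QueueUrl=queue_url, MessageBody='{"order_id": 42}')

    response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
    for message in response.get("Messages", []):
        print(message["Body"])
        # Delete after successful processing so it is not redelivered.
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])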


Responsibilities

  • Develop and maintain our in-house search and reporting platform.
  • Create data solutions that complement core products, improving performance and data quality.
  • Collaborate with the development team to design, develop, and maintain our suite of products.
  • Write clean, efficient, and maintainable code, adhering to coding standards and best practices.
  • Participate in code reviews and testing to ensure high-quality code.
  • Troubleshoot and debug application issues as needed.
  • Stay up-to-date with emerging trends and technologies in the development community.


How to apply?

  • If you are passionate about designing user-centric products and want to be part of a forward-thinking company, we would love to hear from you. Please send your resume and a brief cover letter outlining your experience and your current CTC (Cost to Company) as part of the application.


Join us in shaping the future of e-commerce!

Wissen Technology

Posted by Seema Srivastava
Bengaluru (Bangalore), Mumbai
5 - 10 yrs
Best in industry
Java
Spring Boot
Microservices
Amazon Web Services (AWS)
Apache Kafka
+1 more

Job Description: We are looking for a talented and motivated Software Engineer with expertise in both Windows and Linux operating systems and solid experience in Java technologies. The ideal candidate should be proficient in data structures and algorithms, as well as frameworks like Spring MVC, Spring Boot, and Hibernate. Hands-on experience working with MySQL databases is also essential for this role.


Responsibilities:

  • Design, develop, test, and maintain software applications using Java technologies.
  • Implement robust solutions using Spring MVC, Spring Boot, and Hibernate frameworks.
  • Develop and optimize database operations with MySQL.
  • Analyze and solve complex problems by applying knowledge of data structures and algorithms.
  • Work with both Windows and Linux environments to develop and deploy solutions.
  • Collaborate with cross-functional teams to deliver high-quality products on time.
  • Ensure application security, performance, and scalability.
  • Maintain thorough documentation of technical solutions and processes.
  • Debug, troubleshoot, and upgrade legacy systems when required.

Requirements:

  • Operating Systems: Expertise in Windows and Linux environments.
  • Programming Languages & Technologies: Strong knowledge of Java (Core Java, Java 8+).
  • Frameworks: Proficiency in Spring MVC, Spring Boot, and Hibernate.
  • Algorithms and Data Structures: Good understanding and practical application of DSA concepts.
  • Databases: Experience with MySQL – writing queries, stored procedures, and performance tuning.
  • Version Control Systems: Experience with tools like Git.
  • Deployment: Knowledge of CI/CD pipelines and tools such as Jenkins and Docker (optional).

Wissen Technology

Posted by Deepa Shankar
Bengaluru (Bangalore)
4 - 9 yrs
Best in industry
Java
Spring Boot
Microservices
Apache Kafka
Multitasking
+2 more

Required Skillset

  • Experience in Core Java 1.8 and above, data structures, OOP, multithreading, algorithms, collections, system design, and Unix/Linux.
  • Good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multithreading and high-volume server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills in Java.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.


Job Brief

  • Understand product requirements and come up with solution approaches.
  • Build and enhance large-scale, domain-centric applications.
  • Deploy high-quality deliverables into production, adhering to security, compliance, and SDLC guidelines.

Wissen Technology

Posted by Robin Silverster
Mumbai
5 - 12 yrs
Best in industry
Java
Spring Boot
Apache Kafka
Microservices
Data Structures
+2 more

Java Developer – Job Description

Wissen Technology is hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission as part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.

Required Skills:

  • 4 to 7 years of experience.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems, with good architectural knowledge and awareness of enterprise application design patterns.
  • Ability to analyze, design, develop, and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multithreading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem-solving and coding skills.
  • Strong interpersonal, communication, and analytical skills.
  • Ability to express design ideas and thoughts.

About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is part of Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies, and the Wissen Group overall includes more than 4,000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1,200+ highly skilled professionals, including leadership and senior management executives who have graduated from Ivy League universities like Wharton and MIT, as well as from IITs, IIMs, and NITs, with rich work experience at some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Wissen Technology

Posted by VenkataRamanan S
Bengaluru (Bangalore)
3 - 8 yrs
Best in industry
Spring Boot
Microservices
Apache Kafka

Key Responsibilities:

  • Design, develop, and maintain robust and scalable backend applications using Core Java and Spring Boot.
  • Build and manage microservices-based architectures and ensure smooth inter-service communication.
  • Integrate and manage real-time data streaming using Apache Kafka.
  • Write clean, maintainable, and efficient code following best practices.
  • Collaborate with cross-functional teams including QA, DevOps, and product management.
  • Participate in code reviews and provide constructive feedback.
  • Troubleshoot, debug, and optimize applications for performance and scalability.

Required Skills:

  • Strong knowledge of Core Java (Java 8 or above).
  • Hands-on experience with Spring Boot and the Spring ecosystem (Spring MVC, Spring Data, Spring Security).
  • Experience in designing and developing RESTful APIs.
  • Solid understanding of Microservices architecture and related patterns.
  • Practical experience with Apache Kafka for real-time data processing.
  • Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, or MongoDB.
  • Good understanding of CI/CD tools and practices.
  • Knowledge of containerization tools like Docker is a plus.
  • Strong problem-solving skills and attention to detail.


One Impression

Posted by Achin Sood
Gurugram
1 - 3 yrs
Best in industry
Problem solving
Data Structures
MySQL
MongoDB
DynamoDB
+9 more

We are looking for passionate people who love solving interesting and complex technology challenges and are enthusiastic about building an industry-first, innovative product to solve new-age, real-world problems. This role requires strategic leadership, the ability to manage complex technical challenges, and the drive to innovate while ensuring operational excellence. As a Backend SDE-2, you will collaborate with key stakeholders across business, product management, and operations to ensure alignment with the organization's goals, and you will play a critical role in shaping the technology roadmap and engineering culture.


Key Responsibilities


  • Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and prepare technology roadmaps.
  • Technical Excellence: Guide the team in designing and implementing scalable, extensible and secure software systems. Drive the adoption of best practices in technical architecture, coding standards, and software testing to ensure product delivery with highest speed AND quality.
  • Project and Program Management: Setting up aggressive as well as realistic timelines with all the stakeholders, ensure the successful delivery of engineering projects as per the defined timelines with best quality standards ensuring budget constraints are met. Use agile methodologies to manage the development process and resolve bottlenecks.
  • Cross-functional collaboration: Collaborate with Product Management, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
  • Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.


Required Qualifications


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 1-3 years of experience in software engineering
  • Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
  • Proven track record of successfully delivering large-scale, high-impact software projects.
  • Strong understanding of software design principles and patterns.
  • Expertise in multiple programming languages and modern development frameworks.
  • Experience with cloud infrastructure (AWS), microservices, and distributed systems.
  • Experience with relational and non-relational databases.
  • Experience with Redis, ElasticSearch.
  • Experience in DevOps, CI/CD pipelines, and infrastructure automation.
  • Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.


Skills:- MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, ElasticSearch

Read more
Cloudesign Technology Solutions
Anshul Saxena
Posted by Anshul Saxena
Noida, Hyderabad, Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹30L / yr
Apache Kafka
Kafka Cluster
Schema Registry
Streaming
Kafka
+1 more

Job Title: Tech Lead and SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project)

Experience: 4 to 12 years


Role Overview:

We are looking for a highly skilled Tech Lead with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.


Key Responsibilities:

  • Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
  • Architect scalable data streaming and processing solutions to support healthcare data workflows.
  • Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
  • Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
  • Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
  • Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
  • Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
  • Stay updated with the latest cloud technologies, big data frameworks, and industry trends.
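
As an illustration of the real-time pipeline work above, a minimal at-least-once consumer sketch in Python. It assumes the confluent-kafka client; the broker, group id, topic, and the process() step are all hypothetical placeholders.

    # Sketch: at-least-once Kafka consumption with manual offset commits
    # (confluent-kafka client; broker, group id, topic, and process() are assumptions).
    from confluent_kafka import Consumer

    def process(payload: bytes) -> None:
        print(payload)  # hypothetical step, e.g. validate and land into Databricks

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # assumption
        "group.id": "healthcare-ingest",      # assumption
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,          # commit only after successful processing
    })
    consumer.subscribe(["patient-events"])    # assumption: topic name

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print("Consumer error:", msg.error())
                continue
            process(msg.value())
            consumer.commit(message=msg)      # offsets advance only after success
    finally:
        consumer.close()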


Required Skills & Qualifications:

  • 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
  • Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
  • Experience with Azure Databricks (or willingness to learn and adopt it quickly).
  • Hands-on experience with cloud platforms (Azure preferred, AWS or GCP is a plus).
  • Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
  • Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
  • Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
  • Strong analytical skills, problem-solving mindset, and ability to lead complex data projects.
  • Excellent communication and stakeholder management skills.


One Impression
Posted by Achin Sood
Gurugram
0 - 2 yrs
Best in industry
Problem solving
Data Structures
MySQL
DynamoDB
MongoDB
+9 more

We're seeking passionate, next-gen-minded engineers who are excited about solving complex technical challenges and building innovative, first-of-its-kind products that make a tangible difference for our customers. As a Backend SDE-1, you will play a key role in driving strategic initiatives, collaborating with cross-functional teams across business, product, and operations to solve exciting problems. This role demands strong technical acumen, leadership capabilities, and a mindset focused on innovation and operational excellence.

We value individuals who think independently, challenge the status quo, and bring creativity and curiosity to the table—not those who simply follow instructions. If you're passionate about solving problems and making an impact, we'd love to hear from you.


Key Responsibilities


  • Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and constantly prioritise your own work.
  • Technical Excellence: Understand on-the-ground problems, explore and design possible solutions, and implement scalable, extensible, and secure software systems. Implement and learn best practices in technical architecture, coding standards, and software testing to ensure product delivery with the highest speed AND quality.
  • Project and Program Management: Set up aggressive yet realistic timelines with all stakeholders, and ensure the successful delivery of engineering projects within the defined timelines and to the best quality standards while meeting budget constraints. Use agile methodologies to manage the development process and resolve bottlenecks.
  • Cross-functional collaboration: Collaborate with Product Managers, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
  • Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.


Required Qualifications


  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • 1+ years of experience in software engineering.
  • Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
  • Strong understanding of software design principles and patterns.
  • Hands-on experience with multiple programming languages and modern development frameworks.
  • Understanding of relational and non-relational databases.
  • Experience with Redis, ElasticSearch.
  • Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.


Skills:- MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, ElasticSearch

Zazmic
Remote only
9 - 12 yrs
₹10L - ₹15L / yr
Python
Artificial Intelligence (AI)
Machine Learning (ML)
Amazon Web Services (AWS)
CI/CD
+5 more

Title: Senior Software Engineer – Python (Remote: Africa, India, Portugal)


Experience: 9 to 12 Years


INR : 40 LPA - 50 LPA


Location Requirement: Candidates must be based in Africa, India, or Portugal. Applicants outside these regions will not be considered.


Must-Have Qualifications:

  • 8+ years in software development with expertise in Python
  • Kubernetes experience is a must
  • Strong understanding of async frameworks (e.g., asyncio)
  • Experience with FastAPI, Flask, or Django for microservices
  • Proficiency with Docker and Kubernetes/AWS ECS
  • Familiarity with AWS, Azure, or GCP and IaC tools (CDK, Terraform)
  • Knowledge of SQL and NoSQL databases (PostgreSQL, Cassandra, DynamoDB)
  • Exposure to GenAI tools and LLM APIs (e.g., LangChain)
  • CI/CD and DevOps best practices
  • Strong communication and mentorship skills


Talent Pro
Bengaluru (Bangalore)
4 - 8 yrs
₹26L - ₹35L / yr
Java
Spring Boot
Google Cloud Platform (GCP)
Distributed Systems
Microservices
+3 more

Role & Responsibilities

Responsible for ensuring that the architecture and design of the platform remain top-notch with respect to scalability, availability, reliability and maintainability

Act as a key technical contributor as well as a hands-on contributing member of the team.

Own end-to-end availability and performance of features, driving rapid product innovation while ensuring a reliable service.

Working closely with the various stakeholders like Program Managers, Product Managers, Reliability and Continuity Engineering(RCE) team, QE team to estimate and execute features/tasks independently.

Maintain and drive tech backlog execution for non-functional requirements of the platform required to keep the platform resilient

Assist in release planning and prioritization based on technical feasibility and engineering constraints

Bring a zeal for continually finding new ways to improve architecture and design while ensuring timely delivery and high quality.

Trika Tech
Posted by bhagya a
Bengaluru (Bangalore), Coimbatore
7 - 8 yrs
₹12L - ₹25L / yr
NodeJS (Node.js)
AWS Lambda
Apache Kafka
Kubernetes

WHO WE ARE


We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.

We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.

We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.

We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.

 

WHAT YOU’LL BE DOING

As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.

Here’s what your day-to-day will look like:

  • Designing and Building Integrations
  • Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.

  • Cloud-Native Development
  • Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.

  • Managing Data Pipelines
  • Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.

  • Client Collaboration
  • Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.

  • Driving Platform Evolution
  • Contribute to the ongoing improvement of our iPaaS platform—enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.
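
To make the event-driven pattern above concrete, here is a hypothetical AWS Lambda handler consuming a batch of Kafka records delivered by an MSK trigger. The event shape follows AWS's documented MSK format; the topic contents and the downstream step are assumptions, and it is sketched in Python for brevity even though the platform described here is TypeScript-based.

    # Sketch: AWS Lambda handler for a Kafka (MSK) event-source trigger.
    # Record values arrive base64-encoded per AWS's MSK event format.
    import base64
    import json

    def handler(event, context):
        """Decode each batched Kafka record and hand it to a downstream step."""
        for topic_partition, records in event["records"].items():
            for record in records:
                payload = json.loads(base64.b64decode(record["value"]))
                print(record["topic"], record["offset"], payload)  # hypothetical downstream step
        return {"batchItemFailures": []}  # signal that no records in the batch failed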


WHAT WE NEED IN YOU


  • Solid Experience in Apache Kafka for data streaming and event-driven systems
  • Production experience with Kubernetes (EKS) and containerized deployments
  • Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
  • Proficient in TypeScript (Node.js environment)
  • Experience with MongoDB or other NoSQL databases
  • Familiarity with microservices architecture, async messaging, and DevOps practices
  • AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus


Qualification

  • Graduate - B.E. / B.Tech or equivalent.
  • 5 to 8 years of experience.
  • Self-motivated quick learner with excellent problem-solving skills.
  • A good team player with strong communication skills.
  • Energy and real passion to work in a startup environment.

Visit our website - https://www.trikatechnologies.com



Deqode
Posted by Roshni Maji
Mumbai
3 - 6 yrs
₹8L - ₹13L / yr
Amazon Web Services (AWS)
Terraform
Ansible
Docker
Apache Kafka
+6 more

Must be:

  • Based in Mumbai
  • Comfortable with Work from Office
  • Available to join immediately


Responsibilities:

  • Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
  • Work with Kubernetes, Docker, and Lambdas to build reliable, scalable infrastructure.
  • Build tools and automation using Python, Go, or relevant scripting languages.
  • Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
  • Optimize for performance and low latency in real-time systems using Kafka, gRPC, and RTP.
  • Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
  • Load testing using Gatling, JMeter, and ensuring fault tolerance and high availability.
  • Collaborate with dev teams and participate in on-call rotations.


Requirements:

  • B.E./B.Tech in CS, Engineering or equivalent experience.
  • 3+ years in production infra and cloud-based systems.
  • Strong background in Linux (RHEL/CentOS) and shell scripting.
  • Experience managing hybrid infrastructure (cloud + on-prem).
  • Strong testing practices and code quality focus.
  • Experience leading teams is a plus.
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore), Pune, Chennai
10 - 20 yrs
₹30L - ₹60L / yr
Java
Spring Boot
Microservices
Apache Kafka
Amazon Web Services (AWS)
+8 more

📍 Position : Java Architect

📅 Experience : 10 to 15 Years

🧑‍💼 Open Positions : 3+

📍 Work Location : Bangalore, Pune, Chennai

💼 Work Mode : Hybrid

📅 Notice Period : Immediate joiners preferred; up to 1 month maximum

🔧 Core Responsibilities :

  • Lead architecture design and development for scalable enterprise-level applications.
  • Own and manage all aspects of technical development and delivery.
  • Define and enforce best coding practices, architectural guidelines, and development standards.
  • Plan and estimate the end-to-end technical scope of projects.
  • Conduct code reviews, ensure CI/CD, and implement TDD/BDD methodologies.
  • Mentor and lead individual contributors and small development teams.
  • Collaborate with cross-functional teams, including DevOps, Product, and QA.
  • Engage in high-level and low-level design (HLD/LLD), solutioning, and cloud-native transformations.

🛠️ Required Technical Skills :

  • Strong hands-on expertise in Java, Spring Boot, Microservices architecture
  • Experience with Kafka or similar messaging/event streaming platforms
  • Proficiency in cloud platforms: AWS and Azure (must-have)
  • Exposure to frontend technologies (nice-to-have)
  • Solid understanding of HLD, system architecture, and design patterns
  • Good grasp of DevOps concepts, Docker, Kubernetes, and Infrastructure as Code (IaC)
  • Agile/Lean development, Pair Programming, and Continuous Integration practices
  • Polyglot mindset is a plus (Scala, Golang, Python, etc.)

🚀 Ideal Candidate Profile :

  • Currently working in a product-based environment
  • Already functioning as an Architect or Principal Engineer
  • Proven track record as an Individual Contributor (IC)
  • Strong engineering fundamentals with a passion for scalable software systems
  • No compromise on code quality, craftsmanship, and best practices

🧪 Interview Process :

  1. Round 1: Technical pairing round
  2. Rounds 2 & 3: Technical rounds with panel (code pairing + architecture)
  3. Final Round: HR and offer discussion
Talent Pro
Posted by Mayank choudhary
Bengaluru (Bangalore)
3 - 5 yrs
₹20L - ₹25L / yr
ETL
SQL
Apache Spark
Apache Kafka

Role & Responsibilities

About the Role:


We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.

Key responsibilities:


Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.

Implement efficient data modeling techniques to optimize performance and scalability of data systems.

Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.

Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.

Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.

Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.

Design and develop batch pipelines for scheduled data processing tasks.

Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.

Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.

Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

Bengaluru (Bangalore)
6 - 9 yrs
₹30L - ₹60L / yr
Python
Django
Flask
PostgreSQL
Apache Kafka
+2 more

Role & Responsibilities

Lead the design, development, and deployment of complex, scalable, reliable, and highly available features for world-class SaaS products and services.

Guide the engineering team in adopting best practices for software development, code quality, and architecture.

Make strategic architectural and technical decisions, ensuring the scalability, security, and performance of software applications.

Proactively identify, prioritize, and address technical debt to improve system performance, maintainability, and long-term scalability, ensuring a solid foundation for future development.

Collaborate with cross-functional teams (product managers, designers, and stakeholders) to define project scope, requirements, and timelines.

Mentor and coach team members, providing technical guidance and fostering professional development.

Oversee code reviews, ensuring adherence to best practices and maintaining high code quality standards.

Drive continuous improvement in development processes, tools, and technologies to increase team productivity and product quality.

Stay updated with the latest industry trends and emerging technologies to drive innovation and keep the team at the cutting edge.

Ensure project timelines and goals are met, managing risks and resolving any technical challenges that arise during development.

Foster a collaborative and inclusive team culture, promoting open communication and problem-solving.

Imbibe and maintain a strong customer delight attitude while designing and building products.

Talent Pro
Posted by Mayank choudhary
Bengaluru (Bangalore)
5 - 8 yrs
₹30L - ₹45L / yr
Spring Boot
Spring
Microservices
Java
Amazon Web Services (AWS)
+7 more

What we Require


We are recruiting technical experts with the following core skills and hands-on experience on


Mandatory skills: Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot

Hands-on experience with: Kafka, Redis, SQL, Docker, Kubernetes

Expert proficiency in designing both producer and consumer REST services.

Expert proficiency in Unit testing and Code Quality tools.

Expert proficiency in ensuring code coverage.

Expert proficiency in understanding High-Level Design and translating that to Low-Level design

Hands-on experience working with NoSQL databases.

Experience working in an Agile development process - Scrum.

Experience working closely with engineers and software cultures.

Ability to think at a high level about product strategy and customer journeys.

Ability to produce low-level designs that anticipate future extension of user journeys, and to translate them into components that can be easily extended and reused.

Excellent communication skills to clearly articulate design decisions.

Jio Tesseract
Posted by TARUN MISHRA
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai, Navi Mumbai
5 - 40 yrs
₹8.5L - ₹75L / yr
Microservices
Architecture
API
NOSQL Databases
MongoDB
+33 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.


Mon-Fri, In office role with excellent perks and benefits!


Key Responsibilities:

1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.

2. Build and implement scalable and robust microservices and integrate API gateways.

3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).

4. Implement real-time data pipelines using Kafka.

5. Collaborate with front-end developers to ensure seamless integration of backend services.

6. Write clean, reusable, and efficient code following best practices, including design patterns.

7. Troubleshoot, debug, and enhance existing systems for improved performance.


Mandatory Skills:

1. Proficiency in at least one backend technology: Node.js, Python, or Java.


2. Strong experience in:

i. Microservices architecture,

ii. API gateways,

iii. NoSQL databases (e.g., MongoDB, DynamoDB),

iv. Kafka

v. Data structures (e.g., arrays, linked lists, trees).


3. Frameworks:

i. If Java: Spring framework for backend development.

ii. If Python: FastAPI/Django frameworks for AI applications.

iii. If Node: Express.js for Node.js development.


Good to Have Skills:

1. Experience with Kubernetes for container orchestration.

2. Familiarity with in-memory databases like Redis or Memcached.

3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.

Wekan Enterprise Solutions
Posted by Deepak N
Bengaluru (Bangalore), Chennai
12 - 22 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Microservices
Javascript
TypeScript
+3 more

Architect


Experience - 12+ yrs


About Wekan Enterprise Solutions


Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IOT and Cloud environments, we have an extensive track record helping Fortune 500 companies modernize their most critical legacy and on-premise applications, migrating them to the cloud and leveraging the most cutting-edge technologies.

 

Job Description

We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced and challenging environments leading technical teams, designing system architecture and reviewing peer code. Interested in constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?

You will be working on complex data migrations, modernizing legacy applications and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams and MongoDB Solutions Architects.

Location - Chennai or Bangalore


●     Relevant experience of 12+ years building high-performance applications with at least 3+ years as an architect.

●     Good problem solving skills

●     Strong mentoring capabilities

●     Good understanding of software development life cycle

●     Strong experience in system design and architecture

●     Strong focus on quality of work delivered

●     Excellent verbal and written communication skills

 

Required Technical Skills

 

● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.

● Strong experience with appropriate framework(s).

● Well-versed in monolithic and microservices architectures.

● Hands-on experience with data modeling on MongoDB and any other Relational or NoSQL databases

● Experience working with 3rd party integrations ranging from authentication, cloud services, etc.

● Hands-on experience with Kafka or RabbitMQ.

● Hands-on experience with CI/CD pipelines and at least one cloud provider: AWS / GCP / Azure

● Strong experience writing and maintaining clear documentation

  

Good to have skills:

 

●     Experience working with frontend technologies - React.js, Vue.js or Angular.

●     Extensive experience consulting with customers directly for defining architecture or system design.

●     Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies

Talent Pro
Posted by Mayank choudhary
Remote only
8 - 13 yrs
₹70L - ₹90L / yr
Data engineering
Apache Spark
Apache Kafka
Java
Python
+6 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Talent Pro
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹70L - ₹80L / yr
Java
Go Programming (Golang)
NodeJS (Node.js)
Python
Apache Kafka
+7 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Talent Pro
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹50L - ₹70L / yr
Java
Data engineering
NodeJS (Node.js)
Python
Go Programming (Golang)
+5 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Remote only
7 - 12 yrs
₹25L - ₹40L / yr
Spark
Java
Apache Kafka
Big Data
Apache Hive
+5 more

Job Title: Big Data Engineer (Java Spark Developer – Java Spark experience is a must)

Location: Chennai, Hyderabad, Pune, Bangalore (Bengaluru) / NCR Delhi

Client: Premium Tier 1 Company

Payroll: Direct Client

Employment Type: Full time / Perm

Experience: 7+ years

 

Job Description:

We are looking for skilled Big Data Engineers with 7+ years of experience in Big Data / legacy platforms using Java Spark, who can join immediately. The desired candidate should have experience in the design, development and optimization of real-time and batch data pipelines in enterprise-scale Big Data environments. You will work on building scalable, high-performance data processing solutions, integrating real-time data streams, and building reliable data platforms. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.

 

Key Responsibilities:

·      Develop data pipelines using Java Spark and Kafka.

·      Optimize and maintain real-time data pipelines and messaging systems.

·      Collaborate with cross-functional teams to deliver scalable data solutions.

·      Troubleshoot and resolve issues in Java Spark and Kafka applications.
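
For candidates preparing, the shape of such a pipeline is easy to sketch with Spark Structured Streaming. The role calls for the Java Spark API, but the PySpark version below is structurally identical; the broker, topic, and paths are assumptions.

    # Sketch: Structured Streaming job reading from Kafka and writing Parquet.
    # Requires the spark-sql-kafka connector on the classpath; all names are assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-pipeline").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # assumption
        .option("subscribe", "transactions")               # assumption: topic
        .option("startingOffsets", "latest")
        .load()
    )

    # Kafka rows expose binary key/value columns; cast value to string for parsing.
    parsed = events.select(col("value").cast("string").alias("payload"))

    query = (
        parsed.writeStream
        .format("parquet")
        .option("path", "/data/transactions")              # assumption: sink path
        .option("checkpointLocation", "/chk/transactions") # required for fault tolerance
        .start()
    )
    query.awaitTermination()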

 

Qualifications:

·      Experience in Java Spark is a must

·      Knowledge and hands-on experience using distributed computing, real-time data streaming, and big data technologies

·      Strong problem-solving and performance optimization skills

·      Looking for immediate joiners

 

If interested, please share your resume along with the following details:

1)    Notice Period

2)    Current CTC

3)    Expected CTC

4)    Have experience in Java Spark - Y / N (this is a must)

5)    Any offers in hand

 

Thanks & Regards,

LION & ELEPHANTS CONSULTANCY PVT LTD TEAM

SINGAPORE | INDIA

 

Mernplus Technologies
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹17L / yr
Java
Camunda
Apache Camel
Apache Kafka
Karaf

We are seeking a skilled Java Developer with 5+ years of experience in Java, Camunda, Apache Camel, Kafka, and Apache Karaf. The ideal candidate should have expertise in workflow automation, message-driven architectures, and enterprise integration patterns. Strong problem-solving skills and hands-on experience in microservices and event-driven systems are required.

Gipfel & Schnell Consultings Pvt Ltd
Bengaluru (Bangalore)
5 - 12 yrs
Best in industry
DevOps
Azure
Terraform
PowerShell
Apache Kafka
+1 more

Mandatory Skills:


  • AZ-104 (Azure Administrator) experience
  • CI/CD migration expertise
  • Proficiency in Windows deployment and support
  • Infrastructure as Code (IaC) in Terraform
  • Automation using PowerShell
  • Understanding of SDLC for C# applications (build/ship/run strategy)
  • Apache Kafka experience
  • Azure web app


Good to Have Skills:


  • AZ-400 (Azure DevOps Engineer Expert)
  • AZ-700 Designing and Implementing Microsoft Azure Networking Solutions
  • Apache Pulsar
  • Windows containers
  • Active Directory and DNS
  • SAST and DAST tool understanding
  • MSSQL database
  • Postgres database
  • Azure security
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Noida
5 - 12 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Hibernate (Java)
Object Oriented Programming (OOPs)
Design patterns
+8 more

Position Title : Java Full Stack Developer

Location : Noida Sector 125

Experience : 5+ Years

Availability : Immediate


Job Summary :

We are looking for a Java Full Stack Developer with expertise in Microservices architecture to join our team.

The ideal candidate should have hands-on experience in Java, Spring Boot, Hibernate, and front-end technologies like Angular, JavaScript, and Bootstrap. You will work on enterprise-grade applications that enhance patient safety worldwide.


Key Responsibilities :

  • Design, develop, and maintain applications based on Microservices architecture.
  • Work with Java, Spring Boot, Hibernate, Angular, Kafka, Redis, and Hazelcast to build scalable solutions.
  • Utilize AWS, Git, Nginx, Tomcat, Oracle, Jira, Confluence, and Jenkins for development and deployment.
  • Collaborate with cross-functional teams to develop enterprise applications.
  • Develop intuitive UI/UX components using Bootstrap, jQuery, and JavaScript.
  • Ensure applications meet performance, scalability, and security requirements.
  • Participate in Agile development while efficiently handling changing priorities.
  • Conduct code reviews, debugging, and performance optimization.

Required Skills & Qualifications :

✔ 5+ years of hands-on experience in Java 7/8, Spring Boot, and Hibernate.

✔ Strong understanding of OOP concepts and Design Patterns.

✔ Experience working with relational databases like Oracle/MySQL.

✔ Proficiency in Bootstrap, JavaScript, jQuery, HTML, and Angular.

✔ Hands-on experience in Microservices-based application development.

✔ Strong problem-solving, debugging, and analytical skills.

✔ Excellent communication and collaboration skills.

✔ Ability to adapt to new technologies and handle multiple priorities.

✔ Experience in developing high-quality web applications.


Good to Have :

➕ Exposure to Kafka, Redis, and Hazelcast.

➕ Experience working with cloud-based solutions (AWS preferred).

➕ Familiarity with DevOps tools like Jenkins, Docker, and Kubernetes.


Why Join Us?

✅ Work on cutting-edge technologies and enterprise-level applications.

✅ Collaborative and innovative work environment.

✅ Competitive salary and career growth opportunities.

ZyBiSys
Posted by Subash S
Tiruchirappalli, Tamil Nadu
5 - 10 yrs
₹20L - ₹25L / yr
NodeJS (Node.js)
React.js
MongoDB
Go Programming (Golang)
Nginx
+14 more

Job Role: Senior Full Stack Developer

Location: Trichy

Job Type: Full Time

Experience Required: 5+ Years

Reporting to : Product Head


About Us:


At Zybisys Consulting Services LLP, we are a leading company in Cloud Managed Services and Cloud Computing. We believe in creating a vibrant and inclusive workplace where talented people can grow and succeed. We are looking for a dedicated leader who is passionate about supporting our team, developing talent, and enhancing our company culture.


Role Overview:

Are you a seasoned Full Stack Developer with a passion for crafting innovative solutions? We are looking for an experienced Senior Full Stack Developer to join our team and lead the development of such solutions.


Key Responsibilities:

  • Develop and Maintain Applications: Design, develop, and maintain scalable and efficient full-stack applications using modern technologies.
  • Database Design: Expertise in both relational and NoSQL databases, including schema design, query optimization, and data modeling.
  • Collaborate with Teams: Work closely with front-end and back-end developers along with the Engineering team to integrate and optimize APIs and services.
  • Implement Best Practices: Ensure high-quality code, adherence to best practices, and efficient use of technologies.
  • Troubleshoot and Debug: Identify and resolve complex issues, providing solutions and improvements.
  • Code Review and Quality Assurance: Skill in reviewing code, ensuring adherence to coding standards, and implementing best practices for software quality.
  • Agile Methodologies: Experience with Agile frameworks (e.g., Scrum, Kanban) to facilitate iterative development and continuous improvement.
  • Test-Driven Development (TDD): Knowledge of TDD practices, writing unit tests, and integrating automated testing (CI/CD) into the development workflow.
  • Technical Documentation: Ability to write clear and concise technical documentation for codebases, APIs, and system architecture.


Technical Skills:

  • Backend: Node.js, Express.js, Python, Golang, gRPC
  • Frontend: React.js, Next.js, HTML, HTML5, CSS3, jQuery
  • Database: MongoDB, MySQL, Redis, OpenSearch
  • API : RESTful APIs, SOAP services, or GraphQL
  • Tools & Technologies: Docker, Git, Kafka
  • Design & Development: Figma, Linux
  • Containers & container orchestration: Docker, Kubernetes
  • Networking & OS Knowledge


What We Offer:

  • Growth Opportunities: Expand your skills and career within a forward-thinking company.
  • Collaborative Environment: Join a team that values innovation and teamwork.


If you're ready to take on exciting challenges and work in a collaborative environment, we'd love to hear from you!


Apply now to join our team as a Senior Full Stack Developer and make waves with your skills!

Intellikart Ventures LLP
Posted by Prajwal Shinde
Pune
2 - 5 yrs
₹9L - ₹15L / yr
Power BI
SQL
ETL
Snowflake
Apache Kafka
+1 more

Experience: 4+ years.

Location: Vadodara & Pune

Skills Set- Snowflake, Power Bi, ETL, SQL, Data Pipelines

What you'll be doing:

  • Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
  • Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
  • Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
  • Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
  • Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
  • Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
  • Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
  • Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
  • Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
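
As a simplified illustration of the Kafka-to-Snowflake ingestion described above (in production the Kafka Connect / Snowpipe route named in this posting is the better fit), here is a micro-batch loader sketch using kafka-python and snowflake-connector-python. Every credential, topic, and table name is an assumption.

    # Sketch: micro-batched loading of Kafka messages into a Snowflake table.
    # All connection details, the topic, and the table are assumptions.
    from kafka import KafkaConsumer
    import snowflake.connector

    consumer = KafkaConsumer(
        "events",                              # assumption: source topic
        bootstrap_servers="broker:9092",       # assumption
        value_deserializer=lambda b: b.decode("utf-8"),
    )

    conn = snowflake.connector.connect(
        account="my_account", user="loader", password="...",  # assumptions
        warehouse="LOAD_WH", database="RAW", schema="EVENTS",
    )
    cur = conn.cursor()

    batch = []
    for msg in consumer:
        batch.append((msg.value,))
        if len(batch) >= 1000:                 # flush in micro-batches
            cur.executemany("INSERT INTO raw_events (payload) VALUES (%s)", batch)
            conn.commit()
            batch.clear()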


What you need:

Basic Skills:


  • 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
  • Strong experience with Apache Kafka for stream processing and real-time data integration.
  • Proficiency in SQL and ETL/ELT processes.
  • Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
  • Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
  • Knowledge of data governance, security, and compliance best practices.
  • Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
  • Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.


Responsibilities:

  • Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
  • Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
  • Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
  • Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
  • Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
  • Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
  • Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
  • Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
  • Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
  • Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform. 


Chennai
5 - 7 yrs
₹15L - ₹25L / yr
Apache Kafka
Google Cloud Platform (GCP)
BCP
DevOps

Job description

Location: Chennai, India

Experience: 5+ Years

Certification: Kafka Certified (mandatory); additional certifications are a plus


Job Overview:

We are seeking an experienced DevOps Engineer specializing in GCP Cloud Infrastructure Management and Kafka Administration. The ideal candidate should have 5+ years of experience in cloud technologies, Kubernetes, and Kafka, with a mandatory Kafka certification.


Key Responsibilities:

Cloud Infrastructure Management:

· Manage and update Kubernetes (K8s) on GKE.

· Monitor and optimize K8s resources, including pods, storage, memory, and costs.

· Oversee the general monitoring and maintenance of environments using:

o OpenSearch / Kibana

o KafkaUI

o BGP

o Grafana / Prometheus


Kafka Administration:

· Manage Kafka brokers and ACLs.

· Hands-on experience in Kafka administration (preferably Confluent Kafka).

· Independently debug, optimize, and implement Kafka solutions based on developer and business needs.
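
For example, granting a consumer read access to a topic can be done programmatically. A sketch with the confluent-kafka Python AdminClient follows (the ACL API assumes v1.9+ of the client); the broker, principal, and topic are assumptions.

    # Sketch: granting a principal READ access to a topic via the Kafka Admin API.
    # Broker, principal, and topic are assumptions.
    from confluent_kafka.admin import (
        AclBinding, AclOperation, AclPermissionType,
        AdminClient, ResourcePatternType, ResourceType,
    )

    admin = AdminClient({"bootstrap.servers": "broker:9092"})  # assumption

    # Allow User:analytics to READ topic "orders" from any host.
    binding = AclBinding(
        ResourceType.TOPIC, "orders", ResourcePatternType.LITERAL,
        "User:analytics", "*", AclOperation.READ, AclPermissionType.ALLOW,
    )

    for acl, future in admin.create_acls([binding]).items():
        future.result()  # raises if the broker rejected the ACL
        print("Created:", acl)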


Other Responsibilities:

· Perform random investigations to troubleshoot and enhance infrastructure.

· Manage PostgreSQL databases efficiently.

· Administer Jenkins pipelines, supporting CI/CD implementation and maintenance.


Required Skills & Qualifications:

· Kafka Certified Engineer (Mandatory).

· 5+ years of experience in GCP DevOps, Cloud Infrastructure, and Kafka Administration.

· Strong expertise in Kubernetes (K8s), Google Kubernetes Engine (GKE), and cloud environments.

· Hands-on experience with monitoring tools like Grafana, Prometheus, OpenSearch, and Kibana.

· Experience managing PostgreSQL databases.

· Proficiency in Jenkins pipeline administration.

· Ability to work independently and collaborate with developers and business stakeholders.

If you are passionate about DevOps, Cloud Infrastructure, and Kafka, and meet the above qualifications, we encourage you to apply!


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Gurugram
5 - 12 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+17 more

Job Title : Senior AWS Data Engineer

Experience : 5+ Years

Location : Gurugram

Employment Type : Full-Time

Job Summary :

Seeking a Senior AWS Data Engineer with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.

Key Responsibilities :

  • Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
  • Maintain data lakes & warehouses for analytics.
  • Ensure data integrity through quality checks.
  • Collaborate with data scientists & engineers to deliver solutions.

Qualifications :

  • 7+ Years in Data Engineering.
  • Expertise in AWS services, SQL, Python, Spark, Kafka.
  • Experience with CI/CD, DevOps practices.
  • Strong problem-solving skills.

Preferred Skills :

  • Experience with Snowflake, Databricks.
  • Knowledge of BI tools (Tableau, Power BI).
  • Healthcare/Insurance domain experience is a plus.
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Gurugram
7 - 15 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+20 more

Job Title : Tech Lead - Data Engineering (AWS, 7+ Years)

Location : Gurugram

Employment Type : Full-Time


Job Summary :

Seeking a Tech Lead - Data Engineering with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.


Key Responsibilities :

  • Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
  • Maintain data lakes & warehouses for analytics.
  • Ensure data integrity through quality checks.
  • Collaborate with data scientists & engineers to deliver solutions.

Qualifications :

  • 7+ Years in Data Engineering.
  • Expertise in AWS services, SQL, Python, Spark, Kafka.
  • Experience with CI/CD, DevOps practices.
  • Strong problem-solving skills.

Preferred Skills :

  • Experience with Snowflake, Databricks.
  • Knowledge of BI tools (Tableau, Power BI).
  • Healthcare/Insurance domain experience is a plus.


NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 8 yrs
₹2L - ₹15L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
+10 more

Job Title : Full Stack Developer (Python + React.js)

Location : Gurgaon (Work From Office, 6 days a week)

Experience : 3+ Years


Job Overview :

We are looking for a skilled Full Stack Developer proficient in Python (Django) and React.js to develop scalable web applications. The ideal candidate must have experience in backend and frontend development, database management, and cloud technologies.


Mandatory Skills :

Python, Django (Backend Development)

PostgreSQL (Database Management)

AWS (Cloud Services)

RabbitMQ, Redis, Kafka, Celery (Messaging & Asynchronous Processing)

React.js (Frontend Development)


Key Requirements :

  • 3+ Years of experience in Full Stack Development.
  • Strong expertise in RESTful APIs & Microservices.
  • Experience with CI/CD, Git, and Agile methodologies.
  • Strong problem-solving and communication skills.
NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Gurugram
3 - 8 yrs
₹3L - ₹10L / yr
Python
Django
Flask
MySQL
PostgreSQL
+6 more

Job Title : Python Django Developer

Location : Gurgaon (On-site)

Work Mode : 6 Days a Week (Work from Office)

Experience Level : 3+ Years


About the Role :

We are seeking a highly skilled and motivated Python Django Developer to join our team in Gurgaon. This role requires a hands-on developer with expertise in building scalable web applications and APIs using Python and Django. The ideal candidate will have a strong background in relational databases, message brokers, and distributed systems.


Key Responsibilities :

  • Design, develop, and maintain robust, scalable, and secure web applications using Python and Django.
  • Build and optimize back-end services, RESTful APIs, and integrations with third-party tools.
  • Implement and maintain asynchronous task processing using Celery and RabbitMQ.
  • Work with PostgreSQL to design and optimize database schemas and queries.
  • Utilize Redis and Kafka for caching, data streaming, and other distributed system needs.
  • Debug and troubleshoot issues across the application stack.
  • Collaborate with cross-functional teams to gather requirements and deliver solutions.
  • Ensure code quality through comprehensive testing, code reviews, and adherence to best practices.
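
A minimal sketch of the Celery-over-RabbitMQ pattern referenced above; the broker URL, result backend, and task body are assumptions.

    # Sketch: asynchronous task processing with Celery over RabbitMQ.
    # Broker URL, result backend, and the task body are assumptions.
    from celery import Celery

    app = Celery(
        "reports",
        broker="amqp://guest:guest@localhost:5672//",  # assumption: local RabbitMQ
        backend="redis://localhost:6379/0",            # assumption: Redis result store
    )

    def build_report(report_id: int) -> str:
        return f"report-{report_id}.pdf"  # hypothetical stand-in for real work

    @app.task(bind=True, max_retries=3)
    def generate_report(self, report_id: int) -> str:
        """Run report generation outside the request/response cycle, with retries."""
        try:
            return build_report(report_id)
        except Exception as exc:
            raise self.retry(exc=exc, countdown=30)  # re-queue with a 30-second delay

A Django view would then enqueue work with generate_report.delay(report_id=7) and return immediately.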


Required Skills and Qualifications:

Technical Expertise:

  • Proficiency in Python and strong experience with Django framework.
  • Hands-on experience with PostgreSQL for database design and management.
  • Familiarity with RabbitMQ, Celery, and Redis for asynchronous processing and caching.
  • Experience with Kafka for building real-time data pipelines and event-driven architectures.

Other Skills:

  • Strong understanding of software development best practices and design patterns.
  • Proficiency in writing efficient, reusable, and testable code.
  • Good knowledge of Linux/Unix environments.
  • Familiarity with Docker and containerized deployments is a plus.

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Good communication and teamwork abilities.
  • Ability to work independently and in a collaborative team environment.

Preferred Qualifications:

  • Experience in microservices architecture.
  • Exposure to DevOps tools and practices.
  • Knowledge of front-end technologies like React or Angular is a bonus.
Rigel Networks Pvt Ltd
Posted by Minakshi Soni
Bengaluru (Bangalore), Pune, Mumbai, Chennai
8 - 12 yrs
₹8L - ₹10L / yr
Amazon Web Services (AWS)
Terraform
Amazon Redshift
Redshift
Snowflake
+16 more

Dear Candidate,


We are urgently hiring an AWS Cloud Engineer for the Bangalore location.

Position: AWS Cloud Engineer

Location: Bangalore

Experience: 8-11 yrs

Skills: AWS Cloud

Salary: Best in industry (20-25% hike on the current CTC)

Note:

Only immediate joiners or those with up to 15 days' notice will be preferred.

Only candidates from Tier 1 companies will be shortlisted and selected.

Candidates with a notice period of more than 30 days will be rejected during screening.

Offer shoppers will be rejected.


Job description:

Title: AWS Cloud Engineer

Prefer BLR / HYD – else any location is fine

Work Mode: Hybrid – based on HR rule (currently 1 day per month)


Shift Timings: 24x7 (work in shifts on a rotational basis)

Total experience: 8+ years, of which 5 years of relevant experience is required.

Must have: AWS platform, Terraform, Redshift / Snowflake, Python / shell scripting



Experience and Skills Requirements:


Experience:

8 years of experience in a technical role working with AWS


Mandatory

Technical troubleshooting and problem solving

AWS management of large-scale IaaS/PaaS solutions

Cloud networking and security fundamentals

Experience using containerization in AWS

Working data warehouse knowledge; Redshift and Snowflake preferred

Working with IaC – Terraform and CloudFormation

Working understanding of scripting languages including Python and Shell

Collaboration and communication skills

Highly adaptable to changes in a technical environment

 

Optional

Experience using monitoring and observability toolsets, incl. Splunk and Datadog

Experience using GitHub Actions

Experience using AWS RDS/SQL based solutions

Experience working with streaming technologies, incl. Kafka and Apache Flink

Experience working with ETL environments

Experience working with the Confluent Cloud platform


Certifications:


Minimum

AWS Certified SysOps Administrator – Associate

AWS Certified DevOps Engineer - Professional



Preferred


AWS Certified Solutions Architect – Associate


Responsibilities:


Responsible for the technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a shared managed service.


The following is a list of expected responsibilities:


To manage and support a customer’s AWS platform

To be technical hands on

Provide Incident and Problem management on the AWS IaaS and PaaS Platform

Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner

Actively monitor an AWS platform for technical issues

To be involved in the resolution of technical incident tickets

Assist in the root cause analysis of incidents

Assist with improving efficiency and processes within the team

Examining traces and logs

Working with third party suppliers and AWS to jointly resolve incidents


Good to have:


Confluent Cloud

Snowflake




Best Regards,

Minakshi Soni

Executive - Talent Acquisition (L2)

Rigel Networks

Worldwide Locations: USA | HK | IN 
