
50+ Apache Kafka Jobs in India

Apply to 50+ Apache Kafka Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Kafka Jobs and apply today!

Trellissoft Inc.

Posted by Nikita Sinha
Bengaluru (Bangalore)
6 - 9 yrs
Upto ₹25L / yr (Varies)
Data Warehouse (DWH)
SQL
ETL
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
+3 more

We’re looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.


Responsibilities:

  • Lead the design of data warehouses, lakes, and ETL workflows.
  • Collaborate with teams to gather requirements and build scalable solutions.
  • Ensure data governance, security, and optimal performance of systems.
  • Mentor junior engineers and drive end-to-end project delivery.

Requirements:

  • 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
  • Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
  • Expertise in big data tools (e.g., Apache Spark, Kafka).
  • Excellent communication skills and leadership abilities.

Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.
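By way of illustration only (not part of this posting), the ETL flow such roles build can be sketched as a minimal extract/transform/load pipeline; every table and field name below is invented:

```python
# Minimal, hypothetical ETL sketch: extract rows, transform (validate/derive), load.
# Real pipelines would use an orchestrator (e.g., Airflow) and a proper warehouse.

def extract(source_rows):
    """Stand-in for the extract step; in practice this queries a database or API."""
    return list(source_rows)

def transform(rows):
    """Normalize types, skipping malformed records."""
    out = []
    for row in rows:
        try:
            out.append({"customer_id": int(row["id"]),
                        "amount": round(float(row["amount"]), 2)})
        except (KeyError, ValueError):
            continue  # a real pipeline would log or quarantine bad records
    return out

def load(rows, warehouse):
    """Append transformed rows to an in-memory stand-in for a warehouse table."""
    warehouse.setdefault("fact_sales", []).extend(rows)
    return len(rows)

warehouse = {}
raw = [{"id": "1", "amount": "19.99"}, {"id": "x", "amount": "?"}]
loaded = load(transform(extract(raw)), warehouse)  # malformed second row is skipped
```

The same three-stage shape carries over to the ETL tools named in the posting, with the orchestrator handling scheduling and retries.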

Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Indore, Ahmedabad
9 - 15 yrs
₹25L - ₹38L / yr
skill iconJava
J2EE
Microservices
Apache Kafka
SQL
+17 more

We are in search of a proficient Java Principal Engineer with a minimum of 10 years' experience in designing and developing Java applications. The ideal candidate will demonstrate a deep understanding of Java technologies, including Java EE, Spring Framework, and Hibernate. Proficiency in database technologies such as MySQL, Oracle, or PostgreSQL is essential, along with a proven track record of delivering high-quality, scalable, and efficient Java solutions.



We are looking for you!

You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices along the way. You should have experience creating visually compelling designs that effectively communicate our message and engage our target audience. The ideal candidate is creative, proactive, a go-getter, and motivated to look for ways to add value to job accomplishments.


As an ideal candidate for the Java Lead position, you bring a wealth of experience and expertise in Java development, combined with strong leadership qualities. Your proven track record showcases your ability to lead and mentor teams to deliver high-quality, enterprise-grade applications.


Your technical proficiency and commitment to excellence make you a valuable asset in driving innovation and success within our development projects. You possess a team-oriented mindset and a "get-it-done" attitude, inspiring your team members to excel and collaborate effectively. 


You have a proven ability to lead mid to large size teams, emphasizing a quality-first approach and ensuring that projects are delivered on time and within scope. As a Java Lead, you are responsible for overseeing project planning, implementing best practices, and driving technical solutions that align with business objectives.


You collaborate closely with development managers, architects, and cross-functional teams to design scalable and robust Java applications.

Your proactive nature and methodical approach enable you to identify areas for improvement, mentor team members, and foster a culture of continuous learning and growth.


Your leadership style, technical acumen, and dedication to delivering excellence make you an ideal candidate to lead our Java development initiatives and contribute significantly to the success and innovation of our organization.


What You Will Do: 

  • Design and develop RESTful web services.  
  • Hands-on database experience (Oracle / PostgreSQL / MySQL / SQL Server).  
  • Hands-on experience developing web applications leveraging the Spring Framework.  
  • Hands-on experience developing microservices leveraging Spring Boot.  
  • Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.   
  • Continuous Integration tools (Jenkins & GitLab) and CI/CD tooling. 
  • Strong believer in and follower of agile methodologies, with an emphasis on quality- and standards-based development. 
  • Architect, design, and implement complex software systems using relevant technologies (e.g., Java, Python, Node.js). 


What we need?

  • B.Tech in Computer Science or equivalent  
  • 10+ years of relevant experience in Java/J2EE technologies  
  • Experience building back-end APIs using the Spring Boot Framework, Spring DI, and Spring AOP  
  • Real-time messaging integration using Kafka or a similar framework  
  • Experience in at least one database: Oracle, SQL Server, or PostgreSQL
  • Previous experience managing and leading high-performing software engineering teams.   


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad
7 - 15 yrs
₹20L - ₹30L / yr
Java
J2EE
Spring Boot
Amazon Web Services (AWS)
+12 more

As a Lead Java Developer, you will take charge of driving the development and delivery of high-quality Java-based applications. Your role will involve leading a team of developers, providing technical guidance, and overseeing the entire software development life cycle. With your deep understanding of Java programming and related frameworks, you will design and implement scalable and efficient solutions that meet the project requirements. Your strong problem-solving skills and attention to detail will ensure the code quality and performance of the applications. Additionally, you will stay updated with the latest industry trends and best practices to improve the development processes continuously and contribute to the success of the team.


What You Will Do: 

  • Design and develop RESTful web services.
  • Hands-on database experience (Oracle / PostgreSQL / MySQL / SQL Server).
  • Hands-on experience developing web applications leveraging the Spring Framework.
  • Hands-on experience developing microservices leveraging Spring Boot.
  • Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.
  • Continuous Integration tools (Jenkins & GitLab) and CI/CD tooling.
  • Strong believer in and follower of agile methodologies, with an emphasis on quality- and standards-based development.
  • Architect, design, and implement complex software systems using relevant technologies (e.g., Java, Python, Node.js).


What we need?

  • B.Tech in Computer Science or equivalent
  • 8+ years of relevant experience in Java/J2EE technologies
  • Experience building back-end APIs using the Spring Boot Framework, Spring DI, and Spring AOP
  • Real-time messaging integration using Kafka or a similar framework
  • Experience in at least one database: Oracle, SQL Server, or PostgreSQL
  • Previous experience managing and leading high-performing software engineering teams.


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


Deqode

Posted by Roshni Maji
Mumbai
3 - 6 yrs
₹8L - ₹13L / yr
Amazon Web Services (AWS)
Terraform
Ansible
Docker
Apache Kafka
+6 more

Must be:

  • Based in Mumbai
  • Comfortable with Work from Office
  • Available to join immediately


Responsibilities:

  • Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
  • Work with Kubernetes, Docker, Lambdas to build reliable, scalable infrastructure.
  • Build tools and automation using Python, Go, or relevant scripting languages.
  • Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
  • Optimize for performance and low-latency in real-time systems using Kafka, gRPC, RTP.
  • Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
  • Load testing using Gatling, JMeter, and ensuring fault tolerance and high availability.
  • Collaborate with dev teams and participate in on-call rotations.


Requirements:

  • B.E./B.Tech in CS, Engineering or equivalent experience.
  • 3+ years in production infra and cloud-based systems.
  • Strong background in Linux (RHEL/CentOS) and shell scripting.
  • Experience managing hybrid infrastructure (cloud + on-prem).
  • Strong testing practices and code quality focus.
  • Experience leading teams is a plus.
NonStop io Technologies Pvt Ltd
Posted by Kalyani Wadnere
Pune
2 - 4 yrs
Best in industry
AWS Lambda
databricks
Database migration
Apache Kafka
Apache Spark
+3 more

About NonStop io Technologies:

NonStop io Technologies is a value-driven company with a strong focus on process-oriented software engineering. We specialize in Product Development and have a decade's worth of experience in building web and mobile applications across various domains. NonStop io Technologies follows core principles that guide its operations and believes in staying invested in a product's vision for the long term. We are a small but proud group of individuals who believe in the 'givers gain' philosophy and strive to provide value in order to seek value. We are committed to and specialize in building cutting-edge technology products and serving as trusted technology partners for startups and enterprises. We pride ourselves on fostering innovation, learning, and community engagement. Join us to work on impactful projects in a collaborative and vibrant environment.

Brief Description:

We are looking for a talented Data Engineer to join our team. In this role, you will design, implement, and manage data pipelines, ensuring the accessibility and reliability of data for critical business processes. This is an exciting opportunity to work on scalable solutions that power data-driven decisions.

Skillset:

Here is a list of some of the technologies you will work with (the list below is not set in stone):

Data Pipeline Orchestration and Execution:

● AWS Glue

● AWS Step Functions

● Databricks

Change Data Capture:

● Amazon Database Migration Service

● Amazon Managed Streaming for Apache Kafka with Debezium Plugin

Batch:

● AWS step functions (and Glue Jobs)

● Asynchronous queueing of batch job commands with RabbitMQ to various “ETL Jobs”

● Cron and supervisord processing on dedicated job server(s): Python & PHP

Streaming:

● Real-time processing via AWS MSK (Kafka), Apache Hudi, & Apache Flink

● Near real-time processing via worker (listeners) spread over AWS Lambda, custom server (daemons) written in Python and PHP Symfony

● Languages: Python & PySpark, Unix Shell, PHP Symfony (with Doctrine ORM)

● Monitoring & Reliability: Datadog & Cloudwatch
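As a hedged illustration of the change-data-capture path above (MSK with the Debezium plugin), a consumer can route change events by their envelope fields; the "op"/"before"/"after" keys follow Debezium's documented event format, while the record contents below are invented:

```python
import json

def route_change_event(raw_value):
    """Map a Debezium-style change event to an upsert/delete action."""
    event = json.loads(raw_value)["payload"]
    op = event["op"]  # "c"=create, "u"=update, "d"=delete, "r"=snapshot read
    if op in ("c", "r", "u"):
        return ("upsert", event["after"])
    if op == "d":
        return ("delete", event["before"])
    raise ValueError(f"unknown op: {op}")

# Invented sample record, shaped like a Debezium envelope consumed from Kafka:
sample = json.dumps({
    "payload": {"op": "u",
                "before": {"id": 7, "email": "old@example.com"},
                "after": {"id": 7, "email": "new@example.com"}}
}).encode()
action = route_change_event(sample)  # ("upsert", {"id": 7, "email": "new@example.com"})
```

In production the raw bytes would arrive from a Kafka consumer and the resulting actions would be applied to the lakehouse (e.g., as Hudi upserts).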

Things you will do:

● Build dashboards using Datadog and Cloudwatch to ensure system health and user support

● Build schema registries that enable data governance

● Partner with end-users to resolve service disruptions and evangelize our data product offerings

● Vigilantly oversee data quality and alert upstream data producers of issues

● Support and contribute to the data platform architecture strategy, roadmap, and implementation plans to support the company’s data-driven initiatives and business objectives

● Work with Business Intelligence (BI) consumers to deliver enterprise-wide fact and dimension data product tables to enable data-driven decision-making across the organization.

● Other duties as assigned

NeoGenCode Technologies Pvt Ltd
Posted by Shivank Bhardwaj
Remote, Bengaluru (Bangalore), Pune, Chennai
10 - 15 yrs
₹10L - ₹55L / yr
Java
Spring Boot
Amazon Web Services (AWS)
Microservices
High-level design
+5 more

We are looking for a highly skilled Solution Architect with a passion for software engineering and deep experience in backend technologies, cloud, and DevOps. This role will be central in managing, designing, and delivering large-scale, scalable solutions.


Core Skills

  • Strong coding and software engineering fundamentals.
  • Experience in large-scale custom-built applications and platforms.
  • Champion of SOLID principles, OO design, and pair programming.
  • Agile, Lean, and Continuous Delivery – CI, TDD, BDD.
  • Frontend experience is a plus.
  • Hands-on with Java, Scala, Golang, Rust, Spark, Python, and JS frameworks.
  • Experience with Docker, Kubernetes, and Infrastructure as Code.
  • Excellent understanding of cloud technologies – AWS, GCP, Azure.


Responsibilities

  • Own all aspects of technical development and delivery.
  • Understand project requirements and create architecture documentation.
  • Ensure adherence to development best practices through code reviews.


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore), Pune, Chennai
10 - 20 yrs
₹30L - ₹60L / yr
Java
Spring Boot
Microservices
Apache Kafka
Amazon Web Services (AWS)
+8 more

📍 Position : Java Architect

📅 Experience : 10 to 15 Years

🧑‍💼 Open Positions : 3+

📍 Work Location : Bangalore, Pune, Chennai

💼 Work Mode : Hybrid

📅 Notice Period : Immediate joiners preferred; up to 1 month maximum

🔧 Core Responsibilities :

  • Lead architecture design and development for scalable enterprise-level applications.
  • Own and manage all aspects of technical development and delivery.
  • Define and enforce best coding practices, architectural guidelines, and development standards.
  • Plan and estimate the end-to-end technical scope of projects.
  • Conduct code reviews, ensure CI/CD, and implement TDD/BDD methodologies.
  • Mentor and lead individual contributors and small development teams.
  • Collaborate with cross-functional teams, including DevOps, Product, and QA.
  • Engage in high-level and low-level design (HLD/LLD), solutioning, and cloud-native transformations.

🛠️ Required Technical Skills :

  • Strong hands-on expertise in Java, Spring Boot, Microservices architecture
  • Experience with Kafka or similar messaging/event streaming platforms
  • Proficiency in cloud platforms: AWS and Azure (must-have)
  • Exposure to frontend technologies (nice-to-have)
  • Solid understanding of HLD, system architecture, and design patterns
  • Good grasp of DevOps concepts, Docker, Kubernetes, and Infrastructure as Code (IaC)
  • Agile/Lean development, Pair Programming, and Continuous Integration practices
  • Polyglot mindset is a plus (Scala, Golang, Python, etc.)

🚀 Ideal Candidate Profile :

  • Currently working in a product-based environment
  • Already functioning as an Architect or Principal Engineer
  • Proven track record as an Individual Contributor (IC)
  • Strong engineering fundamentals with a passion for scalable software systems
  • No compromise on code quality, craftsmanship, and best practices

🧪 Interview Process :

  1. Round 1: Technical pairing round
  2. Rounds 2 & 3: Technical rounds with panel (code pairing + architecture)
  3. Final Round: HR and offer discussion
HighLevel Inc.

Posted by Reshika Mendiratta
Remote, Delhi
4+ yrs
Upto ₹34L / yr (Varies)
NodeJS (Node.js)
Vue.js
React.js
MongoDB
Apache Kafka
+3 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 micro-services in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

We are seeking a highly skilled Full Stack Developer to join our CRM team. The ideal candidate will have a strong background in Node.js and Vue.js and hands-on experience across a range of technologies and concepts. You will be responsible for implementing the visual elements that users see and interact with in a web application.


Responsibilities:

  • Collaborate with cross-functional teams to design, develop, and maintain CRM applications and features
  • Build and optimize user interfaces using Vue.js for an exceptional user experience
  • Develop server-side logic and APIs using Node.js
  • Implement robust data storage and retrieval solutions with a focus on ElasticSearch, Data Indexing, Database Sharding, and Autoscaling
  • Integrate Message Queues, Pub-sub systems, and Event-Based architectures to enable real-time data processing and event-driven workflows
  • Handle real-time data migration and event processing tasks efficiently
  • Utilize messaging systems such as ActiveMQ, RabbitMQ, and Kafka to manage data flow and communication within the CRM ecosystem
  • Collaborate closely with front-end and back-end developers, product managers, and data engineers to deliver high-quality solutions
  • Optimize applications for maximum speed and scalability
  • Ensure the security and integrity of data and application systems
  • Troubleshoot and resolve technical issues, bugs, and performance bottlenecks
  • Stay updated with emerging technologies and industry trends, and make recommendations for adoption when appropriate
  • Participate in code reviews, maintain documentation, and contribute to a culture of continuous improvement
  • Provide technical support and mentorship to junior developers when necessary


Requirements:

  • Good hands-on experience with Node.js and Vue.js (or React/Angular)
  • Strong understanding of ElasticSearch, Data Indexing, Database Sharding, and Auto Scaling techniques
  • Experience working with Message Queues, Pub-sub patterns, and Event-Based architecture
  • Proficiency in Real-time Data Migration and Real-time Event Processing
  • Familiarity with messaging systems like ActiveMQ, RabbitMQ, and Kafka
  • Bachelor's degree or equivalent experience in Engineering or a related field of study
  • Expertise with MongoDB
  • Proficient understanding of code versioning tools, such as Git
  • Strong communication and problem-solving skills
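One concept named above, database sharding, can be illustrated with a deterministic hash-based router; this is a generic sketch with invented shard names, not HighLevel's actual scheme:

```python
import hashlib

SHARDS = ["shard-0", "shard-1", "shard-2", "shard-3"]  # hypothetical shard set

def shard_for(key):
    """Route a record key to a shard. MD5 keeps the mapping stable across
    processes, unlike Python's salted built-in hash()."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# The same key always lands on the same shard:
assert shard_for("customer:42") == shard_for("customer:42")
```

In real deployments a consistent-hashing ring is usually preferred, so that adding a shard remaps only a fraction of keys rather than almost all of them.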


What to expect when you apply?

  • Exploratory Call
  • Technical Round I/II
  • Assignment
  • Cultural Fitment Round
Talent Pro
Posted by Mayank choudhary
Bengaluru (Bangalore)
3 - 5 yrs
₹20L - ₹25L / yr
ETL
SQL
Apache Spark
Apache Kafka

Role & Responsibilities

About the Role:


We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.

Key responsibilities:


Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.

Implement efficient data modeling techniques to optimize performance and scalability of data systems.

Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.

Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.

Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.

Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.

Design and develop batch pipelines for scheduled data processing tasks.

Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.

Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.

Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
2 - 4 yrs
₹15L - ₹25L / yr
Java
Spring Boot
Apache Kafka
SQL
Algorithms
+6 more

Mandatory Skills:

  • Java
  • Kafka
  • Spring Boot
  • SQL / MySQL
  • Algorithms
  • Data Structures

Key Responsibilities:

  • Design and develop large-scale sub-systems
  • Periodically explore the latest technologies (esp. open source) and prototype sub-systems
  • Be a part of the team that develops the next-gen Customer Data Platform
  • Build components to make the customer data platform more efficient and scalable

Qualifications:

  • 2-4 years of relevant experience with Algorithms, Data Structures, & Optimizations in addition to Coding.
  • Education: B.E/B.Tech/M.Tech/M.S/MCA in Computer Science or equivalent, from premier institutes only
  • Candidates with CGPA 9 or above will be preferred.

Skill Set:

  • Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures, & Optimizations in addition to Coding)
  • Good System design and Class design
  • Good knowledge of Databases (Both SQL/NOSQL)
  • Good knowledge of Kafka, Streaming Systems
  • Good Knowledge of Java, Unit Testing

Soft Skills:

  • Has appreciation of technology and its ability to create value in the CDP domain
  • Excellent written and verbal communication skills
  • Active & contributing team member
  • Strong work ethic with demonstrated ability to meet and exceed commitments
  • Others: Experience of having worked in a start-up is a plus
Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises

Agency job
via HyrHub by Neha Koshy
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹12L / yr
Java
Apache Kafka
SQL
Spring Boot
Algorithms
+6 more

Mandatory Skills:

  • Java
  • Kafka
  • Spring Boot
  • SQL / MySQL
  • Algorithms
  • Data Structures

Key Responsibilities:

  • Design and develop large-scale sub-systems.
  • Periodically explore the latest technologies (esp. open source) and prototype sub-systems.
  • Be a part of the team that develops the next-gen Targeting platform.
  • Build components to make the customer data platform more efficient and scalable.

Qualifications:

  • 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
  • Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes.
  • Candidates with CGPA 9 or above will be preferred.

Skill Set:

  • Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures, & Optimizations in addition to Coding).
  • Good knowledge of Databases - SQL, NoSQL.
  • Knowledge of Unit Testing a plus.

Soft Skills:

  • Has an appreciation of technology and its ability to create value in the marketing domain.
  • Excellent written and verbal communication skills.
  • Active & contributing team member.
  • Strong work ethic with demonstrated ability to meet and exceed commitments.
  • Others: Experience of having worked in a start-up is a plus.
Data Havn

Agency job
via Infinium Associate by Toshi Srivastava
Noida
5 - 9 yrs
₹40L - ₹60L / yr
Python
SQL
Data engineering
Snowflake
ETL
+5 more

About the Role:

We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.

Responsibilities:

  • Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  • Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
  • Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
  • Team Management: Ability to lead and manage a team.
  • Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
  • Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
  • Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
  • Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.

Skills:

  • Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
  • Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
  • Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
  • Understanding of data modeling and data architecture concepts.
  • Experience with ETL/ELT tools and frameworks.
  • Excellent problem-solving and analytical skills.
  • Ability to work independently and as part of a team.

Preferred Qualifications:

  • Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
  • Knowledge of machine learning and artificial intelligence concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Certification in cloud platforms or data engineering.


Bengaluru (Bangalore)
6 - 9 yrs
₹30L - ₹60L / yr
Python
Django
Flask
PostgreSQL
Apache Kafka
+2 more

Role & Responsibilities

Lead the design, development, and deployment of complex, scalable, reliable, and highly available features for world-class SaaS products and services.

Guide the engineering team in adopting best practices for software development, code quality, and architecture.

Make strategic architectural and technical decisions, ensuring the scalability, security, and performance of software applications.

Proactively identify, prioritize, and address technical debt to improve system performance, maintainability, and long-term scalability, ensuring a solid foundation for future development.

Collaborate with cross-functional teams (product managers, designers, and stakeholders) to define project scope, requirements, and timelines.

Mentor and coach team members, providing technical guidance and fostering professional development.

Oversee code reviews, ensuring adherence to best practices and maintaining high code quality standards.

Drive continuous improvement in development processes, tools, and technologies to increase team productivity and product quality.

Stay updated with the latest industry trends and emerging technologies to drive innovation and keep the team at the cutting edge.

Ensure project timelines and goals are met, managing risks and resolving any technical challenges that arise during development.

Foster a collaborative and inclusive team culture, promoting open communication and problem-solving.

Imbibe and maintain a strong customer delight attitude while designing and building products.

Zenius IT Services Pvt Ltd

Posted by Sunita Pradhan
Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹10L - ₹15L / yr
Snowflake
SQL
Data integration tools
ETL/ELT pipelines
SQL Queries
+5 more

Job Summary


We are seeking a skilled Snowflake Developer to design, develop, migrate, and optimize Snowflake-based data solutions. The ideal candidate will have hands-on experience with Snowflake, SQL, and data integration tools to build scalable and high-performance data pipelines that support business analytics and decision-making.


Key Responsibilities:


Develop and implement Snowflake data warehouse solutions based on business and technical requirements.

Design, develop, and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and processing.

Write and optimize complex SQL queries for data retrieval, performance enhancement, and storage optimization.

Collaborate with data architects and analysts to create and refine efficient data models.

Monitor and fine-tune Snowflake query performance and storage optimization strategies for large-scale data workloads.

Ensure data security, governance, and access control policies are implemented following best practices.

Integrate Snowflake with various cloud platforms (AWS, Azure, GCP) and third-party tools.

Troubleshoot and resolve performance issues within the Snowflake environment to ensure high availability and scalability.

Stay updated on Snowflake best practices, emerging technologies, and industry trends to drive continuous improvement.
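To give a flavor of the ELT pipelines described above, the sketch below assembles a typical Snowflake MERGE (upsert) statement; the table and column names are hypothetical, and real code would validate identifiers rather than formatting them directly into SQL:

```python
def build_merge(target, staging, key, cols):
    """Assemble a Snowflake MERGE statement upserting staged rows into a target."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    insert_cols = ", ".join([key] + cols)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

# Hypothetical dimension load from a staging table:
sql = build_merge("dw.dim_customer", "stg.customer", "customer_id",
                  ["email", "updated_at"])
```

A statement like this would typically run on a schedule from an orchestrator, with query profiling used to tune clustering and warehouse sizing.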


Qualifications:

Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.


Experience:


6+ years of experience in data engineering, ETL development, or similar roles.

3+ years of hands-on experience in Snowflake development.


Technical Skills:


Strong proficiency in SQL, Snowflake Schema Design, and Performance Optimization.

Experience with ETL/ELT tools like dbt, Talend, Matillion, or Informatica.

Proficiency in Python, Java, or Scala for data processing.

Familiarity with cloud platforms (AWS, Azure, GCP) and integration with Snowflake.

Experience with data governance, security, and compliance best practices.

Strong analytical, troubleshooting, and problem-solving skills.

Communication: Excellent communication and teamwork abilities, with a focus on collaboration across teams.


Preferred Skills:


Snowflake Certification (e.g., SnowPro Core or Advanced).

Experience with real-time data streaming using tools like Kafka or Apache Spark.

Hands-on experience with CI/CD pipelines and DevOps practices in data environments.

Familiarity with BI tools like Tableau, Power BI, or Looker for data visualization and reporting.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Bengaluru (Bangalore)
5 - 8 yrs
₹30L - ₹45L / yr
Spring Boot
Spring
Microservices
Java
Amazon Web Services (AWS)
+7 more

What we Require


We are recruiting technical experts with the following core skills and hands-on experience on


Mandatory skills : Core java, Microservices, AWS/Azure/GCP, Spring, Spring Boot

Hands on experience on : Kafka , Redis ,SQL, Docker, Kubernetes

Expert proficiency in designing both producer and consumer types of Rest services.

Expert proficiency in Unit testing and Code Quality tools.

Expert proficiency in ensuring code coverage.

Expert proficiency in understanding High-Level Design and translating that to Low-Level design

Hands-on experience working with no-SQL databases.

Experience working in an Agile development process - Scrum.

Experience working closely with engineers and software cultures.

Ability to think at a high level about product strategy and customer journeys.

Ability to produce low level design considering the paradigm that journeys will be extensible in the future and translate that into components that can be easily extended and reused.

Excellent communication skills to clearly articulate design decisions.

Read more
Trellissoft Inc.

at Trellissoft Inc.

3 candid answers
Nikita Sinha
Posted by Nikita Sinha
Goa
5 - 10 yrs
Upto ₹20L / yr (Varies)
Java
JUnit
Apache Kafka
Redis
RabbitMQ

Job Responsibilities:


• Excellent English communication skills.

• Design, develop, and maintain robust and scalable Java applications.

• Write clean, maintainable, and efficient code in Core Java.

• Develop and execute unit tests using JUnit to ensure code quality.

• Manage database connections and queries using JDBC with PostgreSQL.

• Implement and integrate messaging systems with ActiveMQ or RabbitMQ, or Kafka.

• Utilize Redis for caching and optimizing application performance.

• Collaborate with cross-functional teams to define, design, and ship new features.

• Troubleshoot and debug applications to optimize performance and ensure reliability.

• Follow best practices for software development, including code reviews and version control.

• Proficient in debugging and diagnosing complex, multi-layered bugs in large codebases and custom frameworks.

• Skilled in using software development tools to identify and resolve both isolated and ambiguous issues.

• Highly effective in communication and collaboration with peers.

• Demonstrates advanced technical expertise relevant to their role.

• Capable of maintaining and improving existing code while ensuring high performance and reliability.

• Strong problem-solving skills with the ability to quickly adapt to new technologies and methodologies.
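
The Redis caching responsibility above boils down to key lookup with expiry. Here is a minimal in-process sketch of that TTL pattern; this is not the Redis API, and the class and key names are illustrative assumptions:

```python
import time

class TTLCache:
    """Minimal in-process TTL cache illustrating the pattern a Redis
    layer provides in production (illustrative only, not Redis)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # lazy expiry, like Redis TTL
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

cache = TTLCache(ttl_seconds=30)
cache.set("user:42", {"name": "Asha"})
print(cache.get("user:42"))  # cache hit within the TTL window
```

In production the same get/set-with-TTL calls go to a shared Redis instance so every application node sees the same cache.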


Preferred Competencies:

• Experience with microservices architecture.

• Knowledge of cloud platforms such as AWS, Azure, or GCP.

• Familiarity with CI/CD pipelines.

Read more
NeoGenCode Technologies Pvt Ltd
Shivank Bhardwaj
Posted by Shivank Bhardwaj
Noida, Delhi, Gurugram, Ghaziabad, Faridabad
5 - 8 yrs
₹4L - ₹20L / yr
Java
Angular (2+)
Spring Boot
Hibernate (Java)
+23 more

Job Summary:

We are seeking passionate Developers with experience in Microservices architecture to join our team in Noida. The ideal candidate should have hands-on expertise in Java, Spring Boot, Hibernate, and front-end technologies like Angular, JavaScript, and Bootstrap. You will be responsible for developing enterprise-grade software applications that enhance patient safety worldwide.


Key Responsibilities:

  • Develop and maintain applications using Microservices architecture.
  • Work with modern technologies like Java, Spring Boot, Hibernate, Angular, Kafka, Redis, and Hazelcast.
  • Utilize AWS, Git, Nginx, Tomcat, Oracle, Jira, Confluence, and Jenkins for development and deployment.
  • Collaborate with cross-functional teams to design and build scalable enterprise applications.
  • Develop intuitive UI/UX components using Bootstrap, jQuery, and JavaScript.
  • Ensure high-performance, scalable, and secure applications for Fortune 100 pharmaceutical companies.
  • Participate in Agile development, managing changing priorities effectively.
  • Conduct code reviews, troubleshoot issues, and optimize application performance.


Required Skills & Qualifications:

  • 5+ years of hands-on experience in Java 7/8, Spring Boot, and Hibernate.
  • Strong knowledge of OOP concepts and Design Patterns.
  • Experience working with relational databases (Oracle/MySQL).
  • Proficiency in Bootstrap, JavaScript, jQuery, HTML, and Angular.
  • Hands-on experience in Microservices-based application development.
  • Strong problem-solving, debugging, and analytical skills.
  • Excellent communication and collaboration abilities.
  • Ability to adapt to new technologies and manage multiple priorities.
  • Experience in developing high-quality web applications.


Good to Have:

  • Exposure to Kafka, Redis, and Hazelcast.
  • Experience working with cloud-based solutions (AWS preferred).
  • Familiarity with DevOps tools like Jenkins, Docker, and Kubernetes.


Read more
Jio Tesseract
TARUN MISHRA
Posted by TARUN MISHRA
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Pune, Hyderabad, Mumbai, Navi Mumbai
5 - 40 yrs
₹8.5L - ₹75L / yr
Microservices
Architecture
API
NoSQL Databases
MongoDB
+33 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We build products at the intersection of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.


Mon-Fri, In office role with excellent perks and benefits!


Key Responsibilities:

1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.

2. Build and implement scalable and robust microservices and integrate API gateways.

3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).

4. Implement real-time data pipelines using Kafka.

5. Collaborate with front-end developers to ensure seamless integration of backend services.

6. Write clean, reusable, and efficient code following best practices, including design patterns.

7. Troubleshoot, debug, and enhance existing systems for improved performance.
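
The Kafka-based real-time pipeline mentioned above follows a producer/consumer shape. Below is a hedged in-process sketch of that pattern, with a plain queue standing in for the broker; function and message names are illustrative assumptions, not Kafka's client API:

```python
from queue import Queue

def run_pipeline(messages):
    """Producer/consumer sketch of the Kafka pattern: a producer appends
    events to a topic-like queue and a consumer drains it, advancing an
    offset as each event is handled. An in-process Queue stands in for
    the broker (illustrative only)."""
    topic = Queue()
    for msg in messages:                 # producer side
        topic.put(msg)

    processed, offset = [], 0
    while not topic.empty():             # consumer side
        processed.append(topic.get().upper())
        offset += 1                      # "commit" after handling the event
    return processed, offset

events, offset = run_pipeline(["signup", "purchase"])
print(events, offset)  # ['SIGNUP', 'PURCHASE'] 2
```

With a real broker, the committed offset is what lets a restarted consumer resume without reprocessing or losing events.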


Mandatory Skills:

1. Proficiency in at least one backend technology: Node.js, Python, or Java.


2. Strong experience in:

i. Microservices architecture,

ii. API gateways,

iii. NoSQL databases (e.g., MongoDB, DynamoDB),

iv. Kafka

v. Data structures (e.g., arrays, linked lists, trees).


3. Frameworks:

i. If Java : Spring framework for backend development.

ii. If Python: FastAPI/Django frameworks for AI applications.

iii. If Node: Express.js for Node.js development.


Good to Have Skills:

1. Experience with Kubernetes for container orchestration.

2. Familiarity with in-memory databases like Redis or Memcached.

3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.

Read more
HighLevel Inc.

at HighLevel Inc.

1 video
31 recruiters
Reshika Mendiratta
Posted by Reshika Mendiratta
Remote, Delhi
3.5yrs+
Upto ₹30L / yr (Varies)
NodeJS (Node.js)
JavaScript
Vue.js
React.js
Angular (2+)
+8 more

About HighLevel:

HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. With a focus on streamlining marketing efforts and providing comprehensive solutions, HighLevel helps businesses of all sizes achieve their marketing goals. We currently have ~1200 employees across 15 countries, working remotely as well as in our headquarters, which is located in Dallas, Texas. Our goal as an employer is to maintain a strong company culture, foster creativity and collaboration, and encourage a healthy work-life balance for our employees wherever they call home.


Our Website - https://www.gohighlevel.com/

YouTube Channel - https://www.youtube.com/channel/UCXFiV4qDX5ipE-DQcsm1j4g

Blog Post - https://blog.gohighlevel.com/general-atlantic-joins-highlevel/


Our Customers:

HighLevel serves a diverse customer base, including over 60K agencies & entrepreneurs and 500K businesses globally. Our customers range from small and medium-sized businesses to enterprises, spanning various industries and sectors.


Scale at HighLevel:

We operate at scale, managing over 40 billion API hits and 120 billion events monthly, with more than 500 microservices in production. Our systems handle 200+ terabytes of application data and 6 petabytes of storage.


About the Role:

We are looking for an experienced software engineer with strong technical and communication skills who has worked extensively on backend and frontend engineering systems that process large amounts of data at scale and manage services that handle thousands of requests every minute.

This role has an equal mixture of backend and frontend responsibilities. You will be expected to be autonomous, guide other developers who might need technical help, collaborate with engineers from other teams, product managers and customer success and support representatives.


Responsibilities:

  • Create new reporting features and improve the existing functionalities
  • Build backend & Frontend API features and architecture
  • Work cross-functionally across our platform, scheduling and appointments, CRM and automations teams
  • Drive performance through benchmarking and optimisation
  • Work with a wide range of systems, processes, and technologies to own and solve problems from end to end
  • Collaborate closely with our leadership team including engineers, designers and product managers to build new features and products
  • Uphold high engineering standards and bring consistency to the many codebases and systems you will encounter


Requirements:

  • 4+ years of experience as a full-stack software engineer
  • 2+ years of experience with Vue.js
  • Proficient with various programming languages and frameworks including but not limited to Node.js, JavaScript, Python and TypeScript
  • Experience with Docker and Kubernetes
  • Experience with databases such as MySQL and MongoDB. Good to have a working knowledge of Redis and Firebase/FireStore
  • Experience with integration of various third party platforms including Google AdWords, Facebook Marketing APIs, Twilio, Google Analytics etc.
  • Understanding of various tools and techniques around caching, concurrency, highly available, reliable and scalable systems
  • Must be able to work with a team and collaborate remotely
  • Driven by product quality, and innately know how to balance trade-offs with time to launch new features
  • A keen eye for design and love to think about user flows and user experiences


Interview Process:

  • Technical Interview I
  • Assignment
  • Technical Interview II
  • Cultural Fitment Round


EEO Statement:

The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Read more
Wekan Enterprise Solutions

at Wekan Enterprise Solutions

2 candid answers
Deepak  N
Posted by Deepak N
Bengaluru (Bangalore), Chennai
12 - 22 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Microservices
JavaScript
TypeScript
+3 more

Architect


Experience - 12+ yrs


About Wekan Enterprise Solutions


Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT, and Cloud environments, we have an extensive track record helping Fortune 500 companies modernize their most critical legacy and on-premise applications, migrating them to the cloud and leveraging the most cutting-edge technologies.

 

Job Description

We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced and challenging environments leading technical teams, designing system architecture and reviewing peer code. Interested in constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?

You will work on complex data migrations, modernizing legacy applications, and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams, and MongoDB Solutions Architects.

Location - Chennai or Bangalore


●     Relevant experience of 12+ years building high-performance applications with at least 3+ years as an architect.

●     Good problem solving skills

●     Strong mentoring capabilities

●     Good understanding of software development life cycle

●     Strong experience in system design and architecture

●     Strong focus on quality of work delivered

●     Excellent verbal and written communication skills

 

Required Technical Skills

 

● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.

● Strong experience with appropriate framework(s).

● Well-versed in monolithic and microservices architectures.

● Hands-on experience with data modeling on MongoDB and any other relational or NoSQL databases.

● Experience working with 3rd-party integrations ranging from authentication to cloud services, etc.

● Hands-on experience with Kafka or RabbitMQ.

● Hands-on experience with CI/CD pipelines and at least one cloud provider - AWS / GCP / Azure.

● Strong experience writing and maintaining clear documentation

  

Good to have skills:

 

●     Experience working with frontend technologies - React.Js or Vue.Js or Angular.

●     Extensive experience consulting with customers directly for defining architecture or system design.

●     Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
8 - 13 yrs
₹70L - ₹90L / yr
Data engineering
Apache Spark
Apache Kafka
Java
Python
+6 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Pixis AI

at Pixis AI

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹45L / yr (Varies)
Python
Go Programming (Golang)
Apache Kafka
Apache Spark
Apache Airflow
+5 more

Key Responsibilities:

  • Design, build, and maintain scalable, real-time data pipelines using Apache Flink (or Apache Spark).
  • Work with Apache Kafka (mandatory) for real-time messaging and event-driven data flows.
  • Build data infrastructure on Lakehouse architecture, integrating data lakes and data warehouses for efficient storage and processing.
  • Implement data versioning and cataloging using Apache Nessie, and optimize datasets for analytics with Apache Iceberg.
  • Apply advanced data modeling techniques and performance tuning using Apache Doris or similar OLAP systems.
  • Orchestrate complex data workflows using DAG-based tools like Prefect, Airflow, or Mage.
  • Collaborate with data scientists, analysts, and engineering teams to develop and deliver scalable data solutions.
  • Ensure data quality, consistency, performance, and security across all pipelines and systems.
  • Continuously research, evaluate, and adopt new tools and technologies to improve our data platform.
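
The Flink/Spark stream-processing work described above typically starts with windowed aggregation. A minimal sketch of a tumbling window in plain Python follows; event timestamps and key names are illustrative assumptions, and the distributed runtime is elided:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Tumbling-window aggregation, a basic building block of Flink/Spark
    stream processing: each event falls into exactly one fixed-size window
    keyed by (window_start, event_key). Event times are epoch seconds
    (illustrative only)."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "view"), (75, "click")]
print(tumbling_window_counts(events, window_seconds=60))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

A real engine adds watermarks and late-event handling on top of this same bucketing idea.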

Skills & Qualifications:

  • 3–6 years of experience in data engineering, building scalable data pipelines and systems.
  • Strong programming skills in Python, Go, or Java.
  • Hands-on experience with stream processing frameworks – Apache Flink (preferred) or Apache Spark.
  • Mandatory experience with Apache Kafka for stream data ingestion and message brokering.
  • Proficiency with at least one DAG-based orchestration tool like Airflow, Prefect, or Mage.
  • Solid understanding and hands-on experience with SQL and NoSQL databases.
  • Deep understanding of data lakehouse architectures, including internal workings of data lakes and data warehouses, not just usage.
  • Experience working with at least one cloud platform, preferably AWS (GCP or Azure also acceptable).
  • Strong knowledge of distributed systems, data modeling, and performance optimization.

Nice to Have:

  • Experience with Apache Doris or other MPP/OLAP databases.
  • Familiarity with CI/CD pipelines, DevOps practices, and infrastructure-as-code in data workflows.
  • Exposure to modern data version control and cataloging tools like Apache Nessie.
Read more
Pixis AI

at Pixis AI

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹40L / yr (Varies)
NodeJS (Node.js)
React.js
SQL
Redis
Apache Kafka

Key Responsibilities:


• Design, develop, and maintain scalable and robust full-stack applications using cutting-edge technologies.

• Collaborate with product managers, designers, and other stakeholders to understand requirements and translate them into technical specifications.

• Implement front-end and back-end features with a focus on usability, performance, and security.

• Write clean, efficient, and maintainable code while following best practices and coding standards.

• Conduct code reviews, provide constructive feedback, and mentor junior developers to foster a culture of continuous improvement.

• Troubleshoot and debug issues, identify root causes, and implement solutions to ensure smooth application operation.

• Stay updated on emerging technologies and industry trends and apply them to enhance our software development process and capabilities.


Requirements & Skills:


• Bachelor’s degree in Computer Science, Engineering, or a related field.

• 4+ years of professional experience in software development, with a focus on full-stack development.

• Proficiency in programming languages such as JavaScript (Node.js), Python, or Java.

• Experience with front-end frameworks/libraries such as React, Angular, or Vue.js.

• Solid understanding of back-end frameworks/libraries such as Express.js, Django, or Spring Boot.

• Experience with database systems (SQL and NoSQL) and ORMs (e.g., Sequelize, SQLAlchemy, Hibernate).

Read more
Data Axle

at Data Axle

2 candid answers
Eman Khan
Posted by Eman Khan
Remote, Pune
8 - 13 yrs
Best in industry
Data architecture
Systems design
Spark
Apache Kafka
Flink
+5 more

About Data Axle:

 

Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission critical data services to its global customers powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.

Data Axle India is recognized as a Great Place to Work! This prestigious designation is a testament to our collective efforts in fostering an exceptional workplace culture and creating an environment where every team member can thrive.

 

General Summary:

 

As a Digital Data Management Architect, you will design, implement, and optimize advanced data management systems that support processing billions of digital transactions, ensuring high availability and accuracy. You will leverage your expertise in developing identity graphs, real-time data processing, and API integration to drive insights and enhance user experiences across digital platforms. Your role is crucial in building scalable and secure data architectures that support real-time analytics, identity resolution, and seamless data flows across multiple systems and applications.

 

Roles and Responsibilities:

 

  1. Data Architecture & System Design:
  • Design and implement scalable data architectures capable of processing billions of digital transactions in real-time, ensuring low latency and high availability.
  • Architect data models, workflows, and storage solutions to enable seamless real-time data processing, including stream processing and event-driven architectures.
  2. Identity Graph Development:
  • Lead the development and maintenance of a comprehensive identity graph to unify disparate data sources, enabling accurate identity resolution across channels.
  • Develop algorithms and data matching techniques to enhance identity linking, while maintaining data accuracy and privacy.
  3. Real-Time Data Processing & Analytics:
  • Implement real-time data ingestion, processing, and analytics pipelines to support immediate data availability and actionable insights.
  • Work closely with engineering teams to integrate and optimize real-time data processing frameworks such as Apache Kafka, Apache Flink, or Spark Streaming.
  4. API Development & Integration:
  • Design and develop real-time APIs that facilitate data access and integration across internal and external platforms, focusing on security, scalability, and performance.
  • Collaborate with product and engineering teams to define API specifications, data contracts, and SLAs to meet business and user requirements.
  5. Data Governance & Security:
  • Establish data governance practices to maintain data quality, privacy, and compliance with regulatory standards across all digital transactions and identity graph data.
  • Ensure security protocols and access controls are embedded in all data workflows and API integrations to protect sensitive information.
  6. Collaboration & Stakeholder Engagement:
  • Partner with data engineering, analytics, and product teams to align data architecture with business requirements and strategic goals.
  • Provide technical guidance and mentorship to junior architects and data engineers, promoting best practices and continuous learning.
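
Identity-graph construction as described above is, at its core, clustering identifiers connected by match evidence. A union-find sketch of that resolution step follows; it is illustrative only, and production identity graphs add match confidence, privacy controls, and incremental updates:

```python
def resolve_identities(pairs):
    """Union-find sketch of identity resolution: each pair asserts that two
    identifiers (email, device id, cookie, ...) belong to the same person,
    and connected identifiers collapse into one identity cluster."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving keeps trees shallow
            x = parent[x]
        return x

    for a, b in pairs:                     # union the two identifiers
        parent[find(a)] = find(b)

    clusters = {}
    for node in list(parent):              # group every node under its root
        clusters.setdefault(find(node), set()).add(node)
    return list(clusters.values())

links = [("a@x.com", "device-1"), ("device-1", "cookie-9"), ("b@y.com", "device-2")]
print(resolve_identities(links))
# two clusters: the linked email/device/cookie trio, and the separate pair
```

Because union-find is near-linear in the number of match edges, the same core scales to billions of pairwise assertions when sharded by key.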

 

 

Qualifications:

 

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • 10+ years of experience in data architecture, digital data management, or a related field, with a proven track record in managing billion+ transactions.
  • Deep experience with identity resolution techniques and building identity graphs.
  • Strong proficiency in real-time data processing technologies (e.g., Kafka, Flink, Spark) and API development (RESTful and/or GraphQL).
  • In-depth knowledge of database systems (SQL, NoSQL), data warehousing solutions, and cloud-based platforms (AWS, Azure, or GCP).
  • Familiarity with data privacy regulations (e.g., GDPR, CCPA) and data governance best practices.

 

This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹70L - ₹80L / yr
Java
Go Programming (Golang)
NodeJS (Node.js)
Python
Apache Kafka
+7 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Remote only
11 - 18 yrs
₹50L - ₹70L / yr
Java
Data engineering
NodeJS (Node.js)
Python
Go Programming (Golang)
+5 more

Role & Responsibilities

Lead and mentor a team of data engineers, ensuring high performance and career growth.

Architect and optimize scalable data infrastructure, ensuring high availability and reliability.

Drive the development and implementation of data governance frameworks and best practices.

Work closely with cross-functional teams to define and execute a data roadmap.

Optimize data processing workflows for performance and cost efficiency.

Ensure data security, compliance, and quality across all data platforms.

Foster a culture of innovation and technical excellence within the data team.

Read more
Remote only
7 - 12 yrs
₹25L - ₹40L / yr
Spark
Java
Apache Kafka
Big Data
Apache Hive
+5 more

Job Title: Big Data Engineer (Java Spark Developer – JAVA SPARK EXP IS MUST)

Location: Chennai, Hyderabad, Pune, Bangalore (Bengaluru) / NCR Delhi

Client: Premium Tier 1 Company

Payroll: Direct Client

Employment Type: Full time / Perm

Experience: 7+ years

 

Job Description:

We are looking for skilled Big Data Engineers with 7+ years of experience in Java Spark and Big Data / legacy platforms who can join immediately. The ideal candidate has experience designing, developing, and optimizing real-time and batch data pipelines in enterprise-scale Big Data environments. You will build scalable, high-performance data processing solutions, integrate real-time data streams, and help build a reliable data platform. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.

 

Key Responsibilities:

·      Develop data pipelines using Java Spark and Kafka.

·      Optimize and maintain real-time data pipelines and messaging systems.

·      Collaborate with cross-functional teams to deliver scalable data solutions.

·      Troubleshoot and resolve issues in Java Spark and Kafka applications.
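
The Spark batch pipelines described above follow a map-shuffle-reduce shape. The sketch below shows that pattern in plain Python with the distributed runtime elided; the record fields are illustrative assumptions, not part of any specific pipeline:

```python
from itertools import groupby
from operator import itemgetter

def batch_pipeline(records):
    """Map-shuffle-reduce sketch of the transformation style used in Spark
    batch pipelines: map each record to a (key, value) pair, group by key,
    then reduce each group. Plain Python stands in for the distributed
    runtime (illustrative only)."""
    mapped = [(r["country"], r["amount"]) for r in records]  # map
    mapped.sort(key=itemgetter(0))                           # shuffle (group keys)
    return {k: sum(v for _, v in grp)                        # reduce each group
            for k, grp in groupby(mapped, key=itemgetter(0))}

orders = [{"country": "IN", "amount": 10}, {"country": "US", "amount": 5},
          {"country": "IN", "amount": 7}]
print(batch_pipeline(orders))  # {'IN': 17, 'US': 5}
```

In Spark the same three steps become a map, a shuffle across executors, and a reduceByKey, with partitioning and fault tolerance handled by the engine.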

 

Qualifications:

·      Experience in Java Spark is a must

·      Knowledge and hands-on experience using distributed computing, real-time data streaming, and big data technologies

·      Strong problem-solving and performance optimization skills

·      Looking for immediate joiners

 

If interested, please share your resume along with the following details

1)    Notice Period

2)    Current CTC

3)    Expected CTC

4)    Have Experience in Java Spark - Y / N (this is must)

5)    Any offers in hand

 

Thanks & Regards,

LION & ELEPHANTS CONSULTANCY PVT LTD TEAM

SINGAPORE | INDIA

 

Read more
Mernplus Technologies
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹17L / yr
Java
camunda
Apache Camel
Apache Kafka
karaf

We are seeking a skilled Java Developer with 5+ years of experience in Java, Camunda, Apache Camel, Kafka, and Apache Karaf. The ideal candidate should have expertise in workflow automation, message-driven architectures, and enterprise integration patterns. Strong problem-solving skills and hands-on experience in microservices and event-driven systems are required.

Read more
Gipfel & Schnell Consultings Pvt Ltd
Bengaluru (Bangalore)
5 - 12 yrs
Best in industry
DevOps
Azure
Terraform
PowerShell
Apache Kafka
+1 more

Mandatory Skills:


  • AZ-104 (Azure Administrator) experience
  • CI/CD migration expertise
  • Proficiency in Windows deployment and support
  • Infrastructure as Code (IaC) in Terraform
  • Automation using PowerShell
  • Understanding of SDLC for C# applications (build/ship/run strategy)
  • Apache Kafka experience
  • Azure web app


Good to Have Skills:


  • AZ-400 (Azure DevOps Engineer Expert)
  • AZ-700 Designing and Implementing Microsoft Azure Networking Solutions
  • Apache Pulsar
  • Windows containers
  • Active Directory and DNS
  • SAST and DAST tool understanding
  • MSSQL database
  • Postgres database
  • Azure security
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Noida
5 - 12 yrs
₹5L - ₹20L / yr
Java
Spring Boot
Hibernate (Java)
Object Oriented Programming (OOPs)
Design patterns
+8 more

Position Title : Java Full Stack Developer

Location : Noida Sector 125

Experience : 5+ Years

Availability : Immediate


Job Summary :

We are looking for a Java Full Stack Developer with expertise in Microservices architecture to join our team.

The ideal candidate should have hands-on experience in Java, Spring Boot, Hibernate, and front-end technologies like Angular, JavaScript, and Bootstrap. You will work on enterprise-grade applications that enhance patient safety worldwide.


Key Responsibilities :

  • Design, develop, and maintain applications based on Microservices architecture.
  • Work with Java, Spring Boot, Hibernate, Angular, Kafka, Redis, and Hazelcast to build scalable solutions.
  • Utilize AWS, Git, Nginx, Tomcat, Oracle, Jira, Confluence, and Jenkins for development and deployment.
  • Collaborate with cross-functional teams to develop enterprise applications.
  • Develop intuitive UI/UX components using Bootstrap, jQuery, and JavaScript.
  • Ensure applications meet performance, scalability, and security requirements.
  • Participate in Agile development while efficiently handling changing priorities.
  • Conduct code reviews, debugging, and performance optimization.

Required Skills & Qualifications :

5+ Years of hands-on experience in Java 7/8, Spring Boot, and Hibernate.

✔ Strong understanding of OOP concepts and Design Patterns.

✔ Experience working with relational databases like Oracle/MySQL.

✔ Proficiency in Bootstrap, JavaScript, jQuery, HTML, and Angular.

✔ Hands-on experience in Microservices-based application development.

✔ Strong problem-solving, debugging, and analytical skills.

✔ Excellent communication and collaboration skills.

✔ Ability to adapt to new technologies and handle multiple priorities.

✔ Experience in developing high-quality web applications.


Good to Have :

➕ Exposure to Kafka, Redis, and Hazelcast.

➕ Experience working with cloud-based solutions (AWS preferred).

➕ Familiarity with DevOps tools like Jenkins, Docker, and Kubernetes.


Why Join Us?

✅ Work on cutting-edge technologies and enterprise-level applications.

✅ Collaborative and innovative work environment.

✅ Competitive salary and career growth opportunities.

Read more
ZyBiSys

at ZyBiSys

4 candid answers
8 recruiters
Subash S
Posted by Subash S
Tiruchirappalli, Tamil Nadu
5 - 10 yrs
₹20L - ₹25L / yr
skill iconNodeJS (Node.js)
skill iconReact.js
skill iconMongoDB
skill iconGo Programming (Golang)
Nginx
+14 more

Job Role: Senior Full Stack Developer

Location: Trichy

Job Type: Full Time

Experience Required: 5+ Years

Reporting to : Product Head


About Us:


At Zybisys Consulting Services LLP, we are a leading company in Cloud Managed Services and Cloud Computing. We believe in creating a vibrant and inclusive workplace where talented people can grow and succeed. We are looking for a dedicated leader who is passionate about supporting our team, developing talent, and enhancing our company culture.


Role Overview:

Are you a seasoned Full Stack Developer with a passion for crafting innovative solutions? We are looking for an experienced Senior Full Stack Developer to enhance our team and lead the development of innovative solutions.


Key Responsibilities:

  • Develop and Maintain Applications: Design, develop, and maintain scalable and efficient full-stack applications using modern technologies.
  • Database Design: Expertise in both relational and NoSQL databases, including schema design, query optimization, and data modeling.
  • Collaborate with Teams: Work closely with front-end and back-end developers along with the Engineering team to integrate and optimize APIs and services.
  • Implement Best Practices: Ensure high-quality code, adherence to best practices, and efficient use of technologies.
  • Troubleshoot and Debug: Identify and resolve complex issues, providing solutions and improvements.
  • Code Review and Quality Assurance: Skill in reviewing code, ensuring adherence to coding standards, and implementing best practices for software quality.
  • Agile Methodologies: Experience with Agile frameworks (e.g., Scrum, Kanban) to facilitate iterative development and continuous improvement.
  • Test-Driven Development (TDD): Knowledge of TDD practices, writing unit tests, and integrating automated testing (CI/CD) into the development workflow.
  • Technical Documentation: Ability to write clear and concise technical documentation for codebases, APIs, and system architecture.
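The TDD practice listed above starts from a failing unit test written before the implementation; a minimal pytest-style sketch (the function and test names are illustrative, not from any real codebase):

```python
def normalize_email(raw: str) -> str:
    """Trim surrounding whitespace and lowercase an email address."""
    return raw.strip().lower()

# In TDD this test is written first, fails, and then drives the
# implementation above until it passes.
def test_strips_and_lowercases():
    assert normalize_email("  Alice@Example.COM ") == "alice@example.com"

test_strips_and_lowercases()  # run the test inline for illustration
```

In a CI/CD workflow the same test would run automatically on every commit rather than inline.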


Technical Skills:

  • Backend: Node.js, Express.js, Python, Golang, gRPC
  • Frontend: React.js, Next.js, HTML, HTML5, CSS3, jQuery
  • Database: MongoDB, MySQL, Redis, OpenSearch
  • API : RESTful APIs, SOAP services, or GraphQL
  • Tools & Technologies: Docker, Git, Kafka
  • Design & Development: Figma, Linux
  • Containers & container orchestration: Docker, Kubernetes
  • Networking & OS Knowledge


What We Offer:

  • Growth Opportunities: Expand your skills and career within a forward-thinking company.
  • Collaborative Environment: Join a team that values innovation and teamwork.


If you're ready to take on exciting challenges and work in a collaborative environment, we'd love to hear from you!


Apply now to join our team as a Senior Full Stack Developer and make waves with your skills!

Read more
Intellikart Ventures LLP
Prajwal Shinde
Posted by Prajwal Shinde
Pune
2 - 5 yrs
₹9L - ₹15L / yr
PowerBI
SQL
ETL
snowflake
Apache Kafka
+1 more

Experience: 4+ years.

Location: Vadodara & Pune

Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines

What you'll be doing:

  • Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
  • Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
  • Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
  • Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
  • Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
  • Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
  • Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
  • Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
  • Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
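The streaming-ingestion responsibilities above hinge on micro-batching records between Kafka and the bulk loader; a toy plain-Python sketch of that buffering (thresholds and names are illustrative — a real pipeline would stage files for Snowpipe or use the Kafka Connect Snowflake sink instead):

```python
import time

class MicroBatchBuffer:
    """Toy stand-in for batching streamed Kafka records before a bulk load.

    Flushes when either a row-count or an age threshold is reached,
    mimicking how Snowpipe-style ingestion accumulates staged files.
    """
    def __init__(self, max_rows=3, max_age_s=60.0):
        self.max_rows = max_rows
        self.max_age_s = max_age_s
        self.rows, self.flushed = [], []
        self.opened = time.monotonic()

    def add(self, row):
        self.rows.append(row)
        age = time.monotonic() - self.opened
        if len(self.rows) >= self.max_rows or age >= self.max_age_s:
            self.flush()

    def flush(self):
        if self.rows:
            self.flushed.append(list(self.rows))  # real code: write a stage file / COPY
            self.rows.clear()
            self.opened = time.monotonic()

buf = MicroBatchBuffer(max_rows=3)
for r in range(7):
    buf.add({"id": r})
buf.flush()  # drain the final partial batch
```

Seven records with a batch size of three yield batches of 3, 3, and 1; tuning these thresholds is exactly the cost/latency trade-off the performance-tuning bullet refers to.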


What you need:

Basic Skills:


  • 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
  • Strong experience with Apache Kafka for stream processing and real-time data integration.
  • Proficiency in SQL and ETL/ELT processes.
  • Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
  • Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
  • Knowledge of data governance, security, and compliance best practices.
  • Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
  • Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.


Responsibilities:

  • Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
  • Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
  • Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
  • Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
  • Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
  • Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
  • Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
  • Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
  • Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
  • Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform. 


Read more
Chennai
5 - 7 yrs
₹15L - ₹25L / yr
Apache Kafka
Google Cloud Platform (GCP)
BCP
DevOps

Job description

 Location: Chennai, India

 Experience: 5+ Years

 Certification: Kafka Certified (Mandatory); Additional Certifications are a Plus


Job Overview:

We are seeking an experienced DevOps Engineer specializing in GCP Cloud Infrastructure Management and Kafka Administration. The ideal candidate should have 5+ years of experience in cloud technologies, Kubernetes, and Kafka, with a mandatory Kafka certification.


Key Responsibilities:

Cloud Infrastructure Management:

· Manage and update Kubernetes (K8s) on GKE.

· Monitor and optimize K8s resources, including pods, storage, memory, and costs.

· Oversee the general monitoring and maintenance of environments using:

o OpenSearch / Kibana

o KafkaUI

o BGP

o Grafana / Prometheus


Kafka Administration:

· Manage Kafka brokers and ACLs.

· Hands-on experience in Kafka administration (preferably Confluent Kafka).

· Independently debug, optimize, and implement Kafka solutions based on developer and business needs.


Other Responsibilities:

· Perform random investigations to troubleshoot and enhance infrastructure.

· Manage PostgreSQL databases efficiently.

· Administer Jenkins pipelines, supporting CI/CD implementation and maintenance.


Required Skills & Qualifications:

· Kafka Certified Engineer (Mandatory).

· 5+ years of experience in GCP DevOps, Cloud Infrastructure, and Kafka Administration.

· Strong expertise in Kubernetes (K8s), Google Kubernetes Engine (GKE), and cloud environments.

· Hands-on experience with monitoring tools like Grafana, Prometheus, OpenSearch, and Kibana.

· Experience managing PostgreSQL databases.

· Proficiency in Jenkins pipeline administration.

· Ability to work independently and collaborate with developers and business stakeholders.

If you are passionate about DevOps, Cloud Infrastructure, and Kafka, and meet the above qualifications, we encourage you to apply!


Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
5 - 12 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+17 more

Job Title : Senior AWS Data Engineer

Experience : 5+ Years

Location : Gurugram

Employment Type : Full-Time

Job Summary :

Seeking a Senior AWS Data Engineer with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.

Key Responsibilities :

  • Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
  • Maintain data lakes & warehouses for analytics.
  • Ensure data integrity through quality checks.
  • Collaborate with data scientists & engineers to deliver solutions.

Qualifications :

  • 7+ Years in Data Engineering.
  • Expertise in AWS services, SQL, Python, Spark, Kafka.
  • Experience with CI/CD, DevOps practices.
  • Strong problem-solving skills.

Preferred Skills :

  • Experience with Snowflake, Databricks.
  • Knowledge of BI tools (Tableau, Power BI).
  • Healthcare/Insurance domain experience is a plus.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
7 - 15 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+20 more

Job Title : Tech Lead - Data Engineering (AWS, 7+ Years)

Location : Gurugram

Employment Type : Full-Time


Job Summary :

Seeking a Tech Lead - Data Engineering with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.


Key Responsibilities :

  • Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
  • Maintain data lakes & warehouses for analytics.
  • Ensure data integrity through quality checks.
  • Collaborate with data scientists & engineers to deliver solutions.

Qualifications :

  • 7+ Years in Data Engineering.
  • Expertise in AWS services, SQL, Python, Spark, Kafka.
  • Experience with CI/CD, DevOps practices.
  • Strong problem-solving skills.

Preferred Skills :

  • Experience with Snowflake, Databricks.
  • Knowledge of BI tools (Tableau, Power BI).
  • Healthcare/Insurance domain experience is a plus.


Read more
NeoGenCode Technologies Pvt Ltd
Gurugram
3 - 8 yrs
₹2L - ₹15L / yr
skill iconVue.js
skill iconAngularJS (1.x)
skill iconAngular (2+)
skill iconReact.js
skill iconJavascript
+10 more

Job Title : Full Stack Developer (Python + React.js)

Location : Gurgaon (Work From Office, 6 days a week)

Experience : 3+ Years


Job Overview :

We are looking for a skilled Full Stack Developer proficient in Python (Django) and React.js to develop scalable web applications. The ideal candidate must have experience in backend and frontend development, database management, and cloud technologies.


Mandatory Skills :

Python, Django (Backend Development)

PostgreSQL (Database Management)

AWS (Cloud Services)

RabbitMQ, Redis, Kafka, Celery (Messaging & Asynchronous Processing)

React.js (Frontend Development)


Key Requirements :

  • 3+ Years of experience in Full Stack Development.
  • Strong expertise in RESTful APIs & Microservices.
  • Experience with CI/CD, Git, and Agile methodologies.
  • Strong problem-solving and communication skills.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
3 - 8 yrs
₹3L - ₹10L / yr
skill iconPython
skill iconDjango
skill iconFlask
MySQL
skill iconPostgreSQL
+6 more

Job Title : Python Django Developer

Location : Gurgaon (On-site)

Work Mode : 6 Days a Week (Work from Office)

Experience Level : 3+ Years


About the Role :

We are seeking a highly skilled and motivated Python Django Developer to join our team in Gurgaon. This role requires a hands-on developer with expertise in building scalable web applications and APIs using Python and Django. The ideal candidate will have a strong background in relational databases, message brokers, and distributed systems.


Key Responsibilities :

  • Design, develop, and maintain robust, scalable, and secure web applications using Python and Django.
  • Build and optimize back-end services, RESTful APIs, and integrations with third-party tools.
  • Implement and maintain asynchronous task processing using Celery and RabbitMQ.
  • Work with PostgreSQL to design and optimize database schemas and queries.
  • Utilize Redis and Kafka for caching, data streaming, and other distributed system needs.
  • Debug and troubleshoot issues across the application stack.
  • Collaborate with cross-functional teams to gather requirements and deliver solutions.
  • Ensure code quality through comprehensive testing, code reviews, and adherence to best practices.
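The Celery/RabbitMQ responsibility above amounts to reliable background execution with retries; a minimal plain-Python sketch of retry-with-exponential-backoff (Celery itself would route the task through a RabbitMQ-backed worker — all names here are illustrative):

```python
import time

def run_with_retries(task, max_retries=3, base_delay=0.01):
    """Toy version of Celery-style retry with exponential backoff.

    Real Celery would enqueue the task on RabbitMQ and a worker process
    would apply the same retry policy (e.g. via retry_backoff).
    """
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * (2 ** attempt))  # back off before retrying

calls = {"n": 0}
def flaky_task():
    """Fails twice, then succeeds -- simulating a transient outage."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky_task)
```

The task succeeds on the third attempt; the exponential delay keeps retries from hammering a struggling downstream service.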


Required Skills and Qualifications:

Technical Expertise:

  • Proficiency in Python and strong experience with Django framework.
  • Hands-on experience with PostgreSQL for database design and management.
  • Familiarity with RabbitMQ, Celery, and Redis for asynchronous processing and caching.
  • Experience with Kafka for building real-time data pipelines and event-driven architectures.

Other Skills:

  • Strong understanding of software development best practices and design patterns.
  • Proficiency in writing efficient, reusable, and testable code.
  • Good knowledge of Linux/Unix environments.
  • Familiarity with Docker and containerized deployments is a plus.

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Good communication and teamwork abilities.
  • Ability to work independently and in a collaborative team environment.

Preferred Qualifications:

  • Experience in microservices architecture.
  • Exposure to DevOps tools and practices.
  • Knowledge of front-end technologies like React or Angular is a bonus.
Read more
Rigel Networks Pvt Ltd
Minakshi Soni
Posted by Minakshi Soni
Bengaluru (Bangalore), Pune, Mumbai, Chennai
8 - 12 yrs
₹8L - ₹10L / yr
skill iconAmazon Web Services (AWS)
Terraform
Amazon Redshift
Redshift
Snowflake
+16 more

Dear Candidate,


We are urgently Hiring AWS Cloud Engineer for Bangalore Location.

Position: AWS Cloud Engineer

Location: Bangalore

Experience: 8-11 yrs

Skills: Aws Cloud

Salary: Best in Industry (20-25% hike on current CTC)

Note:

Only immediate joiners (up to 15 days) will be preferred.

Only candidates from Tier 1 companies will be shortlisted and selected.

Candidates with a notice period of more than 30 days will be rejected during screening.

Offer shoppers will be rejected.


Job description:

Title: AWS Cloud Engineer

Prefer BLR / HYD – else any location is fine

Work Mode: Hybrid – based on HR rule (currently 1 day per month)


Shift Timings: 24x7 (work in shifts on a rotational basis)

Total Experience: 8+ years, with at least 5 years of relevant experience.

Must have: AWS platform, Terraform, Redshift/Snowflake, Python/Shell scripting



Experience and Skills Requirements:


Experience:

8 years of experience in a technical role working with AWS


Mandatory

Technical troubleshooting and problem solving

AWS management of large-scale IaaS/PaaS solutions

Cloud networking and security fundamentals

Experience using containerization in AWS

Working data warehouse knowledge (Redshift and Snowflake preferred)

Working with IaC – Terraform and CloudFormation

Working understanding of scripting languages including Python and Shell

Collaboration and communication skills

Highly adaptable to changes in a technical environment

 

Optional

Experience using monitoring and observability toolsets, including Splunk and Datadog

Experience using GitHub Actions

Experience using AWS RDS/SQL based solutions

Experience working with streaming technologies, including Kafka and Apache Flink

Experience working with ETL environments

Experience working with the Confluent Cloud platform


Certifications:


Minimum

AWS Certified SysOps Administrator – Associate

AWS Certified DevOps Engineer - Professional



Preferred


AWS Certified Solutions Architect – Associate


Responsibilities:


Responsible for technical delivery of managed services across NTT Data customer account base. Working as part of a team providing a Shared Managed Service.


The following is a list of expected responsibilities:


To manage and support a customer’s AWS platform

To be technical hands on

Provide Incident and Problem management on the AWS IaaS and PaaS Platform

Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner

Actively monitor an AWS platform for technical issues

To be involved in the resolution of technical incident tickets

Assist in the root cause analysis of incidents

Assist with improving efficiency and processes within the team

Examining traces and logs

Working with third party suppliers and AWS to jointly resolve incidents


Good to have:


Confluent Cloud

Snowflake




Best Regards,

Minakshi Soni

Executive - Talent Acquisition (L2)

Rigel Networks

Worldwide Locations: USA | HK | IN 

Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Gurugram
3 - 8 yrs
₹2L - ₹12L / yr
skill iconDjango
skill iconPostgreSQL
RabbitMQ
skill iconPython
skill iconRedis
+2 more

Job Title : Python Django Developer

Location : Gurgaon (On-site)

Work Mode : 6 Days a Week (Work from Office)

Experience Level : 3+ Years


About the Role :

We are seeking a highly skilled and motivated Python Django Developer to join our team in Gurgaon. This role requires a hands-on developer with expertise in building scalable web applications and APIs using Python and Django. The ideal candidate will have a strong background in relational databases, message brokers, and distributed systems.


Key Responsibilities :

  • Design, develop, and maintain robust, scalable, and secure web applications using Python and Django.
  • Build and optimize back-end services, RESTful APIs, and integrations with third-party tools.
  • Implement and maintain asynchronous task processing using Celery and RabbitMQ.
  • Work with PostgreSQL to design and optimize database schemas and queries.
  • Utilize Redis and Kafka for caching, data streaming, and other distributed system needs.
  • Debug and troubleshoot issues across the application stack.
  • Collaborate with cross-functional teams to gather requirements and deliver solutions.
  • Ensure code quality through comprehensive testing, code reviews, and adherence to best practices.
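The Redis caching item above typically follows the cache-aside pattern: check the cache, fall back to the database on a miss, then populate the cache. A toy sketch with a dict standing in for Redis (all names are illustrative):

```python
class CacheAside:
    """Cache-aside lookup with a dict standing in for Redis."""
    def __init__(self, db):
        self.db = db          # stand-in for e.g. a PostgreSQL table
        self.cache = {}       # stand-in for a Redis instance
        self.misses = 0

    def get(self, key):
        if key in self.cache:
            return self.cache[key]          # cache hit: no database round trip
        self.misses += 1
        value = self.db[key]                # real code: run the SQL query
        self.cache[key] = value             # real code: redis.setex(key, ttl, value)
        return value

store = CacheAside(db={"user:1": "Alice"})
first = store.get("user:1")   # miss: hits the "database", fills the cache
second = store.get("user:1")  # hit: served from the cache
```

In production the cached entry would carry a TTL so stale data eventually expires; that detail is omitted here.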


Required Skills and Qualifications:

Technical Expertise:

  • Proficiency in Python and strong experience with Django framework.
  • Hands-on experience with PostgreSQL for database design and management.
  • Familiarity with RabbitMQ, Celery, and Redis for asynchronous processing and caching.
  • Experience with Kafka for building real-time data pipelines and event-driven architectures.

Other Skills:

  • Strong understanding of software development best practices and design patterns.
  • Proficiency in writing efficient, reusable, and testable code.
  • Good knowledge of Linux/Unix environments.
  • Familiarity with Docker and containerized deployments is a plus.

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Good communication and teamwork abilities.
  • Ability to work independently and in a collaborative team environment.

Preferred Qualifications:

  • Experience in microservices architecture.
  • Exposure to DevOps tools and practices.
  • Knowledge of front-end technologies like React or Angular is a bonus.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Mumbai
10 - 15 yrs
Best in industry
skill iconJava
Apache Kafka
skill iconMongoDB
J2EE
skill iconSpring Boot
+7 more

Java Technical Lead


We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team.


A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting.


Experience: 10+ years

Location: Mumbai


Job Description:


• Experience in Core Java, Spring Boot.

• Experience in microservices.

• Extensive experience in developing enterprise-scale systems for global organization. Should possess good architectural knowledge and be aware of enterprise application design patterns.

• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.

• Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2

• Good knowledge of multi-threading

• Basic working knowledge of Unix/Linux

• Excellent problem solving and coding skills in Java

• Strong interpersonal, communication and analytical skills.

• Should be able to express their design ideas and thoughts.


About Wissen Technology: Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.


Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.


Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.


Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.


We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

Read more
Affine
Rishika Chadha
Posted by Rishika Chadha
Remote only
5 - 8 yrs
Best in industry
skill iconScala
ETL
Apache Kafka
Object Oriented Programming (OOPs)
CI/CD
+4 more

Role Objective:


Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.


Roles & Responsibilities:

  • Sound knowledge in Spark architecture and distributed computing and Spark streaming.
  • Proficient in Spark – including RDD and Data frames core functions, troubleshooting and performance tuning.
  • SFDC (data modelling experience) would be given preference
  • Good understanding in object-oriented concepts and hands on experience on Scala with excellent programming logic and technique.
  • Good command of functional programming and OOP concepts in Scala
  • Good experience in SQL – should be able to write complex queries.
  • Managing the team of Associates and Senior Associates and ensuring the utilization is maintained across the project.
  • Able to mentor new members for onboarding to the project.
  • Understand the client requirement and able to design, develop from scratch and deliver.
  • AWS cloud experience would be preferable.
  • Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
  • Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
  • Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
  • Leading client calls to flag any delays, blockers, or escalations, and collating all requirements.
  • Managing project timing, client expectations and meeting deadlines.
  • Should have played project and team management roles.
  • Facilitate meetings within the team on regular basis.
  • Understand business requirement and analyze different approaches and plan deliverables and milestones for the project.
  • Optimization, maintenance, and support of pipelines.
  • Strong analytical and logical skills.
  • Ability to comfortably tackle new challenges and learn.
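The Spark RDD/DataFrame skills above revolve around chaining lazy transformations and then triggering them with an action; a conceptual sketch in plain Python rather than the actual Spark API (generators play the role of lazy lineage here):

```python
from functools import reduce

# Conceptual sketch of an RDD-style pipeline (NOT the real Spark API):
# transformations are lazy; the final "action" forces evaluation.
events = [{"user": "a", "amount": 30},
          {"user": "b", "amount": 5},
          {"user": "a", "amount": 70}]

# "Transformations" build lazy generators, much as Spark builds a lineage
# graph without moving data yet.
large = (e for e in events if e["amount"] >= 10)   # ~ rdd.filter(...)
amounts = (e["amount"] for e in large)             # ~ rdd.map(...)

# The "action" triggers the whole chain at once, like rdd.reduce(...).
total = reduce(lambda x, y: x + y, amounts)
```

Nothing is computed until `reduce` runs, which mirrors why Spark performance tuning focuses on the shape of the lineage rather than on individual transformations.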
Read more
Affine
Jeeba P
Posted by Jeeba P
Remote only
3 - 8 yrs
Best in industry
skill iconScala
Spark
Apache Kafka
SQL
skill iconAmazon Web Services (AWS)

Role Objective:


Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.


Roles & Responsibilities:

  • Sound knowledge in Spark architecture and distributed computing and Spark streaming.
  • Proficient in Spark – including RDD and Data frames core functions, troubleshooting and performance tuning.
  • Good understanding in object-oriented concepts and hands on experience on Scala with excellent programming logic and technique.
  • Good command of functional programming and OOP concepts in Scala
  • Good experience in SQL – should be able to write complex queries.
  • Managing the team of Associates and Senior Associates and ensuring the utilization is maintained across the project.
  • Able to mentor new members for onboarding to the project.
  • Understand the client requirement and able to design, develop from scratch and deliver.
  • AWS cloud experience would be preferable.
  • Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
  • Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
  • Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
  • Leading client calls to flag any delays, blockers, or escalations, and collating all requirements.
  • Managing project timing, client expectations and meeting deadlines.
  • Should have played project and team management roles.
  • Facilitate meetings within the team on regular basis.
  • Understand business requirement and analyze different approaches and plan deliverables and milestones for the project.
  • Optimization, maintenance, and support of pipelines.
  • Strong analytical and logical skills.
  • Ability to comfortably tackle new challenges and learn.

External Skills And Expertise

Must have Skills:

  • Scala
  • Spark
  • SQL (Intermediate to advanced level)
  • Spark Streaming
  • AWS preferable/Any cloud
  • Kafka /Kinesis/Any streaming services
  • Object-Oriented Programming
  • Hive, ETL/ELT design experience
  • CICD experience (ETL pipeline deployment)
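The streaming skills above (Spark Streaming, Kafka/Kinesis) commonly reduce to windowed aggregation; a toy tumbling-window count in plain Python (window size and event shape are illustrative, not any framework's API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Toy tumbling-window aggregation of the kind Spark Streaming or
    Kafka Streams performs: bucket (timestamp, key) events into fixed,
    non-overlapping windows and count per (window, key)."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_s) * window_s  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Events as (epoch-seconds, key) pairs; two fall in window [0, 60),
# two in window [60, 120).
events = [(5, "click"), (30, "click"), (65, "click"), (70, "view")]
result = tumbling_window_counts(events, window_s=60)
```

Real streaming engines add what this sketch omits: event-time vs processing-time semantics, watermarks for late data, and fault-tolerant state.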

Good to Have Skills:

  • AWS Certification
  • Git/similar version control tool
  • Knowledge in CI/CD, Microservices


Read more
Solix Technologies

at Solix Technologies

3 recruiters
Sumathi Arramraju
Posted by Sumathi Arramraju
Hyderabad
3 - 7 yrs
₹6L - ₹12L / yr
Hadoop
skill iconJava
HDFS
Spring
Spark
+1 more
Primary Skills required: Java, J2EE, JSP, Servlets, JDBC, Tomcat, Hadoop (HDFS, MapReduce, Hive, HBase, Spark, Impala)
Secondary Skills: Streaming, Archiving, AWS/Azure/Cloud

Role:
·         Should have strong programming and support experience in Java and J2EE technologies
·         Should have good experience in Core Java, JSP, Servlets, JDBC
·         Good exposure to Hadoop development (HDFS, MapReduce, Hive, HBase, Spark)
·         Should have 2+ years of Java experience and 1+ years of experience in Hadoop
·         Should possess good communication skills
·         Web Services or Elastic MapReduce
·         Familiarity with data-loading tools such as Sqoop
·         Good to know: Spark, Storm, Apache HBase
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sukanya Mohan
Posted by Sukanya Mohan
Mumbai
7 - 9 yrs
Best in industry
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Apache Kafka
+1 more

We are looking for an experienced Java Developer with strong proficiency in Kafka and MongoDB to join our dynamic team. The ideal candidate will have a solid background in designing and developing high-performance, scalable, and reliable applications in a microservices architecture. You will be responsible for building real-time data processing systems, integrating various services, and ensuring smooth data flow across systems.

Key Responsibilities:

  • Design, develop, and maintain scalable Java applications with a focus on performance and reliability.
  • Build and maintain Kafka-based real-time data pipelines for handling high-volume, low-latency data.
  • Work with MongoDB to design and optimize database schemas and queries for high throughput and availability.
  • Collaborate with cross-functional teams to define, design, and implement new features and improvements.
  • Troubleshoot and resolve issues related to system performance, scalability, and reliability.
  • Ensure software quality through best practices, including testing, code reviews, and continuous integration.
  • Implement and maintain security best practices in both code and data handling.
  • Participate in agile development cycles, including sprint planning, daily standups, and retrospectives.

Required Skills & Qualifications:

  • 7+ years of experience in Java development, with a strong understanding of core Java concepts (J2EE, multithreading, etc.).
  • Hands-on experience with Apache Kafka, including setting up brokers, producers, consumers, and understanding Kafka Streams.
  • Proficient in working with MongoDB for designing efficient data models, indexing, and optimizing queries.
  • Experience with microservices architecture and RESTful APIs.
  • Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes is a plus.
  • Strong understanding of distributed systems, message-driven architectures, and event streaming.
  • Familiarity with version control systems like Git.
  • Excellent problem-solving skills, with the ability to debug and optimize code for high-performance systems.
  • Experience with CI/CD pipelines and automated testing.
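No broker is needed to illustrate the message-driven style this role centers on. Below is a stdlib-only Java sketch in which a BlockingQueue stands in for a Kafka topic — purely illustrative; production code would use the Kafka producer/consumer client APIs:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// In-process stand-in for a Kafka topic: one producer thread publishes
// events, one consumer thread drains and "processes" them, decoupled
// by a bounded queue.
public class MiniPipeline {

    static List<String> run(List<String> events) {
        BlockingQueue<String> topic = new ArrayBlockingQueue<>(16);
        List<String> consumed = new ArrayList<>();
        final String POISON = "__EOF__"; // end-of-stream marker

        Thread producer = new Thread(() -> {
            try {
                for (String e : events) topic.put(e); // blocks when "topic" is full
                topic.put(POISON);
            } catch (InterruptedException ignored) { }
        });
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    String e = topic.take();
                    if (e.equals(POISON)) break;
                    consumed.add(e.toUpperCase()); // the "processing" step
                }
            } catch (InterruptedException ignored) { }
        });

        producer.start();
        consumer.start();
        try {
            producer.join();
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return consumed;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("order-created", "order-paid")));
    }
}
```

The bounded queue gives simple backpressure — `put()` blocks when the "topic" is full — loosely analogous to how a real producer is throttled, though Kafka adds partitioning, persistence, and consumer groups on top.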
Posspole
Vibha Shashidhar
Posted by Vibha Shashidhar
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹14L / yr
Apache Kafka
IBM DB2
  • Strong knowledge of Kafka development and architecture.
  • Hands-on experience with the KSQL database.
  • Very good communication, analytical, and problem-solving skills.
  • Proven hands-on development experience with Kafka platforms such as Lenses and Confluent.
  • Strong knowledge of the Kafka Connect framework.
  • Very comfortable with shell scripting and Linux commands.
  • Experience with the DB2 database
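The KSQL requirement above typically means writing streaming DDL of the following shape — a hedged sketch only, with invented stream, topic, and column names:

```sql
-- Declare a stream over an existing Kafka topic (names are illustrative).
CREATE STREAM orders_stream (order_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC = 'orders', VALUE_FORMAT = 'JSON');

-- Continuous query: running total per order id, materialized as a table.
CREATE TABLE order_totals AS
  SELECT order_id, SUM(amount) AS total
  FROM orders_stream
  GROUP BY order_id;
```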


Remote only
4 - 8 yrs
₹4L - ₹8L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript

Your Opportunity

Join our dynamic team as a Full Stack Software Developer, where you'll work at the intersection of innovation and leadership. You'll be part of a passionate group of engineers dedicated to building cutting-edge SaaS solutions that solve real customer challenges. This role is perfect for an experienced engineer who thrives on managing teams, collaborating with leadership, and driving product development. You'll work directly with the CEO and senior architects, ensuring that our products meet the highest design and performance standards.

Key Responsibilities

  • Lead, manage, and mentor a team of engineers to deliver scalable, high-performance solutions.
  • Coordinate closely with the CEO and product leadership to align on goals and drive the vision forward.
  • Collaborate with distributed teams to design, build, and refine core product features that serve a global audience.
  • Stay hands-on with coding and architecture, driving key services and technical initiatives from end to end.
  • Troubleshoot, debug, and optimize existing systems to ensure smooth product operations.

Requirements & Technical Skills

  • Bachelor's/Master's/PhD in Computer Science, Engineering, or related fields (B.Tech, M.Tech, BCA, B.E./M.E.).
  • 4 to 8 years of hands-on experience as a software developer, ideally in a SaaS environment.
  • Proven track record in developing scalable, distributed systems and services.
  • Solid understanding of the Software Development Lifecycle (SDLC).
  • Strong programming experience in Spring & Hibernate with Kotlin, React, Nest.js, Python, and shell scripting.
  • Expertise in Unix-based systems, container technologies, and virtual machines.
  • Knowledge of both relational and non-relational databases (MySQL, PostgreSQL, MongoDB, DocumentDB).

Preferred Qualifications

  • Familiarity with Agile methodologies.
  • Experience working with both structured and unstructured data sources.

Soft Skills

  • Strong leadership, coaching, and mentoring capabilities to inspire and guide a team of engineers.
  • Excellent communication skills, with the ability to present complex technical concepts clearly to non-technical stakeholders.
  • Adaptable to new technologies in a fast-paced environment.


Aadvi tech
Sravan Kumar
Posted by Sravan Kumar
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad
4 - 10 yrs
₹15L - ₹30L / yr
Kafka
Cucumber
skill iconJava
Test Automation (QA)
Selenium


Job description:

  • Hands-on skills in the Java programming language
  • Experience testing cloud-native applications, with exposure to Kafka
  • Understanding of the concepts of K8s (Kubernetes), caching, REST/gRPC, and observability
  • Experience with good programming or scripting practices and tools: code review, ADO/Jenkins, etc.
  • Apply expertise in Java, API testing, and Cucumber or other test frameworks to design, develop, and maintain automation test suites.
  • Intimate familiarity with QA concepts: white-/black-/grey-box testing, acceptance/regression tests, system integration tests, performance/stress tests, and security tests
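Cucumber suites of the kind described above are driven by Gherkin feature files. A hedged illustration — the feature, endpoint, and step wording are invented, and the matching step definitions would live in Java:

```gherkin
# Illustrative feature file; service name, endpoint and fields are invented.
Feature: Orders API
  Scenario: Creating an order returns its id
    Given the orders service is running
    When I POST a valid order to "/api/orders"
    Then the response status is 201
    And the response body contains an "orderId"
```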
Freestone Infotech Pvt. Ltd.
Pratima Jadhav
Posted by Pratima Jadhav
Remote only
5 - 8 yrs
Best in industry
Spring Boot
Microservices
Java
J2EE
Hibernate (Java)

Core Experience:

•Experience in Core Java, J2EE, Spring/Spring Boot, Hibernate, Spring REST, Linux, JUnit, Maven, Design Patterns.

• Sound knowledge of RDBMS like MySQL/Postgres, including schema design.

• Exposure to Linux environment.

• Exposure to Docker and Kubernetes.

• Basic knowledge of cloud services from AWS, Azure, or GCP.

• Proficient in general programming, logic, problem solving, data structures & algorithms

• Good analytical, grasping and problem-solving skills.


Secondary Skills:

• Agile / Scrum Development Experience preferred.

• Comfortable working with a microservices architecture and familiarity with NoSQL solutions.

• Experience in Test Driven Development.

• Excellent written and verbal communication skills.

• Hands-on skills in configuration of popular build tools, like Maven and Gradle

• Good knowledge of testing frameworks such as JUnit.

• Good knowledge of coding standards, source code organization and packaging/deploying.

• Good knowledge of current and emerging technologies and trends.


Job Responsibilities:

• Design, Development and Delivery of Java based enterprise-grade applications.

• Ensure best practices, quality and consistency within various design and development phases.

• Develop, test, implement and maintain application software working with established processes.

• Work with QA and help them with test automation.

• Work with the Technical Writer and help them document the features you have developed.

 

Education and Experience:

• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field

Cargill Business Services
Paramjit Kaur
Posted by Paramjit Kaur
Bengaluru (Bangalore)
2 - 6 yrs
Best in industry
Apache Kafka
Kerberos
Zookeeper
Terraform
Linux administration

As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, empowering highly performant, modern data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill's diverse and complex business environments. You will work in a small team who share your passion for building, configuring, and supporting platforms while sharing, learning and growing together.


  • Develop and recommend improvements to standard and moderately complex application support processes and procedures. 
  • Review, analyze and prioritize incoming incident tickets and user requests. 
  • Perform programming, configuration, testing and deployment of fixes or updates for application version releases. 
  • Implement security processes to protect data integrity and ensure regulatory compliance. 
  • Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs. 


MINIMUM QUALIFICATIONS

  • 2-4 years of experience (minimum)
  • Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning
  • Full ecosystem Kafka administration (kafka, zookeeper, kafka-rest, connect)
  • Experience implementing Kerberos security
  • Preferred:
  • Experience in Linux system administration
  • Authentication plugin experience such as basic, SSL, and Kerberos
  • Production incident support including root cause analysis
  • AWS EC2
  • Terraform
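The Kerberos requirement above usually shows up as broker-side SASL/GSSAPI configuration. A hedged sketch of the relevant `server.properties` entries — the hostname is a placeholder, and a real deployment also needs a JAAS config pointing at the broker's keytab:

```properties
# Broker-side Kerberos (SASL/GSSAPI) settings - illustrative, not drop-in.
listeners=SASL_PLAINTEXT://broker1.example.com:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=GSSAPI
sasl.enabled.mechanisms=GSSAPI
sasl.kerberos.service.name=kafka
```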
PortOne
Remote, Pune
2 - 4 yrs
Best in industry
DevOps
Kubernetes
Docker
Amazon Web Services (AWS)
Distributed Systems

PortOne is re-imagining payments in Korea and other international markets. We are a Series B funded startup backed by prominent VC firms Softbank and Hanwa Capital.


https://portone.io/global/en


PortOne provides a unified API for merchants to integrate with and manage all of the payment options available in Korea and SEA Markets - Thailand, Singapore, Indonesia etc. It's currently used by 2000+ companies and processing multi-billion dollars in annualized volume. We are building a team to take this product to international markets, and looking for engineers with a passion for fintech and digital payments.


Culture and Values at PortOne

  • You will be joining a team that stands for Making a difference.
  • You will be joining a culture that identifies more with Sports Teams rather than a 9 to 5 workplace.
  • This will be a remote role that allows you the flexibility to save time on commute
  • You will have peers who are/have:
  • Highly Self Driven with A sense of purpose
  • High Energy Levels - Building stuff is your sport
  • Ownership - Solve customer problems end to end - Customer is your Boss
  • Hunger to learn - Highly motivated to keep developing new tech skill sets



Who you are ?


* You are an athlete and Devops/DevSecOps is your sport.

* Your passion drives you to learn and build stuff and not because your manager tells you to.

* Your work ethic is that of an athlete preparing for your next marathon. Your sport drives you and you like being in the zone.

* You are NOT a clockwatcher renting out your time, and you do NOT have an attitude of "I will do only what is asked for"

* Enjoy solving problems and delighting users, both internally and externally

* Take pride in working on projects to successful completion involving a wide variety of technologies and systems

* Possess strong & effective communication skills and the ability to present complex ideas in a clear & concise way

* Responsible, self-directed, forward thinker, and operates with focus, discipline and minimal supervision

* A team player with a strong work ethic


Experience


* 2+ years of experience working as a DevOps/DevSecOps Engineer

* BE in Computer Science or equivalent combination of technical education and work experience

* Must have actively managed infrastructure components & devops for high quality and high scale products

* Proficient knowledge and experience on infra concepts - Networking/Load Balancing/High Availability

* Experience on designing and configuring infra in cloud service providers - AWS / GCP / AZURE

* Knowledge on Secure Infrastructure practices and designs

* Experience with DevOps, DevSecOps, Release Engineering, and Automation

* Experience with Agile development incorporating TDD / CI / CD practices


Hands on Skills


* Proficient in atleast one high level Programming Language: Go / Java / C

* Proficient in scripting - bash scripting etc. - to build/glue together DevOps/data-pipeline workflows

* Proficient in Cloud Services - AWS / GCP / AZURE

* Hands on experience on CI/CD & relevant tools - Jenkins / Travis / Gitops / SonarQube / JUnit / Mock frameworks

* Hands on experience on the Kubernetes ecosystem & container based deployments - Kubernetes / Docker / Helm Charts / Vault / Packer / Istio / Flyway

* Hands on experience on Infra as code frameworks - Terraform / Crossplane / Ansible

* Version Control & Code Quality: Git / Github / Bitbucket / SonarQube

* Experience on Monitoring Tools: Elasticsearch / Logstash / Kibana / Prometheus / Grafana / Datadog / Nagios

* Experience with RDBMS Databases & Caching services: Postgres / MySql / Redis / CDN

* Experience with Data Pipeline/Workflow tools: Airflow / Kafka / Flink / Pub-Sub

* DevSecOps - Cloud Security Assessment, Best Practices & Automation

* DevSecOps - Vulnerability Assessments/Penetration Testing for Web, Network and Mobile applications

* Preferable to have DevOps/Infra experience for products in the Payments/Fintech domain - Payment Gateways/Bank integrations etc.



What will you do ?


Devops

* Provisioning the infrastructure using Crossplane/Terraform/CloudFormation scripts.

* Creating and Managing the AWS EC2, RDS, EKS, S3, VPC, KMS and IAM services, EKS clusters & RDS Databases.

* Monitor the infra to prevent outages/downtimes and honor our infra SLAs

* Deploy and manage new infra components.

* Update and Migrate the clusters and services.

* Reducing cloud cost by scheduling/downsizing under-utilized instances.

* Collaborate with stakeholders across the organization such as experts in - product, design, engineering

* Uphold best practices in Devops/DevSecOps and Infra management with attention to security best practices
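Provisioning with Terraform, as described above, typically starts from a minimal configuration like this hedged sketch — the region, AMI id, instance size, and tag values are all placeholders:

```hcl
# Minimal, illustrative Terraform sketch of EC2 provisioning.
provider "aws" {
  region = "ap-northeast-2"
}

resource "aws_instance" "api_node" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI id
  instance_type = "t3.medium"

  tags = {
    Name = "portone-api-node"
  }
}
```

`terraform plan` previews the change and `terraform apply` creates the instance; real setups split this across modules with remote state.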


DevSecOps

* Cloud Security Assessment & Automation

* Modify existing infra to adhere to security best practices

* Perform Threat Modelling of Web/Mobile applications

* Integrate security testing tools (SAST, DAST) into CI/CD pipelines

* Incident management and remediation - Monitoring security incidents, recovery from and remediation of the issues

* Perform frequent Vulnerability Assessments/Penetration Testing for Web, Network and Mobile applications

* Ensure the environment is compliant with CIS, NIST, PCI, etc.




Here are examples of apps/features you will be supporting as a Devops/DevSecOps Engineer

* Intuitive, easy-to-use APIs for payment process.

* Integrations with local payment gateways in international markets.

* Dashboard to manage gateways and transactions.

* Analytics platform to provide insights
