Apache Kafka Jobs in Chennai


Apply to 23+ Apache Kafka jobs in Chennai on CutShort.io. Explore the latest Apache Kafka job opportunities across top companies like Google, Amazon & Adobe.

NeoGenCode Technologies Pvt Ltd
Posted by Shivank Bhardwaj
Remote, Bengaluru (Bangalore), Pune, Chennai
10 - 15 yrs
₹10L - ₹55L / yr
Java
Spring Boot
Amazon Web Services (AWS)
Microservices
High-level design
+5 more

We are looking for a highly skilled Solution Architect with a passion for software engineering and deep experience in backend technologies, cloud, and DevOps. This role will be central to designing, managing, and delivering large-scale, scalable solutions.


Core Skills

  • Strong coding and software engineering fundamentals.
  • Experience in large-scale custom-built applications and platforms.
  • Champion of SOLID principles, OO design, and pair programming.
  • Agile, Lean, and Continuous Delivery – CI, TDD, BDD.
  • Frontend experience is a plus.
  • Hands-on with Java, Scala, Golang, Rust, Spark, Python, and JS frameworks.
  • Experience with Docker, Kubernetes, and Infrastructure as Code.
  • Excellent understanding of cloud technologies – AWS, GCP, Azure.


Responsibilities

  • Own all aspects of technical development and delivery.
  • Understand project requirements and create architecture documentation.
  • Ensure adherence to development best practices through code reviews.


NeoGenCode Technologies Pvt Ltd
Posted by Akshay Patil
Bengaluru (Bangalore), Pune, Chennai
10 - 20 yrs
₹30L - ₹60L / yr
Java
Spring Boot
Microservices
Apache Kafka
Amazon Web Services (AWS)
+8 more

📍 Position : Java Architect

📅 Experience : 10 to 15 Years

🧑‍💼 Open Positions : 3+

📍 Work Location : Bangalore, Pune, Chennai

💼 Work Mode : Hybrid

📅 Notice Period : Immediate joiners preferred; up to 1 month maximum

🔧 Core Responsibilities :

  • Lead architecture design and development for scalable enterprise-level applications.
  • Own and manage all aspects of technical development and delivery.
  • Define and enforce best coding practices, architectural guidelines, and development standards.
  • Plan and estimate the end-to-end technical scope of projects.
  • Conduct code reviews, ensure CI/CD, and implement TDD/BDD methodologies.
  • Mentor and lead individual contributors and small development teams.
  • Collaborate with cross-functional teams, including DevOps, Product, and QA.
  • Engage in high-level and low-level design (HLD/LLD), solutioning, and cloud-native transformations.

🛠️ Required Technical Skills :

  • Strong hands-on expertise in Java, Spring Boot, Microservices architecture
  • Experience with Kafka or similar messaging/event streaming platforms
  • Proficiency in cloud platforms: AWS and Azure (must-have)
  • Exposure to frontend technologies (nice-to-have)
  • Solid understanding of HLD, system architecture, and design patterns
  • Good grasp of DevOps concepts, Docker, Kubernetes, and Infrastructure as Code (IaC)
  • Agile/Lean development, Pair Programming, and Continuous Integration practices
  • Polyglot mindset is a plus (Scala, Golang, Python, etc.)

🚀 Ideal Candidate Profile :

  • Currently working in a product-based environment
  • Already functioning as an Architect or Principal Engineer
  • Proven track record as an Individual Contributor (IC)
  • Strong engineering fundamentals with a passion for scalable software systems
  • No compromise on code quality, craftsmanship, and best practices

🧪 Interview Process :

  1. Round 1: Technical pairing round
  2. Rounds 2 & 3: Technical rounds with panel (code pairing + architecture)
  3. Final Round: HR and offer discussion
Zenius IT Services Pvt Ltd
Posted by Sunita Pradhan
Bengaluru (Bangalore), Chennai
5 - 10 yrs
₹10L - ₹15L / yr
Snowflake
SQL
Data integration tools
ETL/ELT Pipelines
SQL Queries
+5 more

Job Summary


We are seeking a skilled Snowflake Developer to design, develop, migrate, and optimize Snowflake-based data solutions. The ideal candidate will have hands-on experience with Snowflake, SQL, and data integration tools to build scalable and high-performance data pipelines that support business analytics and decision-making.


Key Responsibilities:


Develop and implement Snowflake data warehouse solutions based on business and technical requirements.

Design, develop, and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and processing.

Write and optimize complex SQL queries for data retrieval, performance enhancement, and storage optimization.

Collaborate with data architects and analysts to create and refine efficient data models.

Monitor and fine-tune Snowflake query performance and storage optimization strategies for large-scale data workloads.

Ensure data security, governance, and access control policies are implemented following best practices.

Integrate Snowflake with various cloud platforms (AWS, Azure, GCP) and third-party tools.

Troubleshoot and resolve performance issues within the Snowflake environment to ensure high availability and scalability.

Stay updated on Snowflake best practices, emerging technologies, and industry trends to drive continuous improvement.
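
For illustration only, the sketch below shows the kind of SQL-driven retrieval work described above, run against Snowflake over JDBC from Java. The account URL, warehouse, database, schema and table names are hypothetical placeholders, and it assumes the snowflake-jdbc driver is on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class SnowflakeQueryExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("user", System.getenv("SNOWFLAKE_USER"));
            props.put("password", System.getenv("SNOWFLAKE_PASSWORD"));
            props.put("warehouse", "ANALYTICS_WH"); // hypothetical warehouse
            props.put("db", "SALES_DB");            // hypothetical database
            props.put("schema", "PUBLIC");

            // Placeholder account locator; the snowflake-jdbc driver must be on the classpath.
            String url = "jdbc:snowflake://<account>.snowflakecomputing.com";

            try (Connection conn = DriverManager.getConnection(url, props);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT region, SUM(amount) AS total FROM orders GROUP BY region")) {
                while (rs.next()) {
                    // Aggregate per region from a hypothetical "orders" table.
                    System.out.println(rs.getString("region") + " -> " + rs.getBigDecimal("total"));
                }
            }
        }
    }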


Qualifications:

Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.


Experience:


6+ years of experience in data engineering, ETL development, or similar roles.

3+ years of hands-on experience in Snowflake development.


Technical Skills:


Strong proficiency in SQL, Snowflake Schema Design, and Performance Optimization.

Experience with ETL/ELT tools like dbt, Talend, Matillion, or Informatica.

Proficiency in Python, Java, or Scala for data processing.

Familiarity with cloud platforms (AWS, Azure, GCP) and integration with Snowflake.

Experience with data governance, security, and compliance best practices.

Strong analytical, troubleshooting, and problem-solving skills.

Communication: Excellent communication and teamwork abilities, with a focus on collaboration across teams.


Preferred Skills:


Snowflake Certification (e.g., SnowPro Core or Advanced).

Experience with real-time data streaming using tools like Kafka or Apache Spark.

Hands-on experience with CI/CD pipelines and DevOps practices in data environments.

Familiarity with BI tools like Tableau, Power BI, or Looker for data visualization and reporting.

Wekan Enterprise Solutions
Posted by Deepak N
Bengaluru (Bangalore), Chennai
12 - 22 yrs
Best in industry
NodeJS (Node.js)
MongoDB
Microservices
JavaScript
TypeScript
+3 more

Architect


Experience - 12+ yrs


About Wekan Enterprise Solutions


Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT and Cloud environments, we have an extensive track record of helping Fortune 500 companies modernize their most critical legacy and on-premises applications, migrating them to the cloud and leveraging the most cutting-edge technologies.

 

Job Description

We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in fast-paced, challenging environments, leading technical teams, designing system architecture, and reviewing peer code. They should be interested in constantly upskilling, learning new technologies, and expanding their domain knowledge into new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?

You will be working on complex data migrations, modernizing legacy applications and building new applications on the cloud for large enterprise and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams and MongoDB Solutions Architects.

Location - Chennai or Bangalore


●  Relevant experience of 12+ years building high-performance applications, with at least 3+ years as an architect.

●  Good problem-solving skills

●  Strong mentoring capabilities

●  Good understanding of the software development life cycle

●  Strong experience in system design and architecture

●  Strong focus on the quality of work delivered

●  Excellent verbal and written communication skills

 

Required Technical Skills

 

● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.

● Strong experience with appropriate framework(s).

● Well-versed in monolithic and microservices architectures.

● Hands-on experience with data modeling on MongoDB and any other relational or NoSQL databases.

● Experience working with third-party integrations ranging from authentication to cloud services.

● Hands-on experience with Kafka or RabbitMQ.

● Hands-on experience with CI/CD pipelines and at least one cloud provider - AWS / GCP / Azure.

● Strong experience writing and maintaining clear documentation.

  

Good to have skills:

 

●  Experience working with frontend technologies - React.js, Vue.js or Angular.

●  Extensive experience consulting with customers directly for defining architecture or system design.

●  Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies.

Read more
Chennai
5 - 7 yrs
₹15L - ₹25L / yr
Apache Kafka
Google Cloud Platform (GCP)
BCP
DevOps

Job description

 Location: Chennai, India

 Experience: 5+ Years

 Certification: Kafka Certified (Mandatory); Additional Certifications are a Plus


Job Overview:

We are seeking an experienced DevOps Engineer specializing in GCP Cloud Infrastructure Management and Kafka Administration. The ideal candidate should have 5+ years of experience in cloud technologies, Kubernetes, and Kafka, with a mandatory Kafka certification.


Key Responsibilities:

Cloud Infrastructure Management:

· Manage and update Kubernetes (K8s) on GKE.

· Monitor and optimize K8s resources, including pods, storage, memory, and costs.

· Oversee the general monitoring and maintenance of environments using:

  - OpenSearch / Kibana
  - KafkaUI
  - BGP
  - Grafana / Prometheus


Kafka Administration:

· Manage Kafka brokers and ACLs.

· Hands-on experience in Kafka administration (preferably Confluent Kafka).

· Independently debug, optimize, and implement Kafka solutions based on developer and business needs.
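
As one hedged illustration of the broker/ACL work described above, Kafka ships a Java AdminClient that can manage ACLs programmatically; a minimal sketch, where the broker address, principal and topic name are hypothetical placeholders:

    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.common.acl.AccessControlEntry;
    import org.apache.kafka.common.acl.AclBinding;
    import org.apache.kafka.common.acl.AclOperation;
    import org.apache.kafka.common.acl.AclPermissionType;
    import org.apache.kafka.common.resource.PatternType;
    import org.apache.kafka.common.resource.ResourcePattern;
    import org.apache.kafka.common.resource.ResourceType;

    public class GrantReadAcl {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder broker

            try (AdminClient admin = AdminClient.create(props)) {
                // Allow a hypothetical principal to read the "orders" topic from any host.
                AclBinding binding = new AclBinding(
                        new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
                        new AccessControlEntry("User:analytics-app", "*",
                                AclOperation.READ, AclPermissionType.ALLOW));
                admin.createAcls(List.of(binding)).all().get();
            }
        }
    }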


Other Responsibilities:

· Perform ad-hoc investigations to troubleshoot and enhance infrastructure.

· Manage PostgreSQL databases efficiently.

· Administer Jenkins pipelines, supporting CI/CD implementation and maintenance.


Required Skills & Qualifications:

· Kafka Certified Engineer (Mandatory).

· 5+ years of experience in GCP DevOps, Cloud Infrastructure, and Kafka Administration.

· Strong expertise in Kubernetes (K8s), Google Kubernetes Engine (GKE), and cloud environments.

· Hands-on experience with monitoring tools like Grafana, Prometheus, OpenSearch, and Kibana.

· Experience managing PostgreSQL databases.

· Proficiency in Jenkins pipeline administration.

· Ability to work independently and collaborate with developers and business stakeholders.

If you are passionate about DevOps, Cloud Infrastructure, and Kafka, and meet the above qualifications, we encourage you to apply!


Rigel Networks Pvt Ltd
Posted by Minakshi Soni
Bengaluru (Bangalore), Pune, Mumbai, Chennai
8 - 12 yrs
₹8L - ₹10L / yr
Amazon Web Services (AWS)
Terraform
Amazon Redshift
Redshift
Snowflake
+16 more

Dear Candidate,


We are urgently hiring an AWS Cloud Engineer for our Bangalore location.

Position: AWS Cloud Engineer

Location: Bangalore

Experience: 8-11 yrs

Skills: Aws Cloud

Salary: Best in industry (20-25% hike on current CTC)

Note:

Only candidates who can join immediately or within 15 days will be preferred.

Only candidates from Tier 1 companies will be shortlisted and selected.

Candidates with a notice period longer than 30 days will be rejected during screening.

Offer shoppers will be rejected.


Job description:

Title: AWS Cloud Engineer

Prefer BLR / HYD – else any location is fine

Work Mode: Hybrid – based on HR rule (currently 1 day per month)


Shift timings: 24x7 (work in shifts on a rotational basis)

Total experience: 8+ years, with at least 5 years of relevant experience required.

Must have: AWS platform, Terraform, Redshift / Snowflake, Python / Shell scripting



Experience and Skills Requirements:


Experience:

8 years of experience in a technical role working with AWS


Mandatory

Technical troubleshooting and problem solving

AWS management of large-scale IaaS/PaaS solutions

Cloud networking and security fundamentals

Experience using containerization in AWS

Working data warehouse knowledge (Redshift and Snowflake preferred)

Working with IaC - Terraform and CloudFormation

Working understanding of scripting languages, including Python and Shell

Collaboration and communication skills

Highly adaptable to changes in a technical environment

 

Optional

Experience using monitoring and observability toolsets, including Splunk and Datadog

Experience using GitHub Actions

Experience using AWS RDS/SQL-based solutions

Experience working with streaming technologies, including Kafka and Apache Flink

Experience working with ETL environments

Experience working with the Confluent Cloud platform


Certifications:


Minimum

AWS Certified SysOps Administrator – Associate

AWS Certified DevOps Engineer - Professional



Preferred


AWS Certified Solutions Architect – Associate


Responsibilities:


Responsible for the technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a shared managed service.


The following is a list of expected responsibilities:


To manage and support a customer’s AWS platform

To be technically hands-on

Provide incident and problem management on the AWS IaaS and PaaS platform

Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner

Actively monitor the AWS platform for technical issues

To be involved in the resolution of technical incident tickets

Assist in the root cause analysis of incidents

Assist with improving efficiency and processes within the team

Examine traces and logs

Work with third-party suppliers and AWS to jointly resolve incidents


Good to have:


Confluent Cloud

Snowflake




Best Regards,

Minakshi Soni

Executive - Talent Acquisition (L2)

Rigel Networks

Worldwide Locations: USA | HK | IN 

Egen Solutions
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin
+3 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.
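
For a flavour of that stack, here is a minimal Spring Boot Kafka consumer sketch; the topic and group id are hypothetical, and it assumes the spring-kafka dependency inside a Spring Boot application with a configured broker:

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    public class OrderEventsListener {

        // Consumes messages from a hypothetical "order-events" topic.
        @KafkaListener(topics = "order-events", groupId = "order-service")
        public void onMessage(String payload) {
            // In a real service this would deserialize the payload and hand off to domain logic.
            System.out.println("Received: " + payload);
        }
    }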


Required Experience:

  • Minimum of a Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies.
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries.
  • Strong experience with unit and integration testing of Spring Boot APIs.
  • Strong understanding and production experience of RESTful APIs and microservice architecture.
  • Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them.

Nice-to-haves (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure


Mobile Programming LLC
Posted by Sukhdeep Singh
Chennai
4 - 7 yrs
₹13L - ₹15L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+10 more

Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16 - 18 LPA

Responsibilities:

  • Parse data using Python; create dashboards in Tableau.
  • Use Jenkins for Airflow pipeline creation and CI/CD maintenance.
  • Migrate DataStage jobs to Snowflake and optimize performance.
  • Work with HDFS, Hive, Kafka, and basic Spark.
  • Develop Python scripts for data parsing, quality checks, and visualization.
  • Conduct unit testing and web application testing.
  • Implement Apache Airflow and handle production migration.
  • Apply data warehousing techniques for data cleansing and dimension modeling.

Requirements:

  • 4+ years of experience as a Platform Engineer.
  • Strong Python skills, knowledge of Tableau.
  • Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
  • Proficient in Unix Shell Scripting and SQL.
  • Familiarity with ETL tools like DataStage and DMExpress.
  • Understanding of Apache Airflow.
  • Strong problem-solving and communication skills.

Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. Budget for this position is 16 - 18 LPA.

Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
Jenkins
GitHub
+16 more

DevOps Architect 

Experience: 10-12+ years of relevant DevOps experience
Locations: Bangalore, Chennai, Pune, Hyderabad, Jaipur.

Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering or equivalent is required.
• Certifications in specific areas are desired.

Technical skillset (skill - proficiency level):

  • Build tools (Ant or Maven) - Expert
  • CI/CD tools (Jenkins or GitHub CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
  • Infrastructure as Code (Terraform, Helm charts, etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (Linux) - Expert
  • Cluster deployment (Kubernetes) & maintenance - Expert
  • Programming (Java) - Intermediate
  • Application types for DevOps (streaming such as Spark and Kafka; big data such as Hadoop, etc.) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring & reporting (Prometheus, Grafana, PagerDuty, etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, SVN, VSTS, etc.)
• Continuous Integration (like Jenkins, Bamboo, VSTS)
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)

Roles and Responsibilities

• The DevOps architect should automate processes with appropriate tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operation and developers teams to solve their problems.
• Supervising, Examining and Handling technical operations.
• Providing a DevOps Process and Operations.
• Capacity to handle teams with leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
codersbrain
Posted by Tanuj Uppal
Bengaluru (Bangalore), Chennai, Delhi, Mumbai
5 - 10 yrs
₹1L - ₹10L / yr
Apache Kafka
Spring Boot
Microservices
Kubernetes
Kafka
Job Position: Kafka Developer
Relevant Experience: 5+ Years
Payroll Company: Codersbrain Technology Pvt. Ltd.
Location: PAN India
Notice Period: Immediate to 15 Days.
Client: IBM
 
Description:
Total Years of Experience: 5+ yrs
Relevant Years of Experience: 5+ yrs
Mandatory Skills for screening (limit to top 5 and include version): Kafka
Good to have (not mandatory):
Detailed Job Description: A Kafka Developer should have:
  • 4 to 5 years of development experience using Confluent Kafka
  • 4 to 5 years of experience in developing microservices using Spring Boot and Kafka
  • Strong experience in developing CI/CD for Spring Boot applications and deploying them in a Kubernetes environment; experience in Kubernetes is a MUST
  • Experience in using MQ and Oracle source and sink connectors
  • Experience in Kafka performance testing
  • Nice to have: experience in OpenShift
  • Nice to have: Kafka troubleshooting skills
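
As a hedged illustration of the source/sink connector work listed above: Kafka Connect connectors are typically registered through the Connect REST API. The sketch below posts a hypothetical Confluent JDBC source connector configuration using Java 11's HttpClient; the connector name, database URL and Connect host are placeholders, and it assumes the JDBC connector plugin is installed on the Connect workers:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RegisterOracleSource {
        public static void main(String[] args) throws Exception {
            // Hypothetical JDBC source connector config; assumes the Confluent
            // JDBC connector plugin is available on the Connect workers.
            String config = "{"
                    + "\"name\": \"oracle-orders-source\","
                    + "\"config\": {"
                    + "  \"connector.class\": \"io.confluent.connect.jdbc.JdbcSourceConnector\","
                    + "  \"connection.url\": \"jdbc:oracle:thin:@//db-host:1521/ORCL\","
                    + "  \"mode\": \"incrementing\","
                    + "  \"incrementing.column.name\": \"ID\","
                    + "  \"topic.prefix\": \"oracle-\""
                    + "}}";

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://connect:8083/connectors")) // placeholder Connect host
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(config))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }
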
DFCS Technologies
Agency job via dfcs Technologies, posted by SheikDawood Ali
Remote, Chennai, Anywhere India
1 - 5 yrs
₹9L - ₹14L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more
  • Create and maintain optimal data pipeline architecture,
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field, and who has experience using the following software/tools:
    • Big data tools: Hadoop, Spark, Kafka, etc.
    • Relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
    • AWS cloud services: EC2, EMR, RDS, Redshift.
    • Stream-processing systems: Storm, Spark Streaming, etc.
    • Object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
Our client is in the field of IT servicing and IT consulting
Agency job via Sapwood Ventures, posted by Sonal Trivedi
Chennai
3 - 7 yrs
₹10L - ₹15L / yr
API
JSON
Apache Kafka
Agile/Scrum
Sonar
+1 more
Role and Responsibilities
  • Analyzes, designs, develops, codes and implements programs in one or more programming languages, for Web and Rich Internet Applications.
  • Supports applications with an understanding of system integration, test planning, scripting, and troubleshooting.
  • Assesses the health and performance of software applications and databases.
  • Establishes, participates, and maintains relationships with business units, customers and subject matter experts in order to remain apprised of direction, project status, architectural and technology trends, risks, and functional/integration issues.
  • Defines specifications and develops programs, modifies existing programs, prepares test data, and prepares functional specifications.
  • Analyzes program and application performance using various programming languages, tools and techniques.
  • Provides guidance to non-technical staff in using software and hardware systems most effectively and efficiently.
  • Reviews project proposals, evaluates alternatives, provides estimates and makes recommendations.
  • Designs and defines specifications for systems.
  • Identifies potential process improvement areas and suggests options and recommends approaches
Candidate Profile
  • Knowledgeable in software development and design patterns
  • Swagger, RabbitMQ, Kafka
  • Good API skills with technologies such as REST web services and Spring-based technology
  • Good knowledge of container-based application configuration and deployment; preferred environment is OpenShift
  • Experience creating unit tests using JUnit
  • Experience with markup languages such as JSON and YAML
  • Experience using quality and security scan tools such as Sonar and Fortify
  • Experience with Agile methodology
  • 7-10 years of experience in software development.
Location: Chennai 
 

 
Telecom Client
Agency job via Eurka IT SOL, posted by Srikanth a
Chennai
5 - 13 yrs
₹9L - ₹28L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more
  • Demonstrable experience owning and developing big data solutions, using Hadoop, Hive/HBase, Spark, Databricks, ETL/ELT, for 5+ years

·       10+ years of Information Technology experience, preferably with Telecom / wireless service providers.

·       Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability and releasability; rapid prototyping, data modeling, and decentralized innovation

  • DataOps mindset: allowing the architecture of a system to evolve continuously over time, while simultaneously supporting the needs of current users
  • Create and maintain Architectural Runway, and Non-Functional Requirements.
  • Design for Continuous Delivery Pipeline (CI/CD data pipeline) and enables Built-in Quality & Security from the start.

·       To be able to demonstrate an understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, Zachman Architecture Framework, etc.

·       The ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers

·       Demonstrated ability to work collaboratively

·       Excellent written, verbal and social skills - You will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.)

·       Ability to work in a fast paced, multiple project environment on an independent basis and with minimal supervision

·       Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, on-prem SQL data warehouse; BSS, OSS & Enterprise Support Systems

NSEIT
Mumbai, Chennai
3 - 6 yrs
₹10L - ₹24L / yr
skill iconReact.js
skill iconJava
skill iconJavascript
Fullstack Developer
IBM WebSphere MQ
+9 more
FSD Software Developer

BDD

Competence Requirement:

1. 3+ years of experience in developing backend Java applications.

2. Experience with Java 11 will be good to have.

3. Experience in front-end development is desired.

4. A self-driven attitude along with a sense of structure and creativity.

5. Excellent written and spoken English.

6. Bachelor's degree in computer science, information technology or software engineering, or equivalent.

7. Hands-on knowledge and experience of developing financial systems and understanding of financial concepts.



Responsibilities:

1. Write high quality code that solves difficult problems in a highly distributed system with extreme demands on resilience and quality.

2. Perform sufficient tests to ensure at least 80% code coverage.

3. Participate in and contribute to scrum ceremonies, e.g. daily stand-ups, sprint planning, demos and retros.

4. Will be involved in several stages of the product life cycle: design, implementation and testing. At times, also release and deployment.

5. Participate in design discussions and decisions.



Good to have skills:

1. Primary skills - Java 8, Spring Boot, React, MQ/messaging services & APIs (Java 11, ReactiveX, REST, Swagger/OpenAPI, React/Redux, Gradle, Git, Bitbucket, Jenkins)

2. High-performance transactional platform

3. Back-end development and middleware

4. Modern UI based on React

5. Continuous delivery and automation

6. Domain - Capital Markets / Investment Banking preferred; BFSI acceptable
VIMANA
Posted by Loshy Chandran
Remote, Chennai
2 - 5 yrs
₹10L - ₹20L / yr
Data engineering
Data Engineer
Apache Kafka
Big Data
Java
+4 more

We are looking for passionate, talented and super-smart engineers to join our product development team. If you are someone who innovates, loves solving hard problems, and enjoys end-to-end product development, then this job is for you! You will be working with some of the best developers in the industry in a self-organising, agile environment where talent is valued over job title or years of experience.

 

Responsibilities:

  • You will be involved in end-to-end development of VIMANA technology, adhering to our development practices and expected quality standards.
  • You will be part of a highly collaborative Agile team which passionately follows SAFe Agile practices, including pair-programming, PR reviews, TDD, and Continuous Integration/Delivery (CI/CD).
  • You will be working with cutting-edge technologies and tools for stream processing using Java, NodeJS and Python, using frameworks like Spring, RxJS etc.
  • You will be leveraging big data technologies like Kafka, Elasticsearch and Spark, processing more than 10 billion events per day to build a maintainable system at scale (see the sketch after this list).
  • You will be building Domain Driven APIs as part of a micro-service architecture.
  • You will be part of a DevOps culture where you will get to work with production systems, including operations, deployment, and maintenance.
  • You will have an opportunity to continuously grow and build your capabilities, learning new technologies, languages, and platforms.
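
A minimal sketch of the kind of stream processing referred to above, using Kafka Streams to count events per key; the topic names, application id and broker address are hypothetical, and it assumes the kafka-streams dependency:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Produced;

    public class EventCounter {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-counter");  // hypothetical app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            builder.<String, String>stream("device-events") // hypothetical input topic
                   .groupByKey()                            // count events per device key
                   .count()
                   .toStream()
                   .to("event-counts", Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }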

 

Requirements:

  • Undergraduate degree in Computer Science or a related field, or equivalent practical experience.
  • 2 to 5 years of product development experience.
  • Experience building applications using Java, NodeJS, or Python.
  • Deep knowledge in Object-Oriented Design Principles, Data Structures, Dependency Management, and Algorithms.
  • Working knowledge of message queuing, stream processing, and highly scalable Big Data technologies.
  • Experience in working with Agile software methodologies (XP, Scrum, Kanban), TDD and Continuous Integration (CI/CD).
  • Experience using NoSQL databases like MongoDB or Elasticsearch.
  • Prior experience with container orchestrators like Kubernetes is a plus.
About VIMANA

We build products and platforms for the Industrial Internet of Things. Our technology is being used around the world in mission-critical applications - from improving the performance of manufacturing plants, to making electric vehicles safer and more efficient, to making industrial equipment smarter.

Please visit https://govimana.com/ to learn more about what we do.

Why Explore a Career at VIMANA
  • We recognize that our dedicated team members make us successful and we offer competitive salaries.
  • We are a workplace that values work-life balance, provides flexible working hours, and full time remote work options.
  • You will be part of a team that is highly motivated to learn and work on cutting edge technologies, tools, and development practices.
  • Bon Appetit! Enjoy catered breakfasts, lunches and free snacks!

VIMANA Interview Process
We usually target completing all the interviews within a week and provide prompt feedback to the candidate. As of now, all interviews are conducted online due to the COVID situation.

1. Telephonic screening (30 min)

A 30-minute telephonic interview to understand and evaluate the candidate's fit with the job role and the company.
Clarify any queries regarding the job/company.
Give an overview of further interview rounds.

2. Technical Rounds

This would be a deep technical round to evaluate the candidate's technical capability pertaining to the job role.

3. HR Round

The candidate's team and cultural fit will be evaluated during this round.

We would proceed with releasing the offer if the candidate clears all the above rounds.

Note: In certain cases, we might schedule additional rounds if needed before releasing the offer.
An IT Services Major, hiring for a leading insurance player.
Agency job via Indventur Partner, posted by Vanshika Kaur
Chennai
3 - 5 yrs
₹5L - ₹10L / yr
Big Data
Hadoop
Apache Kafka
Apache Hive
Microsoft Windows Azure
+1 more

Client: An IT Services Major, hiring for a leading insurance player.

 

 

Position: SENIOR CONSULTANT

 

Job Description:

 

  • Azure Admin - Senior Consultant with HDInsight (Big Data)

 

Skills and Experience

 

  • Microsoft Azure Administrator certification
  • Big data project experience on the Azure HDInsight stack, including big data processing frameworks such as Spark, Hadoop, Hive, Kafka or HBase.
  • Preferred: Insurance or BFSI domain experience
  • 5 years of experience is required.
digital india payments limited
Posted by Bhavani Pendyala
Chennai, Hyderabad
3 - 7 yrs
₹4L - ₹12L / yr
Java
NodeJS (Node.js)
Fullstack Developer
React.js
Redux/Flux
+13 more
Technology Requirements:
  1. Extensive experience in JavaScript / NodeJS on the back end
  2. Front-end frameworks such as Bootstrap, Pug, jQuery
  3. Experience with web frameworks like ExpressJS, Webpack
  4. Experience with Nginx, Redis, Apache Kafka and MQTT
  5. Experience with MongoDB
  6. Experience with version control systems like Git / Mercurial
  7. Sound knowledge of software engineering best practices
  8. Sound knowledge of RESTful API design
  9. Working knowledge of automated testing tools
  10. Experience in maintaining production servers (optional)
  11. Experience with Azure DevOps (optional)
Soft Skills:
  1. Experience in the digital payments or financial services industry is a plus.
  2. Participation in strategic project-planning meetings.
  3. Be involved in and contribute to the overall application lifecycle.
  4. Collaborate with external development teams.
  5. Define and communicate technical and design requirements, understand workflows, and write code as per requirements.
  6. Develop functional and sustainable web applications with clean code.
  7. Focus on coding and debugging.
Online ENT Healthcare giant in India
Agency job via The Hub, posted by Sridevi Viswanathan
Remote, Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Pune
3 - 8 yrs
₹5L - ₹17L / yr
Java
Spring Boot
Apache Kafka
MySQL
+1 more

Software Development Engineer:

Major Responsibilities:

  • Translation of complex functional requirements into technical requirements, implementing and maintaining a coherent and progressive development strategy for our product line
  • Design, develop and maintain complex systems using best-of-breed development practices and technology.
  • Responsible for the overall software development life cycle.
  • Delivery of high-quality, scalable and extensible systems and applications on time and on budget.
  • Adoption and evolution of the software engineering practices and tools within the organization.
  • Keep in sync with the latest technology developments and open-source offerings. Evaluate and adopt them to solve the organization's business problems.
  • Collaborate with other technology and business teams within the organization to provide efficient, robust solutions to problems.
  • Drive and manage the bug triage process.
  • Report on the status of product delivery and quality to management, customer support and product teams.

Desired Skills

  • Strong programming, debugging, and problem-solving skills
  • Strong understanding of data structures and algorithms
  • Sound understanding of object-oriented programming and excellent software design skills.
  • Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
  • Strong understanding of database design and SQL (MySQL/MariaDB) development
  • Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra or any other NoSQL database
  • Knowledge of design patterns; experience with large-scale applications is good to have
  • Should have experience with Apache Kafka, RabbitMQ or other queueing systems (see the sketch below).
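
A minimal Kafka producer sketch illustrating the queueing experience the last point asks for; the broker address, topic and payload are hypothetical, and it assumes the kafka-clients dependency:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // placeholder broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Key by a hypothetical order id so all events for one order land on one partition.
                producer.send(new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}"));
                producer.flush();
            }
        }
    }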

Ideal Experience

  • 3 to 8 years of industry experience.
  • Bachelor’s or Master’s degree in Computer Science / IT
  • Drive discussions to create/improve product, process and technology
  • Provide end-to-end solution and design details
  • Lead development of formalized solution methodologies
  • Passion to work in a startup-like environment

Personal Characteristics

  • Passion and commitment
  • Strong and excellent software design intellect
  • High integrity
  • Self-starter
Maveric Systems
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytical platform leveraging Big Data technologies and transforming the legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and the role will play an important part in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience, with 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL & Linux shell scripting
  • Bachelor’s/Master’s/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning and organizing skills, matched with the ability to manage stress, time, and people effectively
  • Proven experience in coordination of many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor as the situation requires
  • Should be part of Scrum discussions and take requirements
  • Adhere to the Scrum timeline and deliver accordingly
  • Participate in a team environment for the design, development and implementation
  • Should take on L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing. Take feedback and provide necessary remediation/recommendations in time.
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g. BA team, testing and business team) and any other relevant stakeholders
Retail Marketing
Agency job via Abbioc, posted by Abbiocs HR
Chennai
4 - 9 yrs
₹1L - ₹12L / yr
Java
Data Structures
Algorithms
C++
Apache Kafka
+10 more

Requires a bachelor's degree in an area of specialty and experience in the field or in a related area. Familiar with standard concepts, practices, and procedures within a particular field. Relies on experience and judgment to plan and accomplish goals. Performs a variety of tasks. A degree of creativity and latitude is required. Typically reports to a supervisor or manager.

Designs, develops, and implements web-based Java applications to support business requirements. Follows approved life cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.

 

Additional Job Details:

Strong in Java, Spring, Spring Boot, REST and developing microservices.

Knowledge of or experience with Cassandra preferred

Knowledge of or experience with Kafka (good to have, but not a must)

 

Good to know:

Reporting tools like Splunk/Grafana

Protobuf

Python/Ruby

Lymbyc
Posted by Venky Thiriveedhi
Bengaluru (Bangalore), Chennai
3 - 5 yrs
₹6L - ₹8L / yr
Microservices
Java
Apache Kafka
- 3+ years of experience in building complex, highly scalable, high-volume, low-latency enterprise applications using languages such as Java, NodeJS, Go and/or Scala
- Strong experience in building microservices using technologies like Spring Boot, Spring Cloud, Netflix OSS, Zuul
- Deep understanding of microservices design patterns, service registry and discovery, and externalization of configurations
- Experience in message streaming and processing technologies such as Kafka, Spark, Storm, gRPC or other equivalent technologies
- Experience with one or more reactive microservice tools and techniques such as Akka, Vert.x, ReactiveX
- Strong experience in the creation, management and consumption of REST APIs leveraging Swagger, Postman, and API gateways (such as MuleSoft, Apigee)
- Strong knowledge of data modelling, querying and performance tuning of big-data stores (MongoDB, Elasticsearch, Redis, etc.) and/or RDBMSs (Oracle, PostgreSQL, MySQL, etc.)
- Experience working with Agile/Scrum-based teams that utilize Continuous Integration/Continuous Delivery processes using Git, Maven, Jenkins, etc.
- Experience in container-based (Docker/Kubernetes) deployment and management
- Experience in using AWS/GCP/Azure-based cloud infrastructure
- Knowledge of Test-Driven Development and test automation skills with JUnit/TestNG
- Knowledge of security frameworks, concepts and technologies like Spring Security, OAuth2, SAML, SSO, and Identity and Access Management
Lymbyc
Posted by Venky Thiriveedhi
Bengaluru (Bangalore), Chennai
4 - 8 yrs
₹9L - ₹14L / yr
Apache Spark
Apache Kafka
Druid Database
Big Data
Apache Sqoop
+5 more
Key skill set: Apache NiFi, Kafka Connect (Confluent), Sqoop, Kylo, Spark, Druid, Presto, RESTful services, Lambda/Kappa architectures

Responsibilities:
- Build a scalable, reliable, operable and performant big data platform for both streaming and batch analytics
- Design and implement data aggregation, cleansing and transformation layers

Skills:
- Around 4+ years of hands-on experience designing and operating large data platforms
- Experience in big data ingestion, transformation and stream/batch processing technologies using Apache NiFi, Apache Kafka, Kafka Connect (Confluent), Sqoop, Spark, Storm, Hive, etc.
- Experience in designing and building streaming data platforms in Lambda and Kappa architectures
- Working experience with at least one NoSQL or OLAP data store such as Druid, Cassandra, Elasticsearch or Pinot
- Experience with at least one data warehousing tool such as Redshift, BigQuery or Azure SQL Data Warehouse
- Exposure to other data ingestion, data lake and querying frameworks like Marmaray, Kylo, Drill, Presto
- Experience in designing and consuming microservices
- Exposure to security and governance tools like Apache Ranger and Apache Atlas
- Any contributions to open source projects are a plus
- Experience in performance benchmarks is a plus
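
As a rough sketch of the aggregation/cleansing layers mentioned above, a minimal Spark (Java API) batch job; the lake paths and column names are hypothetical assumptions, and it assumes the spark-sql dependency:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.to_timestamp;

    public class CleanseJob {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("cleanse")
                    .master("local[*]") // local master for testing; omit when submitting to a cluster
                    .getOrCreate();

            // Hypothetical lake paths and columns.
            Dataset<Row> raw = spark.read().json("s3://data-lake/raw/events/");
            Dataset<Row> cleansed = raw
                    .filter(col("userId").isNotNull())         // drop incomplete records
                    .withColumn("ts", to_timestamp(col("ts"))) // normalize timestamps
                    .dropDuplicates("eventId");                // de-duplicate by event id

            cleansed.write().mode("overwrite").parquet("s3://data-lake/clean/events/");
            spark.stop();
        }
    }
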
GeakMinds Technologies Pvt Ltd
Posted by John Richardson
Chennai
1 - 5 yrs
₹1L - ₹6L / yr
Hadoop
Big Data
HDFS
Apache Sqoop
Apache Flume
+2 more
• Looking for a Big Data Engineer with 3+ years of experience.
• Hands-on experience with MapReduce-based platforms like Pig, Spark, Shark.
• Hands-on experience with data pipeline tools like Kafka, Storm, Spark Streaming.
• Store and query data with Sqoop, Hive, MySQL, HBase, Cassandra, MongoDB, Drill, Phoenix, and Presto.
• Hands-on experience in managing Big Data on a cluster with HDFS and MapReduce.
• Handle streaming data in real time with Kafka, Flume, Spark Streaming, Flink, and Storm.
• Experience with Azure cloud, Cognitive Services, and Databricks is preferred.