Apache Kafka Jobs in Chennai


Apply to 17+ Apache Kafka Jobs in Chennai on CutShort.io. Explore the latest Apache Kafka Job opportunities across top companies like Google, Amazon & Adobe.

Egen Solutions
Anshul Saxena
Posted by Anshul Saxena
Remote, Hyderabad, Ahmedabad, Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Kolkata, Indore, Bhopal, Kochi (Cochin), Chennai, Bengaluru (Bangalore), Pune
3 - 5 yrs
Best in industry
Java
J2EE
Spring Boot
Hibernate (Java)
Kotlin
+3 more

Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary. A seven-time Inc. 5000 Fastest Growing Company, recently named to the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work three times.


You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.


Required Experience:

  • Minimum of a Bachelor’s degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field.
  • Experience working with, and a strong understanding of, object-oriented programming and cloud technologies.
  • End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries.
  • Strong experience with unit and integration testing of Spring Boot APIs.
  • Strong understanding and production experience of RESTful APIs and microservice architecture.
  • Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them.

Nice to have (but not required):

  • Exposure to Kotlin or other JVM programming languages
  • Strong understanding and production experience working with Docker container environments
  • Strong understanding and production experience working with Kafka
  • Cloud Environments: AWS, GCP or Azure
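For context on the kind of Kafka integration this stack implies, here is a minimal, hypothetical Python sketch (not Egen's code; Kafka's real Java client uses murmur2 hashing, md5 here is only illustrative) of how keyed partitioning keeps all events for one key on one partition, which is what gives a microservice per-key ordering:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.
    Kafka's default partitioner hashes the key; md5 stands in here."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events for the same order id land on the same partition,
# so a single consumer sees them in publish order.
p1 = partition_for("order-42", 6)
p2 = partition_for("order-42", 6)
assert p1 == p2
```

The same idea explains a common interview point: changing the partition count reshuffles which partition a key maps to, which is why topics are usually sized up front.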


Mobile Programming LLC

Sukhdeep Singh
Posted by Sukhdeep Singh
Chennai
4 - 7 yrs
₹13L - ₹15L / yr
Data Analytics
Data Visualization
PowerBI
Tableau
Qlikview
+10 more

Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16 - 18 LPA

Responsibilities:

  • Parse data using Python and create dashboards in Tableau.
  • Use Jenkins for Airflow pipeline creation and CI/CD maintenance.
  • Migrate DataStage jobs to Snowflake and optimize performance.
  • Work with HDFS, Hive, Kafka, and basic Spark.
  • Develop Python scripts for data parsing, quality checks, and visualization.
  • Conduct unit testing and web application testing.
  • Implement Apache Airflow and handle production migration.
  • Apply data warehousing techniques for data cleansing and dimension modeling.
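Since the role pairs Airflow with CI/CD, a rough illustration of the core idea may help: Airflow runs tasks in dependency order over a DAG. The toy Python sketch below (task names are invented, not this employer's pipeline) shows that ordering via Kahn's algorithm:

```python
from collections import deque

def topo_order(deps):
    """deps: {task: set of upstream tasks} -> list of tasks in runnable order."""
    indegree = {t: len(up) for t, up in deps.items()}
    downstream = {t: [] for t in deps}
    for task, upstream in deps.items():
        for u in upstream:
            downstream[u].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for d in downstream[t]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    return order

# e.g. extract -> parse -> (quality check, load) -> dashboard
dag = {
    "extract": set(),
    "parse": {"extract"},
    "quality_check": {"parse"},
    "load": {"parse"},
    "dashboard": {"load", "quality_check"},
}
order = topo_order(dag)
```

A real Airflow DAG declares the same dependencies with operators and `>>`; the scheduler then resolves exactly this kind of ordering.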

Requirements:

  • 4+ years of experience as a Platform Engineer.
  • Strong Python skills, knowledge of Tableau.
  • Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
  • Proficient in Unix Shell Scripting and SQL.
  • Familiarity with ETL tools like DataStage and DMExpress.
  • Understanding of Apache Airflow.
  • Strong problem-solving and communication skills.

Note: Only candidates willing to work in Chennai and available for immediate joining will be considered. The budget for this position is 16 - 18 LPA.

Bengaluru (Bangalore), Hyderabad, Pune, Chennai, Jaipur
10 - 14 yrs
₹1L - ₹15L / yr
Ant
Maven
CI/CD
Jenkins
GitHub
+16 more

DevOps Architect 

Experience: 10 - 12+ years of relevant DevOps experience
Locations: Bangalore, Chennai, Pune, Hyderabad, Jaipur

Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering, or equivalent is required.
• Certifications in specific areas are desirable.

Technical Skillset (skill and proficiency level):

  • Build tools (Ant or Maven) - Expert
  • CI/CD tools (Jenkins or GitHub CI/CD) - Expert
  • Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
  • Infrastructure as Code (Terraform, Helm charts, etc.) - Expert
  • Containerization (Docker, Docker Registry) - Expert
  • Scripting (Linux) - Expert
  • Cluster deployment (Kubernetes) and maintenance - Expert
  • Programming (Java) - Intermediate
  • Application types for DevOps (streaming, like Spark and Kafka; big data, like Hadoop; etc.) - Expert
  • Artifactory (JFrog) - Expert
  • Monitoring and reporting (Prometheus, Grafana, PagerDuty, etc.) - Expert
  • Ansible, MySQL, PostgreSQL - Intermediate


• Source Control (like Git, Bitbucket, SVN, VSTS, etc.)
• Continuous Integration (like Jenkins, Bamboo, VSTS)
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)

Roles and Responsibilities

• A DevOps architect should automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing, and streamlining DevOps practices.
• Establishing a continuous-build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining, and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to lead teams with a leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc.)
• Experience with version control systems is a must (Git an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills; knowledgeable about the latest industry trends and highly innovative
codersbrain

Tanuj Uppal
Posted by Tanuj Uppal
Bengaluru (Bangalore), Chennai, Delhi, Mumbai
5 - 10 yrs
₹1L - ₹10L / yr
Apache Kafka
Spring Boot
Microservices
Kubernetes
Kafka
Job Position: Kafka Developer
Relevant Experience: 5+ Years
Payroll Company:  Codersbrain Technology Pvt. Ltd.
Location: 
PAN India
Notice Period: Immediate to 15 Days.
Client: IBM
 
Description:
Total Years of Experience: 5+ yrs
Relevant Years of Experience: 5+ yrs
Mandatory Skills for screening (limit to top 5 and include version): Kafka
Good to have (not mandatory): OpenShift experience; Kafka troubleshooting skills
Detailed Job Description: A Kafka Developer should have:
  • 4 to 5 years of development experience using Confluent Kafka
  • 4 to 5 years of experience developing microservices using Spring Boot and Kafka
  • Strong experience developing CI/CD for Spring Boot applications and deploying them in a Kubernetes environment; Kubernetes experience is a MUST
  • Experience using MQ and Oracle source and sink connectors
  • Experience in Kafka performance testing
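As background on why Kubernetes experience matters for Kafka consumers: a topic's partitions are divided among the consumers in a group, so adding pods adds parallelism up to the partition count. A hedged, toy Python sketch (invented pod names, not IBM or Confluent code) of a round-robin style assignment, similar in spirit to Kafka's RoundRobinAssignor:

```python
def assign_partitions(partitions, consumers):
    """Spread partition ids across consumers round-robin."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# 6 partitions across 3 consumer pods: each owns a disjoint subset,
# and together they cover the whole topic.
a = assign_partitions(range(6), ["pod-a", "pod-b", "pod-c"])
```

When a pod joins or leaves, the group rebalances and this assignment is recomputed, which is why consumers must tolerate partition revocation.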
DFCS Technologies
Agency job
via dfcs Technologies by SheikDawood Ali
Remote, Chennai, Anywhere India
1 - 5 yrs
₹9L - ₹14L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
    • Experience with big data tools: Hadoop, Spark, Kafka, etc.
    • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
    • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
    • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
    • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
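For candidates newer to the stream-processing systems named above (Storm, Spark Streaming), the basic operation is aggregating events into time windows. A self-contained toy Python sketch, with invented event data, of tumbling event-time windows:

```python
from collections import defaultdict

def tumbling_counts(events, window_ms):
    """events: iterable of (timestamp_ms, key) pairs.
    Returns {(window_start_ms, key): count} for tumbling windows."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # bucket by window
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(10, "click"), (950, "click"), (1200, "click"), (1700, "view")]
counts = tumbling_counts(events, window_ms=1000)
# two clicks fall in window [0, 1000); one click and one view in [1000, 2000)
```

Real engines add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.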
Chennai
3 - 7 yrs
₹10L - ₹15L / yr
API
JSON
Apache Kafka
Agile/Scrum
Sonar
+1 more
Role and Responsibilities
  • Analyzes, designs, develops, codes and implements programs in one or more programming languages, for Web and Rich Internet Applications.
  • Supports applications with an understanding of system integration, test planning, scripting, and troubleshooting.
  • Assesses the health and performance of software applications and databases.
  • Establishes, participates, and maintains relationships with business units, customers and subject matter experts in order to remain apprised of direction, project status, architectural and technology trends, risks, and functional/integration issues.
  • Defines specifications and develop programs, modifies existing programs, prepares test data, and prepares functional specifications.
  • Analyzes program and application performance using various programming languages, tools and techniques.
  • Provides guidance to non-technical staff in using software and hardware systems most effectively and efficiently.
  • Reviews project proposals, evaluates alternatives, provides estimates and makes recommendations.
  • Designs and defines specifications for systems.
  • Identifies potential process improvement areas, suggests options, and recommends approaches.
Candidate Profile
  • Knowledgeable in software development and design patterns
  • Swagger, RabbitMQ, Kafka
  • Good API skills, with technologies such as REST web services and Spring-based technologies
  • Good knowledge of container-based application configuration and deployment; preferred environment is OpenShift
  • Experience creating unit tests using JUnit
  • Experience with markup languages such as JSON and YAML
  • Experience using quality and security scan tools such as Sonar and Fortify
  • Experience with Agile methodology
  • 7-10 years of experience in software development
Location: Chennai 
 

 
Chennai
5 - 13 yrs
₹9L - ₹28L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more
  • Demonstrable experience owning and developing big data solutions, using Hadoop, Hive/HBase, Spark, Databricks, ETL/ELT, for 5+ years
  • 10+ years of Information Technology experience, preferably with telecom / wireless service providers
  • Experience designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability, and releasability; rapid prototyping, data modeling, and decentralized innovation
  • DataOps mindset: allowing the architecture of a system to evolve continuously over time, while simultaneously supporting the needs of current users
  • Create and maintain the Architectural Runway and non-functional requirements
  • Design for the Continuous Delivery Pipeline (CI/CD data pipeline) and enable built-in quality and security from the start
  • Able to demonstrate an understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, Zachman Architecture Framework, etc.
  • The ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers
  • Demonstrated ability to work collaboratively
  • Excellent written, verbal, and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.)
  • Ability to work in a fast-paced, multiple-project environment on an independent basis and with minimal supervision
  • Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems

NSEIT
Mumbai, Chennai
3 - 6 yrs
₹10L - ₹24L / yr
skill iconReact.js
skill iconJava
skill iconJavascript
Fullstack Developer
IBM WebSphere MQ
+9 more
FSD Software Developer

BDD

Competence Requirement:

1. 3+ years of experience in developing backend Java applications.

2. Experience with Java 11 is good to have.

3. Experience in front-end development is desired.

4. A self-driven attitude along with a sense of structure and creativeness.

5. Excellent written and spoken English.

6. Bachelor's degree in computer science, information technology, or software engineering, or equivalent

7. Hands-on knowledge and experience of developing financial systems and understanding of financial concepts.



Responsibilities:

1. Write high quality code that solves difficult problems in a highly distributed system with extreme demands on resilience and quality.

2. Perform sufficient tests to ensure at least 80% code coverage.

3. Participate in and contribute to scrum ceremonies, e.g. daily stand-ups, sprint planning, demos and retros.

4. Will be involved in several stages of the product life cycle; design, implementation and testing. At times, also release and deployment.

5. Participate in design discussions and decisions.



Good to have skills:

1. Primary skills: Java 8, Spring Boot, React, MQ/messaging services & APIs (Java 11, ReactiveX, REST, Swagger/OpenAPI, React/Redux, Gradle, Git, Bitbucket, Jenkins)

2. High performance transactional platform

3. Back-end development and Middleware

4. Modern UI based on React

5. Continuous delivery and automation

6. Domain: Capital Markets or Investment Banking is good; BFSI is OK
VIMANA

Loshy Chandran
Posted by Loshy Chandran
Remote, Chennai
2 - 5 yrs
₹10L - ₹20L / yr
Data engineering
Data Engineer
Apache Kafka
Big Data
Java
+4 more

We are looking for passionate, talented and super-smart engineers to join our product development team. If you are someone who innovates, loves solving hard problems, and enjoys end-to-end product development, then this job is for you! You will be working with some of the best developers in the industry in a self-organising, agile environment where talent is valued over job title or years of experience.

 

Responsibilities:

  • You will be involved in end-to-end development of VIMANA technology, adhering to our development practices and expected quality standards.
  • You will be part of a highly collaborative Agile team which passionately follows SAFe Agile practices, including pair-programming, PR reviews, TDD, and Continuous Integration/Delivery (CI/CD).
  • You will be working with cutting-edge technologies and tools for stream processing using Java, NodeJS and Python, using frameworks like Spring, RxJS etc.
  • You will be leveraging big data technologies like Kafka, Elasticsearch and Spark, processing more than 10 Billion events per day to build a maintainable system at scale.
  • You will be building Domain Driven APIs as part of a micro-service architecture.
  • You will be part of a DevOps culture where you will get to work with production systems, including operations, deployment, and maintenance.
  • You will have an opportunity to continuously grow and build your capabilities, learning new technologies, languages, and platforms.

 

Requirements:

  • Undergraduate degree in Computer Science or a related field, or equivalent practical experience.
  • 2 to 5 years of product development experience.
  • Experience building applications using Java, NodeJS, or Python.
  • Deep knowledge in Object-Oriented Design Principles, Data Structures, Dependency Management, and Algorithms.
  • Working knowledge of message queuing, stream processing, and highly scalable Big Data technologies.
  • Experience in working with Agile software methodologies (XP, Scrum, Kanban), TDD and Continuous Integration (CI/CD).
  • Experience using NoSQL databases like MongoDB or Elasticsearch.
  • Prior experience with container orchestrators like Kubernetes is a plus.
About VIMANA

We build products and platforms for the Industrial Internet of Things. Our technology is being used around the world in mission-critical applications - from improving the performance of manufacturing plants, to making electric vehicles safer and more efficient, to making industrial equipment smarter.

Please visit https://govimana.com/ to learn more about what we do.

Why Explore a Career at VIMANA
  • We recognize that our dedicated team members make us successful and we offer competitive salaries.
  • We are a workplace that values work-life balance, provides flexible working hours, and full time remote work options.
  • You will be part of a team that is highly motivated to learn and work on cutting edge technologies, tools, and development practices.
  • Bon Appetit! Enjoy catered breakfasts, lunches and free snacks!

VIMANA Interview Process
We usually target to complete all the interviews within a week and provide prompt feedback to the candidate. As of now, all interviews are conducted online due to the COVID situation.

1. Telephonic screening (30 min)

A 30-minute telephonic interview to understand and evaluate the candidate's fit with the job role and the company.
Clarify any queries regarding the job/company.
Give an overview of further interview rounds.

2. Technical Rounds

This would be a deep technical round to evaluate the candidate's technical capability pertaining to the job role.

3. HR Round

The candidate's team and cultural fit will be evaluated during this round.

We would proceed with releasing the offer if the candidate clears all the above rounds.

Note: In certain cases, we might schedule additional rounds if needed before releasing the offer.
Chennai
3 - 5 yrs
₹5L - ₹10L / yr
Big Data
Hadoop
Apache Kafka
Apache Hive
Microsoft Windows Azure
+1 more

Client: An IT services major, hiring for a leading insurance player.

 

 

Position: SENIOR CONSULTANT

 

Job Description:

 

  • Azure admin - senior consultant with HDInsight (Big Data)

 

Skills and Experience

 

  • Microsoft Azure Administrator certification
  • Big Data project experience on the Azure HDInsight stack, with big data processing frameworks such as Spark, Hadoop, Hive, Kafka, or HBase.
  • Preferred: insurance or BFSI domain experience
  • 5 years of experience is required.
digital india payments limited
Bhavani Pendyala
Posted by Bhavani Pendyala
Chennai, Hyderabad
3 - 7 yrs
₹4L - ₹12L / yr
Java
NodeJS (Node.js)
Fullstack Developer
React.js
Redux/Flux
+13 more
Technology Requirements:
  1. Extensive experience in Javascript / NodeJS in the back end
  2. Front-end frameworks such as Bootstrap, Pug, jQuery
  3. Experience in web frameworks like ExpressJS, Webpack
  4. Experience in Nginx, Redis, Apache Kafka and MQTT
  5. Experience with MongoDB
  6. Experience with Version Control Systems like Git / Mercurial
  7. Sound knowledge in Software engineering best practices
  8. Sound knowledge of RESTful API design
  9. Working knowledge of Automated testing tools
  10. Experience in maintaining production servers (Optional)
  11. Experience with Azure DevOps (Optional)
Soft Skills:
  1. Experience in digital payments or financial services industry is a plus.
  2. Participation in strategic project-planning meetings.
  3. Be involved and participate in the overall application lifecycle.
  4. Collaborate with External Development Teams.
  5. Define and communicate technical and design requirements, understand workflows, and write code as per requirements.
  6. Develop functional and sustainable web applications with clean code.
  7. Focus on coding and debugging.
Remote, Bengaluru (Bangalore), Chennai, Hyderabad, Mumbai, Pune
3 - 8 yrs
₹5L - ₹17L / yr
Java
Spring Boot
Apache Kafka
MySQL
java
+1 more

Software Development Engineer:

Major Responsibilities:

  • Translation of complex functional requirements into technical requirements, implementing and maintaining a coherent and progressive development strategy for our product line
  • Design, develop, and maintain complex systems using best-of-breed development practices and technology.
  • Responsible for the overall software development life cycle.
  • Delivery of high-quality, scalable, and extensible systems and applications, on time and on budget.
  • Adoption and evolution of software engineering practices and tools within the organization
  • Keep in sync with the latest technology developments and open-source offerings. Evaluate and adopt them to solve business problems of the organization.
  • Collaborate with other technology and business teams within the organization to provide efficient robust solutions to the problems.
  • Drive and manage the bug triage process
  • Report on status of product delivery and quality to management, customer support and product teams.

Desired Skills

  • Strong programming, debugging, and problem-solving skills
  • Strong understanding of data structures and algorithms
  • Sound understanding of object-oriented programming and excellent software design skills.
  • Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
  • Strong understanding of database design and SQL (MySQL/MariaDB) development
  • Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra, or any other NoSQL database
  • Knowledge of design patterns; experience with large-scale applications is good to have
  • Should have experience with Apache Kafka, RabbitMQ, or other queueing systems.
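On the queueing-systems requirement above: the key idea behind a log-based broker like Kafka, versus a simple work queue, is that each consumer keeps its own offset into an append-only log, so independent services can each read (or replay) the full stream. A minimal in-memory Python sketch, purely illustrative and not any broker's actual API:

```python
class TopicLog:
    """Toy append-only log with per-consumer offsets, Kafka-style."""

    def __init__(self):
        self.log = []          # the append-only record log
        self.offsets = {}      # consumer name -> next index to read

    def publish(self, msg):
        self.log.append(msg)

    def poll(self, consumer, max_records=10):
        start = self.offsets.get(consumer, 0)
        batch = self.log[start:start + max_records]
        self.offsets[consumer] = start + len(batch)  # "commit" after read
        return batch

t = TopicLog()
t.publish("order-created")
t.publish("order-paid")
# two independent consumers each see the full stream
billing = t.poll("billing")
audit = t.poll("audit")
```

In a queue like RabbitMQ (classic queues), a delivered message is gone for other consumers; here both "billing" and "audit" read everything, which is what makes log-based systems suited to event-driven microservices.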

Ideal Experience

  • 3 to 8 years of industry experience.
  • Bachelor's or Master's degree in Computer Science / IT
  • Drive discussions to create/improve product, process and technology
  • Provide end to end solution and design details
  • Lead development of formalized solution methodologies
  • Passion to work in a startup-like environment

Personal Characteristics

  • Passion and commitment
  • Strong and excellent software design intellect
  • High integrity
  • Self-starter
Maveric Systems

Rashmi Poovaiah
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and the role will play an important part in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience and 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL and Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
  • Proven experience coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of SCRUM discussions and take requirements
  • Adhere to the SCRUM timeline and deliver accordingly
  • Participate in a team environment for design, development, and implementation
  • Should take on L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g. BA team, testing and business teams) and any other relevant stakeholders
Retail Marketing
Chennai
4 - 9 yrs
₹1L - ₹12L / yr
Java
Data Structures
Algorithms
C++
Apache Kafka
+10 more

Requires a bachelor's degree in area of specialty and experience in the field or in a related area. Familiar with standard concepts, practices, and procedures within a particular field. Relies on experience and judgment to plan and accomplish goals. Performs a variety of tasks. A degree of creativity and latitude is required. Typically reports to a supervisor or manager.

Designs, develops, and implements web-based Java applications to support business requirements. Follows approved life cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.

 

Additional Job Details:

Strong in Java, Spring, Spring Boot, REST, and developing microservices.

Knowledge of or experience with Cassandra preferred.

Knowledge of or experience with Kafka is good to have, but not a must.

 

Good to know:

Reporting tools like Splunk/Grafana

Protobuf

Python/Ruby

Lymbyc

Venky Thiriveedhi
Posted by Venky Thiriveedhi
Bengaluru (Bangalore), Chennai
3 - 5 yrs
₹6L - ₹8L / yr
Microservices
Java
Apache Kafka
  • 3+ years of experience building complex, highly scalable, high-volume, low-latency enterprise applications using languages such as Java, NodeJS, Go and/or Scala
  • Strong experience building microservices using technologies like Spring Boot, Spring Cloud, Netflix OSS, Zuul
  • Deep understanding of microservices design patterns, service registry and discovery, and externalization of configuration
  • Experience with message streaming and processing technologies such as Kafka, Spark, Storm, gRPC, or other equivalent technologies
  • Experience with one or more reactive microservice tools and techniques such as Akka, Vert.x, ReactiveX
  • Strong experience in creation, management, and consumption of REST APIs leveraging Swagger, Postman, API gateways (such as MuleSoft, Apigee), etc.
  • Strong knowledge of data modelling, querying, and performance tuning of any big-data stores (MongoDB, Elasticsearch, Redis, etc.) and/or any RDBMS (Oracle, PostgreSQL, MySQL, etc.)
  • Experience working with Agile/Scrum-based teams that utilize Continuous Integration/Continuous Delivery processes using Git, Maven, Jenkins, etc.
  • Experience with container-based (Docker/Kubernetes) deployment and management
  • Experience using AWS/GCP/Azure-based cloud infrastructure
  • Knowledge of Test-Driven Development and test automation skills with JUnit/TestNG
  • Knowledge of security frameworks, concepts, and technologies like Spring Security, OAuth2, SAML, SSO, Identity and Access Management
Lymbyc

Venky Thiriveedhi
Posted by Venky Thiriveedhi
Bengaluru (Bangalore), Chennai
4 - 8 yrs
₹9L - ₹14L / yr
Apache Spark
Apache Kafka
Druid Database
Big Data
Apache Sqoop
+5 more
Key skill set: Apache NiFi, Kafka Connect (Confluent), Sqoop, Kylo, Spark, Druid, Presto, RESTful services, Lambda/Kappa architectures

Responsibilities:
  • Build a scalable, reliable, operable, and performant big data platform for both streaming and batch analytics
  • Design and implement data aggregation, cleansing, and transformation layers

Skills:
  • Around 4+ years of hands-on experience designing and operating large data platforms
  • Experience in Big Data ingestion, transformation, and stream/batch processing technologies using Apache NiFi, Apache Kafka, Kafka Connect (Confluent), Sqoop, Spark, Storm, Hive, etc.
  • Experience designing and building streaming data platforms in Lambda and Kappa architectures
  • Should have working experience with one of the NoSQL/OLAP data stores like Druid, Cassandra, Elasticsearch, Pinot, etc.
  • Experience with one of the data warehousing tools like Redshift, BigQuery, Azure SQL Data Warehouse
  • Exposure to other data ingestion, data lake, and querying frameworks like Marmaray, Kylo, Drill, Presto
  • Experience designing and consuming microservices
  • Exposure to security and governance tools like Apache Ranger, Apache Atlas
  • Any contributions to open source projects are a plus
  • Experience with performance benchmarks will be a plus
GeakMinds Technologies Pvt Ltd
John Richardson
Posted by John Richardson
Chennai
1 - 5 yrs
₹1L - ₹6L / yr
Hadoop
Big Data
HDFS
Apache Sqoop
Apache Flume
+2 more
  • Looking for a Big Data Engineer with 3+ years of experience.
  • Hands-on experience with MapReduce-based platforms, like Pig, Spark, Shark.
  • Hands-on experience with data pipeline tools like Kafka, Storm, Spark Streaming.
  • Store and query data with Sqoop, Hive, MySQL, HBase, Cassandra, MongoDB, Drill, Phoenix, and Presto.
  • Hands-on experience managing Big Data on a cluster with HDFS and MapReduce.
  • Handle streaming data in real time with Kafka, Flume, Spark Streaming, Flink, and Storm.
  • Experience with Azure cloud, Cognitive Services, and Databricks is preferred.
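Since this listing centers on MapReduce-based platforms, a compact illustration of the model itself may be useful (pure Python, not any specific framework's API): map emits key-value pairs, shuffle groups them by key, and reduce aggregates each group.

```python
from collections import defaultdict

def map_phase(lines):
    # emit (word, 1) for every word, the classic word-count mapper
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # group values by key; in a real cluster this is the network shuffle
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # aggregate each key's values
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data big cluster", "data pipeline"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

Pig and Spark express the same three stages at a higher level (e.g. Spark's `flatMap`/`reduceByKey`), which is why this mental model transfers across the tools in the listing.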