17+ Apache Kafka Jobs in Chennai
Apply to 17+ Apache Kafka jobs in Chennai on CutShort.io. Explore the latest Apache Kafka job opportunities across top companies like Google, Amazon & Adobe.
Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts is trusted by top clients in pursuit of the extraordinary. Named to the Inc. 5000 list of fastest-growing companies seven times and recently to the Crain's Chicago Business Fast 50 list, Egen has also been recognized as a great place to work three times.
You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.
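As a rough sketch of what a cloud-native Spring Boot service integrating with Kafka can look like (assuming the spring-kafka library; the topic name, consumer group, and String payload are invented for illustration, not Egen's actual code):

```java
// Hypothetical Spring Boot microservice consuming from a Kafka topic.
// Topic name, group id, and payload type are illustrative assumptions.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class EventServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(EventServiceApplication.class, args);
    }
}

@Component
class EventConsumer {
    // Broker addresses and serializers come from application properties
    // (spring.kafka.*), so the same container image can be promoted
    // across Kubernetes environments unchanged.
    @KafkaListener(topics = "events", groupId = "event-service")
    public void onEvent(String payload) {
        System.out.println("Received: " + payload);
    }
}
```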
Required Experience:
- Minimum of a Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field.
- Working experience with, and a strong understanding of, object-oriented programming and cloud technologies
- End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
- Strong experience with unit and integration testing of Spring Boot APIs (see the test sketch after this list).
- Strong understanding and production experience of RESTful APIs and microservice architecture.
- Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them.
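To illustrate the testing expectation above, a minimal MockMvc sketch; the endpoint under test ("/api/status") and its expected response are hypothetical:

```java
// Hypothetical MockMvc test for a Spring Boot API endpoint.
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.web.servlet.MockMvc;

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

@SpringBootTest
@AutoConfigureMockMvc
class StatusApiTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void statusEndpointReturnsOk() throws Exception {
        // Exercises the controller through the full Spring MVC stack
        // without binding a real HTTP port.
        mockMvc.perform(get("/api/status"))
               .andExpect(status().isOk());
    }
}
```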
Nice to have (but not required):
- Exposure to Kotlin or other JVM programming languages
- Strong understanding and production experience working with Docker container environments
- Strong understanding and production experience working with Kafka
- Cloud Environments: AWS, GCP or Azure
Title: Platform Engineer
Location: Chennai
Work Mode: Hybrid (Remote and Chennai Office)
Experience: 4+ years
Budget: 16-18 LPA
Responsibilities:
- Parse data using Python and create dashboards in Tableau.
- Utilize Jenkins for Airflow pipeline creation and CI/CD maintenance.
- Migrate DataStage jobs to Snowflake and optimize performance.
- Work with HDFS, Hive, Kafka, and basic Spark.
- Develop Python scripts for data parsing, quality checks, and visualization.
- Conduct unit testing and web application testing.
- Implement Apache Airflow and handle production migration.
- Apply data warehousing techniques for data cleansing and dimensional modeling.
Requirements:
- 4+ years of experience as a Platform Engineer.
- Strong Python skills and knowledge of Tableau.
- Experience with Jenkins, Snowflake, HDFS, Hive, and Kafka.
- Proficient in Unix Shell Scripting and SQL.
- Familiarity with ETL tools like DataStage and DMExpress.
- Understanding of Apache Airflow.
- Strong problem-solving and communication skills.
Note: Only candidates willing to work in Chennai and available to join immediately will be considered. The budget for this position is 16-18 LPA.
at Altimetrik
DevOps Architect
Experience: 10-12+ years of relevant DevOps experience
Locations: Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor's or advanced degree in Computer Science, Software Engineering, or equivalent is required.
• Certifications in specific areas are desired.
Technical Skillset (skill - proficiency level):
- Build tools (Ant or Maven) - Expert
- CI/CD tools (Jenkins or GitHub CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, CodePipeline, etc.) or Azure DevOps - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (Linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, Google Cloud, OpenStack)
Roles and Responsibilities
• The DevOps architect should automate processes with the proper tools.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing, and streamlining DevOps practices.
• Establishing a continuous-build environment to accelerate software development and deployment processes.
• Engineering effective, general-purpose processes.
• Helping operations and development teams solve their problems.
• Supervising, examining, and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to manage teams with a leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies, along with configuration management practices in Unix- and Linux-based environments.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc.)
• Experience with version control systems a must (Git an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Ability to work independently and as part of a team.
• Exceptional communication skills; knowledgeable about the latest industry trends; highly innovative.
Relevant Experience: 5+ Years
Location: PAN India
Client: IBM
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (a minimal Kafka consumer sketch follows this list)
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
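To ground the message-queuing and stream-processing items above, a minimal plain-Java Kafka consumer loop; the topic, group id, and console output are hypothetical stand-ins for a real transform-and-load step:

```java
// Minimal Kafka consumer poll loop; topic and group id are assumptions.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PipelineConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "pipeline-consumers");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("clickstream"));
            while (true) {
                // Fetch a batch of records; a real pipeline would transform
                // these and load them into the warehouse.
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```

Consumers in the same group split a topic's partitions between them, which is what makes this pattern horizontally scalable.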
Our client is in the field of IT services and IT consulting.
- Analyzes, designs, develops, codes and implements programs in one or more programming languages, for Web and Rich Internet Applications.
- Supports applications with an understanding of system integration, test planning, scripting, and troubleshooting.
- Assesses the health and performance of software applications and databases.
- Establishes, participates, and maintains relationships with business units, customers and subject matter experts in order to remain apprised of direction, project status, architectural and technology trends, risks, and functional/integration issues.
- Defines specifications, develops new programs, modifies existing programs, prepares test data, and prepares functional specifications.
- Analyzes program and application performance using various programming languages, tools and techniques.
- Provides guidance to non-technical staff in using software and hardware systems most effectively and efficiently.
- Reviews project proposals, evaluates alternatives, provides estimates and makes recommendations.
- Designs and defines specifications for systems.
- Identifies potential process improvement areas, suggests options, and recommends approaches.
- Knowledgeable in software development and design patterns
- Swagger, RabbitMQ, Kafka
- Good API skills, with technologies such as REST web services and Spring-based frameworks (see the controller sketch after this list)
- Good knowledge of container-based application configuration and deployment; the preferred environment is OpenShift
- Experience creating unit tests using JUnit
- Experience with data formats such as JSON and YAML
- Experience using quality and security scanning tools such as Sonar and Fortify
- Experience with Agile methodology
- 7-10 years of experience in software development.
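A small sketch tying together the REST and Swagger items above, assuming Spring MVC with swagger-core v3 (OpenAPI) annotations; the /orders path and its behavior are invented for illustration:

```java
// Hypothetical REST controller documented with OpenAPI (Swagger) annotations.
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
@Tag(name = "Orders", description = "Order lookup API")
public class OrderController {

    @Operation(summary = "Fetch the status of a single order by id")
    @GetMapping("/orders/{id}")
    public String getOrderStatus(@PathVariable String id) {
        // A real service would consult a repository or downstream system.
        return "PLACED";
    }
}
```

With springdoc-openapi on the classpath, annotations like these feed the generated Swagger UI.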
· 10+ years of Information Technology experience, preferably with telecom/wireless service providers.
· Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability, and releasability; rapid prototyping, data modeling, and decentralized innovation.
· Demonstrated understanding, and ideally use, of at least one recognized architecture framework or standard, e.g. TOGAF, the Zachman Architecture Framework, etc.
· The ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers.
· Demonstrated ability to work collaboratively.
· Excellent written, verbal, and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.).
· Ability to work in a fast-paced, multiple-project environment independently and with minimal supervision.
· Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems
at NSEIT
BDD
Competence Requirement:
1. 3+ years of experience in developing backend Java applications.
2. Experience with Java 11 is good to have.
3. Experience in front-end development is desired.
4. A self-driven attitude along with a sense of structure and creativeness.
5. Excellent written and spoken English.
6. Bachelor's degree in computer science, information technology, or software engineering, or equivalent
7. Hands-on knowledge and experience of developing financial systems and understanding of financial concepts.
Responsibilities:
1. Write high quality code that solves difficult problems in a highly distributed system with extreme demands on resilience and quality.
2. Perform sufficient tests to ensure at least 80% code coverage.
3. Participate in and contribute to scrum ceremonies, e.g. daily stand-ups, sprint planning, demos and retros.
4. Be involved in several stages of the product life cycle: design, implementation, and testing, and at times release and deployment.
5. Participate in design discussions and decisions.
Good to have skills:
1. Primary skills - Java 8, Spring Boot, React, MQ/messaging services & APIs (Java 11, ReactiveX, REST, Swagger/OpenAPI, React/Redux, Gradle, Git, Bitbucket, Jenkins); see the ReactiveX sketch after this list
2. High performance transactional platform
3. Back-end development and Middleware
4. Modern UI based on React
5. Continuous delivery and automation
6. Domain - capital markets or investment banking preferred; BFSI acceptable
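For the ReactiveX item above, a minimal RxJava 3 sketch; the price-tick stream and threshold are hypothetical, chosen to echo the capital-markets domain:

```java
// Minimal RxJava 3 pipeline: filter and format a stream of price ticks.
// The tick values and threshold are illustrative assumptions.
import io.reactivex.rxjava3.core.Observable;

public class TickStream {
    public static void main(String[] args) {
        Observable.just(101.5, 99.8, 102.3, 98.1)
                // Keep only ticks above the alert threshold...
                .filter(price -> price > 100.0)
                // ...and format them for a downstream consumer.
                .map(price -> String.format("ALERT: %.2f", price))
                .subscribe(System.out::println);
    }
}
```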
We are looking for passionate, talented and super-smart engineers to join our product development team. If you are someone who innovates, loves solving hard problems, and enjoys end-to-end product development, then this job is for you! You will be working with some of the best developers in the industry in a self-organising, agile environment where talent is valued over job title or years of experience.
Responsibilities:
- You will be involved in end-to-end development of VIMANA technology, adhering to our development practices and expected quality standards.
- You will be part of a highly collaborative Agile team which passionately follows SAFe Agile practices, including pair-programming, PR reviews, TDD, and Continuous Integration/Delivery (CI/CD).
- You will be working with cutting-edge technologies and tools for stream processing using Java, NodeJS, and Python, with frameworks like Spring and RxJS.
- You will be leveraging big data technologies like Kafka, Elasticsearch, and Spark, processing more than 10 billion events per day to build a maintainable system at scale (see the sketch after this list).
- You will be building Domain Driven APIs as part of a micro-service architecture.
- You will be part of a DevOps culture where you will get to work with production systems, including operations, deployment, and maintenance.
- You will have an opportunity to continuously grow and build your capabilities, learning new technologies, languages, and platforms.
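As one sketch of the kind of Kafka-based stream processing described above (not VIMANA's actual topology; the topic names, application id, and per-device count are assumptions), a minimal Kafka Streams job:

```java
// Hypothetical Kafka Streams topology counting events per device key.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class EventCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("device-events");
        events.groupByKey()
              .count()                        // running count per device key
              .toStream()
              .mapValues(count -> count.toString())
              .to("device-event-counts");     // publish counts to an output topic

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Because state is partitioned by key, instances of such a job can be added or removed to track throughput on the order of billions of events per day.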
Requirements:
- Undergraduate degree in Computer Science or a related field, or equivalent practical experience.
- 2 to 5 years of product development experience.
- Experience building applications using Java, NodeJS, or Python.
- Deep knowledge in Object-Oriented Design Principles, Data Structures, Dependency Management, and Algorithms.
- Working knowledge of message queuing, stream processing, and highly scalable Big Data technologies.
- Experience in working with Agile software methodologies (XP, Scrum, Kanban), TDD and Continuous Integration (CI/CD).
- Experience using NoSQL databases like MongoDB or Elasticsearch.
- Prior experience with container orchestrators like Kubernetes is a plus.
We build products and platforms for the Industrial Internet of Things. Our technology is being used around the world in mission-critical applications - from improving the performance of manufacturing plants, to making electric vehicles safer and more efficient, to making industrial equipment smarter.
Please visit https://govimana.com/ to learn more about what we do.
Why Explore a Career at VIMANA
- We recognize that our dedicated team members make us successful and we offer competitive salaries.
- We are a workplace that values work-life balance and provides flexible working hours and full-time remote work options.
- You will be part of a team that is highly motivated to learn and work on cutting edge technologies, tools, and development practices.
- Bon Appetit! Enjoy catered breakfasts, lunches and free snacks!
VIMANA Interview Process
We usually aim to complete all interviews within a week and provide prompt feedback to the candidate. At present, all interviews are conducted online due to the COVID situation.
1. Telephonic screening (30 min)
A 30-minute telephonic interview to understand and evaluate the candidate's fit with the job role and the company.
Clarify any queries regarding the job/company.
Give an overview of further interview rounds.
2. Technical Rounds
This is a deep technical round to evaluate the candidate's technical capability pertaining to the job role.
3. HR Round
The candidate's team and cultural fit will be evaluated during this round.
We would proceed with releasing the offer if the candidate clears all the above rounds.
Note: In certain cases, we might schedule additional rounds if needed before releasing the offer.
Client: An IT Services Major, hiring for a leading insurance player.
Position: SENIOR CONSULTANT
Job Description:
- Azure Admin - Senior Consultant with HDInsight (Big Data)
Skills and Experience
- Microsoft Azure Administrator certification
- Big data project experience with the Azure HDInsight stack and big data processing frameworks such as Spark, Hadoop, Hive, Kafka, or HBase.
- Preferred: Insurance or BFSI domain experience
- 5 years of experience is required.
- Extensive experience in JavaScript/NodeJS on the back end
- Front-end frameworks such as Bootstrap, Pug, and jQuery
- Experience with web frameworks and tooling like ExpressJS and Webpack
- Experience in Nginx, Redis, Apache Kafka and MQTT
- Experience with MongoDB
- Experience with Version Control Systems like Git / Mercurial
- Sound knowledge of software engineering best practices
- Sound knowledge of RESTful API design
- Working knowledge of Automated testing tools
- Experience in maintaining production servers (Optional)
- Experience with Azure DevOps (Optional)
- Experience in digital payments or financial services industry is a plus.
- Participate in strategic project-planning meetings.
- Be involved and participate in the overall application lifecycle.
- Collaborate with External Development Teams.
- Define and communicate technical and design requirements, understand workflows, and write code as per requirements.
- Develop functional and sustainable web applications with clean code.
- Focus on coding and debugging.
Online ENT Healthcare giant in India
Software Development Engineer:
Major Responsibilities:
- Translate complex functional requirements into technical requirements; implement and maintain a coherent, progressive development strategy for our product line.
- Design, develop, and maintain complex systems using best-of-breed development practices and technology.
- Responsible for the overall software development life cycle.
- Deliver high-quality, scalable, and extensible systems and applications on time and on budget.
- Drive adoption and evolution of software engineering practices and tools within the organization.
- Keep in sync with the latest technology developments and open-source offerings; evaluate and adopt them to solve business problems of the organization.
- Collaborate with other technology and business teams within the organization to provide efficient, robust solutions to problems.
- Drive and manage the bug triage process
- Report on status of product delivery and quality to management, customer support and product teams.
Desired Skills
- Strong programming, debugging, and problem-solving skills
- Strong understanding of data structures and algorithms
- Sound understanding of object-oriented programming and excellent software design skills.
- Good experience with SOA/microservices/RESTful services and development of N-tier J2EE / Java Spring Boot applications (APIs).
- Strong understanding of database design and SQL (MySQL/MariaDB) development
- Good to have knowledge of NoSQL technologies like MongoDB, Solr, Redis, Cassandra or any other NoSQL database
- Knowledge of design patterns and good to have experience of large-scale applications
- Should have experience with Apache Kafka, RabbitMQ, or other queueing systems (see the producer sketch after this list).
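For the queueing requirement above, a minimal plain-Java Kafka producer sketch; the topic name and payload are hypothetical:

```java
// Minimal Kafka producer; topic and message content are assumptions.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class NotificationProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Fire-and-forget send; production code would check the returned
            // Future or register a callback to handle delivery failures.
            producer.send(new ProducerRecord<>("notifications", "user-42", "order shipped"));
        }
    }
}
```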
Ideal Experience
- 3 to 8 years of industry experience.
- Bachelor's or Master's degree in Computer Science/IT
- Drive discussions to create/improve product, process and technology
- Provide end to end solution and design details
- Lead development of formalized solution methodologies
- Passion to work in a startup-like environment
Personal Characteristics
- Passion and commitment
- Excellent software design intellect
- High integrity
- Self-starter
Role Summary/Purpose:
We are looking for Developers/Senior Developers to be part of building an advanced analytics platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing, and challenging work environment, and this role will play an important part in resolving and influencing high-level decisions.
Requirements:
- The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
- Overall minimum of 4 to 8 years of software development experience, with 2 years of Data Warehousing domain knowledge.
- Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc. (see the sketch after this list)
- Excellent knowledge of SQL and Linux shell scripting.
- Bachelor's/Master's/Engineering degree from a well-reputed university.
- Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively.
- Proven experience in coordination of many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment.
- Ability to manage a diverse and challenging stakeholder community.
- Diverse knowledge and experience of working on Agile deliveries and Scrum teams.
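As a sketch of the Spark-plus-Kafka combination named above, a minimal Spark Structured Streaming job in Java; the topic name and console sink are illustrative assumptions:

```java
// Hypothetical Spark Structured Streaming job reading a Kafka topic.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class KafkaToConsole {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-console")
                .getOrCreate();

        // Read the topic as an unbounded DataFrame of records.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "events")
                .load();

        // Kafka values arrive as bytes; cast to string and print each micro-batch.
        events.selectExpr("CAST(value AS STRING) AS value")
              .writeStream()
              .format("console")
              .start()
              .awaitTermination();
    }
}
```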
Responsibilities
- Should work as a senior developer/individual contributor as the situation requires.
- Should be part of Scrum discussions and take requirements.
- Adhere to the Scrum timeline and deliver accordingly.
- Participate in a team environment for design, development, and implementation.
- Should take on L3 activities on a need basis.
- Prepare Unit/SIT/UAT test cases and log the results.
- Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time.
- Quality delivery and automation should be a top priority.
- Coordinate change and deployment in time.
- Should create healthy harmony within the team.
- Owns interaction points with members of the core team (e.g. BA, testing, and business teams) and any other relevant stakeholders.
Requires a bachelor's degree in area of specialty and experience in the field or in a related area. Familiar with standard concepts, practices, and procedures within a particular field. Relies on experience and judgment to plan and accomplish goals. Performs a variety of tasks. A degree of creativity and latitude is required. Typically reports to a supervisor or manager.
Designs, develops, and implements web-based Java applications to support business requirements. Follows approved life cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.
Additional Job Details:
Strong in Java, Spring, Spring Boot, REST, and developing microservices.
Knowledge of or experience with Cassandra preferred.
Knowledge of or experience with Kafka (good to have, but not a must).
Good to know:
Reporting tools like Splunk/Grafana
Protobuf
Python/Ruby