The candidate should have extensive experience in designing and developing scalable data pipelines and real-time data processing solutions. As a key member of the team, the Senior Data Engineer will play a critical role in building end-to-end data workflows, supporting machine learning model deployment, and driving MLOps practices in a fast-paced, agile environment. Strong expertise in Apache Kafka, Apache Flink, AWS SageMaker, and Terraform is essential. Additional experience with infrastructure automation and CI/CD for ML models is a significant advantage.
Key Responsibilities
- Design, develop, and maintain high-performance ETL and real-time data pipelines using Apache Kafka and Apache Flink.
- Build scalable and automated MLOps pipelines for training, validation, and deployment of models using AWS SageMaker and associated services.
- Implement and manage Infrastructure as Code (IaC) using Terraform to provision and manage AWS environments.
- Collaborate with data scientists, ML engineers, and DevOps teams to streamline model deployment workflows and ensure reliable production delivery.
- Optimize data storage and retrieval strategies for large-scale structured and unstructured datasets.
- Develop data transformation logic and integrate data from various internal and external sources into data lakes and warehouses.
- Monitor, troubleshoot, and enhance performance of data systems in a cloud-native, fast-evolving production setup.
- Ensure adherence to data governance, privacy, and security standards across all data handling activities.
- Document data engineering solutions and workflows to facilitate cross-functional understanding and ongoing maintenance.
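The pipeline work described above usually boils down to a consume–transform–produce loop: validate each incoming event, enrich it, and route bad records to a dead-letter topic. A minimal sketch of such a transform stage in TypeScript — the event shape, field names, and topic names are illustrative assumptions, not this role's actual schema:

```typescript
// Illustrative pipeline transform stage; the event shape, enrichment rules,
// and topic names below are assumptions for the sketch, not a real schema.

interface RawEvent {
  id: string;
  timestamp: string; // expected ISO-8601
  payload: Record<string, unknown>;
}

interface EnrichedEvent extends RawEvent {
  processedAt: string;
  valid: boolean;
}

// Validate and enrich a single event. Invalid records are flagged rather
// than dropped, so a downstream step can route them to a dead-letter topic.
function transformEvent(raw: RawEvent): EnrichedEvent {
  const valid = raw.id.length > 0 && !Number.isNaN(Date.parse(raw.timestamp));
  return { ...raw, processedAt: new Date().toISOString(), valid };
}

// In a real deployment this function would sit between a Kafka consumer and
// producer (e.g. via a kafkajs eachMessage handler or inside a Flink job):
//   const out = transformEvent(JSON.parse(message.value.toString()));
//   await producer.send({
//     topic: out.valid ? "events.clean" : "events.dlq",
//     messages: [{ value: JSON.stringify(out) }],
//   });
```

The flag-and-route approach keeps the transform pure and easy to test, while the dead-letter topic preserves malformed records for inspection instead of losing them.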
WHO WE ARE
We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.
We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.
We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.
We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.
WHAT YOU’LL BE DOING
As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.
Here’s what your day-to-day will look like:
- Designing and Building Integrations
- Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.
- Cloud-Native Development
- Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.
- Managing Data Pipelines
- Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.
- Client Collaboration
- Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.
- Driving Platform Evolution
- Contribute to the ongoing improvement of our iPaaS platform—enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.
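A recurring pattern in the integration work above is dispatching events from a bus like EventBridge to the right handler by type, with unknown types dead-lettered rather than silently dropped. A hedged sketch — the detail-types and handlers are hypothetical, not the platform's real API:

```typescript
// Hypothetical event-routing sketch for an integration service; the
// detail-types and handler bodies are illustrative assumptions.

interface BusEvent {
  "detail-type": string; // EventBridge-style type field
  detail: Record<string, unknown>;
}

type Handler = (detail: Record<string, unknown>) => string;

// Registry mapping event types to handlers. New integrations register here
// instead of adding branches to a growing if/else chain.
const handlers: Record<string, Handler> = {
  "order.created": (d) => `provision order ${d.orderId}`,
  "customer.updated": (d) => `sync customer ${d.customerId}`,
};

// Route an event to its handler; unknown types go to a dead-letter path
// so no event disappears without a trace.
function route(event: BusEvent): string {
  const handler = handlers[event["detail-type"]];
  return handler ? handler(event.detail) : `dead-letter: ${event["detail-type"]}`;
}
```

In production each handler would typically be its own Lambda or microservice behind an EventBridge rule; the registry shape shown here is the in-process equivalent of that routing table.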
WHAT WE NEED IN YOU
- Solid experience with Apache Kafka for data streaming and event-driven systems
- Production experience with Kubernetes (EKS) and containerized deployments
- Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
- Proficient in TypeScript (Node.js environment)
- Experience with MongoDB or other NoSQL databases
- Familiarity with microservices architecture, async messaging, and DevOps practices
- AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus
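The async-messaging experience asked for above usually includes handling transient downstream failures when consuming from SQS or Kafka. A minimal retry-with-exponential-backoff helper, sketched under assumed parameters (attempt count and base delay are illustrative):

```typescript
// Illustrative retry helper for calls made from async consumers; the
// default attempt count and delay are assumptions, not a prescribed policy.

async function withRetry<T>(
  op: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastErr = err;
      // Exponential backoff: 100 ms, 200 ms, 400 ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastErr;
}
```

Wrapping a flaky downstream call as `withRetry(() => callService(msg))` keeps the consumer loop clean; in an SQS context the final throw would let the message return to the queue and eventually reach a dead-letter queue.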
Qualifications
- Graduate: BE/B.Tech or equivalent.
- 5 to 8 years of experience.
- Self-motivated quick learner with excellent problem-solving skills.
- A good team player with strong communication skills.
- Energy and passion for working in a startup environment.
Visit our website: https://www.trikatechnologies.com