50+ Apache Kafka Jobs in Bangalore (Bengaluru)
Apply to 50+ Apache Kafka Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Apache Kafka Job opportunities across top companies like Google, Amazon & Adobe.

Required Skillset
• Experience in Core Java 1.8 and above, Data Structures, OOP, Multithreading, Algorithms, Collections, System Design, Unix/Linux.
• Possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-volume server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem-solving and coding skills in Java.
• Strong interpersonal, communication and analytical skills.
• Should be able to express design ideas and thoughts clearly.
Job Brief:
• Understand product requirements and come up with solution approaches.
• Build and enhance large-scale, domain-centric applications.
• Deploy high-quality deliverables into production, adhering to security, compliance and SDLC guidelines.
Key Responsibilities:
- Design, develop, and maintain robust and scalable backend applications using Core Java and Spring Boot.
- Build and manage microservices-based architectures and ensure smooth inter-service communication.
- Integrate and manage real-time data streaming using Apache Kafka.
- Write clean, maintainable, and efficient code following best practices.
- Collaborate with cross-functional teams including QA, DevOps, and product management.
- Participate in code reviews and provide constructive feedback.
- Troubleshoot, debug, and optimize applications for performance and scalability.
Required Skills:
- Strong knowledge of Core Java (Java 8 or above).
- Hands-on experience with Spring Boot and the Spring ecosystem (Spring MVC, Spring Data, Spring Security).
- Experience in designing and developing RESTful APIs.
- Solid understanding of Microservices architecture and related patterns.
- Practical experience with Apache Kafka for real-time data processing (see the consumer sketch after this list).
- Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, or MongoDB.
- Good understanding of CI/CD tools and practices.
- Knowledge of containerization tools like Docker is a plus.
- Strong problem-solving skills and attention to detail.
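For orientation only (not part of the posting's requirements), here is a minimal sketch of the kind of Spring Boot Kafka consumer such a role involves. It assumes the spring-kafka starter is on the classpath and uses a hypothetical "orders" topic; broker addresses would come from application properties.

```java
// Minimal Spring Boot Kafka consumer sketch (illustrative only).
// Assumes spring-kafka on the classpath and spring.kafka.bootstrap-servers
// set in application properties; the "orders" topic is hypothetical.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class OrderStreamApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderStreamApplication.class, args);
    }
}

@Component
class OrderListener {
    // Called once per record arriving on the hypothetical "orders" topic.
    @KafkaListener(topics = "orders", groupId = "order-service")
    public void onMessage(String value) {
        System.out.println("Received order event: " + value);
    }
}
```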

Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises
Mandatory Skills:
- Java
- Kafka
- Spring Boot
- SQL / MySQL
- Algorithms
- Data Structures
- Candidates must be from a premier institute.
Key Responsibilities:
- Design and Develop large scale sub-systems.
- Periodically explore the latest technologies (especially open source) and prototype sub-systems.
- Be a part of the team that develops the next-gen Targeting platform.
- Build components to make the customer data platform more efficient and scalable.
Qualifications:
- 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes.
- Candidates with CGPA 9 or above will be preferred.
Skill Set:
- Good aptitude/analytical skills (emphasis will be on Algorithms, Data Structures, and Optimizations in addition to coding).
- Good knowledge of Databases - SQL, NoSQL.
- Knowledge of Unit Testing a plus.
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain.
- Excellent written and verbal communication skills.
- Active & contributing team member.
- Strong work ethic with demonstrated ability to meet and exceed commitments.
- Others: Experience working in a start-up is a plus.
Job Title: Tech Lead and SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project)
Experience: 4 to 12 years
Role Overview:
We are looking for a highly skilled Tech Lead with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.
Key Responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay updated with the latest cloud technologies, big data frameworks, and industry trends.
Required Skills & Qualifications:
- 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing (see the Streams sketch after this list).
- Experience with Azure Databricks (or willingness to learn and adopt it quickly).
- Hands-on experience with cloud platforms (Azure preferred, AWS or GCP is a plus).
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
- Strong analytical skills, problem-solving mindset, and ability to lead complex data projects.
- Excellent communication and stakeholder management skills.
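As an illustration of the Kafka Streams requirement above (a sketch under assumptions, not the project's actual pipeline): a minimal Java topology that reads a hypothetical "claims-raw" topic, drops empty records, and writes to "claims-clean". Schema Registry integration is omitted for brevity.

```java
// Illustrative Kafka Streams topology (Java DSL); topic names are hypothetical.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ClaimsFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "claims-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw claim events, drop empty payloads, forward the rest downstream.
        KStream<String, String> claims = builder.stream("claims-raw");
        claims.filter((key, value) -> value != null && !value.isEmpty())
              .to("claims-clean");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```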
Role & Responsibilities
Responsible for ensuring that the architecture and design of the platform remain top-notch with respect to scalability, availability, reliability and maintainability.
Act as a key technical contributor as well as a hands-on contributing member of the team.
Own end-to-end availability and performance of features, driving rapid product innovation while ensuring a reliable service.
Work closely with various stakeholders, such as Program Managers, Product Managers, the Reliability and Continuity Engineering (RCE) team and the QE team, to estimate and execute features/tasks independently.
Maintain and drive tech backlog execution for the non-functional requirements needed to keep the platform resilient.
Assist in release planning and prioritization based on technical feasibility and engineering constraints.
Continually find new ways to improve architecture and design while ensuring timely delivery and high quality.
WHO WE ARE
We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.
We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.
We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.
We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.
WHAT YOU’LL BE DOING
As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.
Here’s what your day-to-day will look like:
- Designing and Building Integrations
- Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.
- Cloud-Native Development
- Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.
- Managing Data Pipelines
- Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.
- Client Collaboration
- Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.
- Driving Platform Evolution
- Contribute to the ongoing improvement of our iPaaS platform—enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.
WHAT WE NEED IN YOU
- Solid Experience in Apache Kafka for data streaming and event-driven systems
- Production experience with Kubernetes (EKS) and containerized deployments
- Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
- Proficient in TypeScript (Node.js environment)
- Experience with MongoDB or other NoSQL databases
- Familiarity with microservices architecture, async messaging, and DevOps practices
- AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus
Qualification
- Graduate - B.E / B.Tech or equivalent.
- 5 to 8 years of experience
- Self-motivated and quick learner with excellent problem-solving skills.
- A good team player with good communication skills.
- Energy and real passion to work in a startup environment.
Visit our website - https://www.trikatechnologies.com
We’re looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.
Responsibilities:
- Lead the design of data warehouses, lakes, and ETL workflows.
- Collaborate with teams to gather requirements and build scalable solutions.
- Ensure data governance, security, and optimal performance of systems.
- Mentor junior engineers and drive end-to-end project delivery.
Requirements:
- 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
- Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
- Expertise in big data tools (e.g., Apache Spark, Kafka); a Spark-plus-Kafka sketch follows this listing.
- Excellent communication skills and leadership abilities.
Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.
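To illustrate the Spark-plus-Kafka combination named above (a minimal sketch, assuming the spark-sql-kafka-0-10 connector on the classpath and a hypothetical "events" topic, submitted via spark-submit):

```java
// Sketch: reading a Kafka topic with Spark Structured Streaming (Java API).
// Requires the spark-sql-kafka-0-10 connector; run via spark-submit.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToConsole {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-console")
                .getOrCreate();

        // Subscribe to a hypothetical "events" topic.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "events")
                .load();

        // Kafka keys/values arrive as binary; cast to strings and print each micro-batch.
        StreamingQuery query = events
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
                .writeStream()
                .format("console")
                .start();
        query.awaitTermination();
    }
}
```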
📍 Position : Java Architect
📅 Experience : 10 to 15 Years
🧑💼 Open Positions : 3+
📍 Work Location : Bangalore, Pune, Chennai
💼 Work Mode : Hybrid
📅 Notice Period : Immediate joiners preferred; up to 1 month maximum
🔧 Core Responsibilities :
- Lead architecture design and development for scalable enterprise-level applications.
- Own and manage all aspects of technical development and delivery.
- Define and enforce best coding practices, architectural guidelines, and development standards.
- Plan and estimate the end-to-end technical scope of projects.
- Conduct code reviews, ensure CI/CD, and implement TDD/BDD methodologies.
- Mentor and lead individual contributors and small development teams.
- Collaborate with cross-functional teams, including DevOps, Product, and QA.
- Engage in high-level and low-level design (HLD/LLD), solutioning, and cloud-native transformations.
🛠️ Required Technical Skills :
- Strong hands-on expertise in Java, Spring Boot, Microservices architecture
- Experience with Kafka or similar messaging/event streaming platforms
- Proficiency in cloud platforms – AWS and Azure (must-have)
- Exposure to frontend technologies (nice-to-have)
- Solid understanding of HLD, system architecture, and design patterns
- Good grasp of DevOps concepts, Docker, Kubernetes, and Infrastructure as Code (IaC)
- Agile/Lean development, Pair Programming, and Continuous Integration practices
- Polyglot mindset is a plus (Scala, Golang, Python, etc.)
🚀 Ideal Candidate Profile :
- Currently working in a product-based environment
- Already functioning as an Architect or Principal Engineer
- Proven track record as an Individual Contributor (IC)
- Strong engineering fundamentals with a passion for scalable software systems
- No compromise on code quality, craftsmanship, and best practices
🧪 Interview Process :
- Round 1: Technical pairing round
- Rounds 2 & 3: Technical rounds with panel (code pairing + architecture)
- Final Round: HR and offer discussion
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.
Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

Role & Responsibilities
Lead the design, development, and deployment of complex, scalable, reliable, and highly available features for world-class SaaS products and services.
Guide the engineering team in adopting best practices for software development, code quality, and architecture.
Make strategic architectural and technical decisions, ensuring the scalability, security, and performance of software applications.
Proactively identify, prioritize, and address technical debt to improve system performance, maintainability, and long-term scalability, ensuring a solid foundation for future development.
Collaborate with cross-functional teams (product managers, designers, and stakeholders) to define project scope, requirements, and timelines.
Mentor and coach team members, providing technical guidance and fostering professional development.
Oversee code reviews, ensuring adherence to best practices and maintaining high code quality standards.
Drive continuous improvement in development processes, tools, and technologies to increase team productivity and product quality.
Stay updated with the latest industry trends and emerging technologies to drive innovation and keep the team at the cutting edge.
Ensure project timelines and goals are met, managing risks and resolving any technical challenges that arise during development.
Foster a collaborative and inclusive team culture, promoting open communication and problem-solving.
Imbibe and maintain a strong customer delight attitude while designing and building products.
What we Require
We are recruiting technical experts with the following core skills and hands-on experience:
Mandatory skills: Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot
Hands-on experience with: Kafka, Redis, SQL, Docker, Kubernetes
Expert proficiency in designing both producer and consumer types of REST services.
Expert proficiency in unit testing and code quality tools.
Expert proficiency in ensuring code coverage.
Expert proficiency in understanding High-Level Design and translating it into Low-Level Design.
Hands-on experience working with NoSQL databases.
Experience working in an Agile development process (Scrum).
Experience working closely with engineers and software development cultures.
Ability to think at a high level about product strategy and customer journeys.
Ability to produce low-level designs on the premise that journeys will be extensible in the future, and to translate that into components that can be easily extended and reused.
Excellent communication skills to clearly articulate design decisions.
JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.
Mon-Fri, In office role with excellent perks and benefits!
Key Responsibilities:
1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.
2. Build and implement scalable and robust microservices and integrate API gateways.
3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).
4. Implement real-time data pipelines using Kafka (a consumer sketch follows the skills lists below).
5. Collaborate with front-end developers to ensure seamless integration of backend services.
6. Write clean, reusable, and efficient code following best practices, including design patterns.
7. Troubleshoot, debug, and enhance existing systems for improved performance.
Mandatory Skills:
1. Proficiency in at least one backend technology: Node.js, Python, or Java.
2. Strong experience in:
i. Microservices architecture,
ii. API gateways,
iii. NoSQL databases (e.g., MongoDB, DynamoDB),
iv. Kafka
v. Data structures (e.g., arrays, linked lists, trees).
3. Frameworks:
i. If Java: Spring framework for backend development.
ii. If Python: FastAPI/Django frameworks for AI applications.
iii. If Node: Express.js for Node.js development.
Good to Have Skills:
1. Experience with Kubernetes for container orchestration.
2. Familiarity with in-memory databases like Redis or Memcached.
3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.
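For illustration of the Kafka pipeline work mentioned above (a sketch with hypothetical topic and group names, not JioTesseract's actual code), a plain Java consumer poll loop:

```java
// Sketch of a plain Java Kafka consumer poll loop (illustrative only).
// Topic and group id are hypothetical; uses the standard kafka-clients API.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PipelineConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "pipeline-consumer");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("telemetry"));
            while (true) {
                // Block up to one second waiting for new records.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```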
Architect
Experience - 12+ yrs
About Wekan Enterprise Solutions
Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IOT and Cloud environments, we have an extensive track record helping Fortune 500 companies modernize their most critical legacy and on-premise applications, migrating them to the cloud and leveraging the most cutting-edge technologies.
Job Description
We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced and challenging environments leading technical teams, designing system architecture and reviewing peer code. Interested in constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?
You will be working on complex data migrations, modernizing legacy applications and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers’ technical teams and MongoDB Solutions Architects.
Location - Chennai or Bangalore
● Relevant experience of 12+ years building high-performance applications, with at least 3 years as an architect.
● Good problem-solving skills
● Strong mentoring capabilities
● Good understanding of software development life cycle
● Strong experience in system design and architecture
● Strong focus on quality of work delivered
● Excellent verbal and written communication skills
Required Technical Skills
● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.
● Strong experience with appropriate framework(s).
● Well-versed in monolithic and microservices architecture.
● Hands-on experience with data modeling on MongoDB and any other Relational or NoSQL databases
● Experience working with 3rd party integrations ranging from authentication, cloud services, etc.
● Hands-on experience with Kafka or RabbitMQ.
● Hands-on experience with CI/CD pipelines and at least one cloud provider: AWS / GCP / Azure
● Strong experience writing and maintaining clear documentation
Good to have skills:
● Experience working with frontend technologies - React.js, Vue.js or Angular.
● Extensive experience consulting with customers directly for defining architecture or system design.
● Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies
We are seeking a skilled Java Developer with 5+ years of experience in Java, Camunda, Apache Camel, Kafka, and Apache Karaf. The ideal candidate should have expertise in workflow automation, message-driven architectures, and enterprise integration patterns. Strong problem-solving skills and hands-on experience in microservices and event-driven systems are required.
Mandatory Skills:
- AZ-104 (Azure Administrator) experience
- CI/CD migration expertise
- Proficiency in Windows deployment and support
- Infrastructure as Code (IaC) in Terraform
- Automation using PowerShell
- Understanding of SDLC for C# applications (build/ship/run strategy)
- Apache Kafka experience
- Azure web app
Good to Have Skills:
- AZ-400 (Azure DevOps Engineer Expert)
- AZ-700 Designing and Implementing Microsoft Azure Networking Solutions
- Apache Pulsar
- Windows containers
- Active Directory and DNS
- SAST and DAST tool understanding
- MSSQL database
- Postgres database
- Azure security
Dear Candidate,
We are urgently hiring an AWS Cloud Engineer for the Bangalore location.
Position: AWS Cloud Engineer
Location: Bangalore
Experience: 8-11 yrs
Skills: Aws Cloud
Salary: Best in Industry (20-25% Hike on the current ctc)
Note:
Only immediate to 15-day joiners will be preferred.
Only candidates from Tier 1 companies will be shortlisted and selected.
Candidates with a notice period of more than 30 days will be rejected during screening.
Offer shoppers will be rejected.
Job description:
Description:
Title: AWS Cloud Engineer
Prefer BLR / HYD – else any location is fine
Work Mode: Hybrid – based on HR rule (currently 1 day per month)
Shift Timings 24 x 7 (Work in shifts on rotational basis)
Total Experience: 8+ years, with at least 5 years of relevant experience required.
Must have- AWS platform, Terraform, Redshift / Snowflake, Python / Shell Scripting
Experience and Skills Requirements:
Experience:
8 years of experience in a technical role working with AWS
Mandatory
Technical troubleshooting and problem solving
AWS management of large-scale IaaS/PaaS solutions
Cloud networking and security fundamentals
Experience using containerization in AWS
Working data warehouse knowledge (Redshift and Snowflake preferred)
Working with IaC – Terraform and CloudFormation
Working understanding of scripting languages including Python and Shell
Collaboration and communication skills
Highly adaptable to changes in a technical environment
Optional
Experience using monitoring and observability toolsets, incl. Splunk and Datadog
Experience using GitHub Actions
Experience using AWS RDS/SQL-based solutions
Experience working with streaming technologies, incl. Kafka and Apache Flink
Experience working with ETL environments
Experience working with the Confluent Cloud platform
Certifications:
Minimum
AWS Certified SysOps Administrator – Associate
AWS Certified DevOps Engineer - Professional
Preferred
AWS Certified Solutions Architect – Associate
Responsibilities:
Responsible for technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a shared managed service.
The following is a list of expected responsibilities:
To manage and support a customer’s AWS platform
To be technical hands on
Provide Incident and Problem management on the AWS IaaS and PaaS Platform
Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner
Actively monitor an AWS platform for technical issues
To be involved in the resolution of technical incident tickets
Assist in the root cause analysis of incidents
Assist with improving efficiency and processes within the team
Examining traces and logs
Working with third party suppliers and AWS to jointly resolve incidents
Good to have:
Confluent Cloud
Snowflake
Best Regards,
Minakshi Soni
Executive - Talent Acquisition (L2)
Rigel Networks
Worldwide Locations: USA | HK | IN
- Strong knowledge of Kafka development and architecture (see the producer sketch after this list).
- Hands-on experience with the KSQL database (ksqlDB).
- Very good communication, analytical and problem-solving skills.
- Proven hands-on development experience with Kafka platforms, Lenses and Confluent.
- Strong knowledge of the Kafka Connect framework.
- Very comfortable with shell scripting and Linux commands.
- Experience with the DB2 database.
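By way of example only (hypothetical topic name; not part of the requirement list), a minimal plain-Java Kafka producer of the sort this role's development work builds on:

```java
// Minimal plain-Java Kafka producer sketch; the "events" topic is hypothetical.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // try-with-resources closes the producer, flushing any buffered records.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "key-1", "hello, kafka"));
        }
    }
}
```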
Job description:
- Hands-on skills in the Java programming language
- Experience testing cloud-native applications, with exposure to Kafka
- Understanding of the concepts of Kubernetes (K8s), caching, REST/gRPC and observability
- Experience with good programming or scripting practices and tools: code review, ADO/Jenkins, etc.
- Apply expertise in Java, API testing, Cucumber or other test frameworks to design, develop and maintain automation test suites.
- Intimate familiarity with QA concepts: white-/black-/grey-box testing, acceptance/regression tests, system integration tests, performance/stress tests, and security tests
As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, that empower highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team who shares your passion for building, configuring, and supporting platforms while sharing, learning and growing together.
- Develop and recommend improvements to standard and moderately complex application support processes and procedures.
- Review, analyze and prioritize incoming incident tickets and user requests.
- Perform programming, configuration, testing and deployment of fixes or updates for application version releases.
- Implement security processes to protect data integrity and ensure regulatory compliance.
- Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs.
MINIMUM QUALIFICATIONS
- Minimum 2-4 years of experience
- Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning (see the AdminClient sketch after this list)
- Full-ecosystem Kafka administration (Kafka, ZooKeeper, kafka-rest, Connect)
- Experience implementing Kerberos security
- Preferred:
- Experience in Linux system administration
- Authentication plugin experience such as basic, SSL, and Kerberos
- Production incident support including root cause analysis
- AWS EC2
- Terraform
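As a small illustration of day-to-day cluster administration (a sketch, assuming an unauthenticated local broker; secured clusters like those mentioned above would add Kerberos/SSL client properties):

```java
// Sketch: inspecting a cluster with Kafka's AdminClient (illustrative only).
// Assumes an unauthenticated local broker; secured clusters would also need
// SASL/Kerberos or SSL client properties.
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class ClusterCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // List every topic visible to this client.
            admin.listTopics().names().get().forEach(System.out::println);
            // Count the brokers currently registered in the cluster.
            int brokers = admin.describeCluster().nodes().get().size();
            System.out.println("Brokers online: " + brokers);
        }
    }
}
```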
Radisys Corporation is looking for JAVA Backend developers with 6-10 years of experience for their Bangalore location.
The ideal candidate will be able to design and develop code for tasks after brainstorming sessions and applying best practices and coding conventions.
This position requires experience in Java, Spring, Spring Boot, microservices, message broker, and DB knowledge. Candidates should be skilled in developing enterprise applications that consist of FE, BE, and DB integration.
If you have experience with Docker and Kubernetes, that's an added advantage.
Radisys Corporation, a global leader in open telecom solutions, enables service providers to drive disruption with new open architecture business models. Our innovative technology solutions leverage open reference architectures and standards, combined with open software and hardware, to power business transformation for the telecom industry. Our services organization delivers systems integration expertise necessary to solve complex deployment challenges for communications and content providers.
Job Overview:
We are looking for a Lead Engineer - Java with a strong background in Java development and hands-on experience with J2EE, Spring Boot, Kubernetes, Microservices, NoSQL, and SQL. As a Lead Engineer, you will be responsible for designing and developing high-quality software solutions and ensuring the successful delivery of projects. This is a full-time role requiring 7 to 10 years of experience, based in Bangalore, Karnataka, India, with excellent growth opportunities.
Qualifications and Skills :
- Bachelor's or master's degree in Computer Science or a related field
- Strong knowledge of Core Java, J2EE, and Spring Boot frameworks
- Hands-on experience with Kubernetes and microservices architecture
- Experience with NoSQL and SQL databases
- Proficient in troubleshooting and debugging complex system issues
- Experience in Enterprise Applications
- Excellent communication and leadership skills
- Ability to work in a fast-paced and collaborative environment
- Strong problem-solving and analytical skills
Roles and Responsibilities :
- Work closely with product management and cross-functional teams to define requirements and deliverables
- Design scalable and high-performance applications using Java, J2EE, and Spring Boot
- Develop and maintain microservices using Kubernetes and containerization
- Design and implement data models using NoSQL and SQL databases
- Ensure the quality and performance of software through code reviews and testing
- Collaborate with stakeholders to identify and resolve technical issues
- Stay up-to-date with the latest industry trends and technologies
Egen is a data engineering and cloud modernization firm helping industry-leading companies achieve digital breakthroughs and deliver for the future, today. We are catalysts for change who create digital breakthroughs at warp speed. Our team of cloud and data engineering experts are trusted by top clients in pursuit of the extraordinary. An Inc. 5000 Fastest Growing Company 7 times, and recently recognized on the Crain’s Chicago Business Fast 50 list, Egen has also been recognized as a great place to work 3 times.
You will join a team of insatiably curious data engineers, software architects, and product experts who never settle for "good enough". Our Java Platform team's tech stack is based on Java 8 (Spring Boot) and RESTful web services. We typically build and deploy applications as cloud-native Kubernetes microservices and integrate with scalable technologies such as Kafka in Docker container environments. Our developers work in an agile process to efficiently deliver high-value, data-driven applications and product packages.
Required Experience:
- Minimum of a Bachelor’s Degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering or a related field.
- Experience working with, and a strong understanding of, object-oriented programming and cloud technologies
- End-to-end experience delivering production-ready code with Java 8, Spring Boot, Spring Data, and API libraries
- Strong experience with unit and integration testing of Spring Boot APIs.
- Strong understanding and production experience of RESTful APIs and microservice architecture.
- Strong understanding of SQL and NoSQL databases, and experience writing abstraction layers to communicate with them.
Nice-to-haves (but not required):
- Exposure to Kotlin or other JVM programming languages
- Strong understanding and production experience working with Docker container environments
- Strong understanding and production experience working with Kafka
- Cloud Environments: AWS, GCP or Azure


Job Description: Full Stack Developer
Company: Arroz Technology Private Limited
CTC: 5 LPA
Location: Bangalore (Onsite)
Responsibilities:
- Design and develop scalable and high-performance web applications using the MERN (MongoDB, Express.js, React.js, Node.js) stack.
- Collaborate with cross-functional teams to gather requirements and translate them into high-level designs.
- Write clean, reusable, and well-structured code following industry best practices and coding standards.
- Mentor and guide junior developers, providing technical expertise and promoting professional growth.
- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to standards.
- Collaborate with frontend and backend developers to integrate components and ensure smooth data flow.
- Work with UI/UX designers to implement responsive and user-friendly interfaces.
- Stay updated with the latest trends and advancements in full-stack development technologies.
- Work in a 10 AM to 6 PM, six-day office role, maintaining regular attendance and punctuality.
Required Skills and Qualifications:
- Strong proficiency in MERN (MongoDB, Express.js, React.js, Node.js) stack development.
- Experience with Redux or similar state management libraries.
- Solid understanding of front-end technologies such as HTML, CSS, and JavaScript.
- Proficiency in RESTful API development and integration.
- Familiarity with version control systems like Git and agile development methodologies.
- Good problem-solving and debugging skills.
- Excellent communication and teamwork abilities.
- Bachelor's degree in Computer Science or a related field (preferred).
Join Arroz Technology Private Limited as a Full Stack Developer and contribute to the development of cutting-edge web applications. This role offers competitive compensation and growth opportunities within a dynamic work environment.
Job Description:
Organization - Prolifics Corporation
Skill - Java developer
Job type - Full time/Permanent
Location - Bangalore/Mumbai
Experience - 5 to 10 Years
Notice Period – Immediate to 30 Days
Required Skillset:
Spring framework concepts, Spring Boot (Mandatory)
Spring Batch and dashboard
Apache Kafka (Mandatory)
Azure (Mandatory)
GIT / Maven / Gradle / CI/CD
MS SQL database
Cloud and data exposure
Docker, orchestration using Kubernetes
Genesys Cloud or any cloud-based contact center platform that can be used to manage customer interactions.
Technical Experience:
The candidate should have 5+ years of experience, preferably at a technology or financial firm.
Must have at least 2-3 years of experience in Spring Batch / Java / Kafka / SQL.
Must have hands-on experience with database tools and technologies.
Must have exposure to CI/CD and the cloud.
Work scope:
Build the Spring Batch framework to pull the required data from Genesys Cloud to MS reporting data storage (on-prem / cloud).
Build the MS WM Contact Center Data Hub (on-prem / cloud).
Build a dashboard to monitor and manage the data ingestion and fusion jobs.
Event Bridge implementation for real-time data ingestion and monitoring.
MS Private Cloud.
- Big data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, data integration, and implementing enterprise-level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include Name Node, Data Node, Resource Manager, Node Manager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to the Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics, installed on top of Hadoop, and performed advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
- Used the Spark DataFrames API over the Cloudera platform to perform analytics on Hive data.
- Hands-on experience with MLlib from Spark, used for predictive intelligence and customer segmentation, and for smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Worked on creating data pipelines for different events of ingestion and aggregation, loading consumer response data into Hive external tables in HDFS locations to serve as feeds for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience working with different ETL tool environments like SSIS and Informatica, and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in the Microsoft cloud and setting up clusters in Amazon EC2 & S3, including automation of setting up and extending the clusters in the AWS Amazon cloud.
- Extensively worked on Spark using Python on clusters for computational analytics, installed on top of Hadoop, and performed advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad-hoc queries using Cloudera Impala; also used Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on the DataFrames using Python.
- In-depth understanding/knowledge of Hadoop architecture and various components such as HDFS, the MapReduce programming paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided the access needed for pulling information for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Roles and Responsibilities:
- Proven experience in Java 8, Spring Boot, Microservices and API
- Strong experience with Kafka, Kubernetes
- Strong experience in using RDBMS (MySQL) and NoSQL.
- Experience working in Eclipse or Maven environments
- Hands-on experience in Unix and Shell scripting
- Hands-on experience in fine-tuning application response and performance testing.
- Experience in web services.
- Strong analysis and problem-solving skills
- Strong communication skills, both verbal and written
- Ability to work independently with limited supervision
- Proven ability to use own initiative to resolve issues
- Full ownership of projects and tasks
- Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines
- Eagerness to learn
- Strong team-working skills


EXPERTISE AND QUALIFICATIONS
- 14+ years of experience in Software Engineering, with at least 6 years as a Lead Enterprise Architect, preferably in a software product company
- High technical credibility - ability to lead technical brainstorming, take decisions and push for the best solution to a problem
- Experience in architecting microservices-based E2E enterprise applications
- Experience in UI technologies such as Angular or Node.js, or full-stack technology, is desirable
- Experience with NoSQL technologies (MongoDB, Neo4j etc.)
- Elasticsearch, Kibana, Logstash (the ELK stack).
- Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
- Exposure to SaaS cloud-based platforms.
- Experience on Docker, Kubernetes etc.
- Experience in planning, designing, developing and delivering Enterprise Software using Agile Methodology
- Key Programming Skills: Java, J2EE with cutting edge technologies
- Hands-on technical leadership with proven ability to recruit and mentor high-performing talent, including Architects, Technical Leads and Developers
- Excellent team building, mentoring and coaching skills are a must-have
- A proven track record of consistently setting and achieving high standards
Five Reasons Why You Should Join Zycus
1. Cloud Product Company: We are a Cloud SaaS Company, and our products are created by using the latest technologies like ML and AI. Our UI is in Angular JS and we are developing our mobile apps using React.
2. A Market Leader: Zycus is recognized by Gartner (world’s leading market research analyst) as a Leader in Procurement Software Suites.
3. Move between Roles: We believe that change leads to growth and therefore we allow our employees to shift careers and move to different roles and functions within the organization
4. Get a Global Exposure: You get to work and deal with our global customers.
5. Create an Impact: Zycus gives you the environment to create an impact on the product and transform your ideas into reality. Even our junior engineers get the opportunity to work on different product features.
About Us
Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source to Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C type user-experience to the end-users.
Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-
to-use user interface ensures high adoption and value across the organization.
Start your #CognitiveProcurement journey with us, as you are #MeantforMore
Role: IoT Application Development (Java)
Skill Set:
- Proficiency in Java 11.
- Strong knowledge of Spring Boot framework.
- Experience with Kubernetes.
- Familiarity with Kafka.
- Understanding of Azure Cloud services.
Experience: 3 to 5 years; Location: Bangalore; Notice Period: Immediate joiners
- Job Description: We are seeking an experienced IoT Application Developer with expertise in Java to join our team in Bangalore. As a Java Developer, you will be responsible for designing, developing, and deploying IoT applications. You should have a solid understanding of Java 11 and the Spring Boot framework. Experience with Kubernetes and Kafka is also required. Familiarity with Azure Cloud services is essential. Your role will involve collaborating with the development team to build scalable and efficient IoT solutions using Java and related technologies.


About Merchandise Operations (Merch Ops): Merchandise Operations (Merch Ops) is a merchandise management system. It is positioned as a host system in retail solutions and has the ability to maintain master/foundation data, create and manage purchase orders, create and manage prices & promotions, perform replenishment, and provide effective inventory control and financial management. Merch Ops provides business users with consistent, accurate, and timely data across an enterprise by allowing them to get the:
Right Goods in the...
• Right Silhouettes, Sizes and Colors; at the...
• Right Price; at the...
• Right Location; for the...
• Right Consumer; at the...
• Right Time; at the...
• Right Quantity.
About Team:
• Proven, passionate bunch of disruptors providing solutions that solve real-time supply chain problems.
• A well-mixed team of young and experienced members, with product, domain, and industry knowledge.
• Gained expertise in designing and deploying massively scalable cloud-native SaaS products.
• The team currently comprises associates across the globe and is expected to grow rapidly.
Our current technical environment:
• Software: React JS, Node JS, Oracle PL/SQL, GIT, Rest API, JavaScript.
• Application Architecture: Scalable three-tier web application.
• Cloud Architecture: Private cloud, MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
• Frameworks/Others: Apache Tomcat, RDBMS, Jenkins, Nginx, TypeORM, Express.
What you'll be doing:
• As a Staff Engineer you will be responsible for the design of the features in the product roadmap
• Creating and encouraging good software development practices engineering-wide, driving strategic technical improvements, and mentoring other engineers.
• You will write code as we expect our technical leadership to be in the trenches alongside junior engineers, understanding root causes and leading by example
• You will mentor engineers
• You will own relationships with other engineering teams and collaborate with other functions within Blue Yonder
• Drive architecture and designs to become simpler, more robust, and more efficient.
• Lead design discussions and come up with robust and more efficient designs to achieve features in the product roadmap
• Take complete responsibility for the features developed, right from coding through to deployment
• Introduce new technology and tools for the betterment of the product
• Guides fellow engineers to look beyond the surface and fix the root causes rather than symptoms.
What we are looking for:
• Bachelor’s degree (B.E/B.Tech/M.Tech in Computer Science or a related specialization) and a minimum of 7 to 10 years of experience in software development, having been an Architect within the last 1-2 years at minimum.
• Strong programming experience and background in Node JS and React JS.
• Hands-on development skills along with architecture/design experience.
• Hands-on experience designing, building, deploying and maintaining enterprise cloud solutions.
• Demonstrable experience, thorough knowledge of, and interest in cloud-native architecture, distributed microservices, multi-tenant SaaS solutions, cloud scalability, performance, and high availability
• Experience with API management platforms & providing / consuming RESTful APIs
• Experience with varied tools such as REST, Hibernate, RDBMS, Docker, Kubernetes, Kafka, React.
• Hands-on development experience on Oracle PL/SQL.
• Experience with DevOps and infrastructure automation.
- 2.5+ years of experience in development in Java technology.
- Strong Java Basics
- SpringBoot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queues (RabbitMQ or Kafka)
- Microservices
- Any Caching Mechanism
- Good at problem solving
Good to Have Skills:
- 3+ years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem solving skills.
- Ability to work in a fast paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders building and managing BlueYonder's technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor other people in the team on architecture and design in a hands-on manner. You are responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished Architects) a ground-up, cloud-native (we use Azure) SaaS product in order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE) and UK and is expected to grow rapidly. The incumbent will need leadership qualities to also mentor junior and mid-level software associates in our team. This person will lead the Data Platform architecture – streaming and bulk, with Snowflake / Elastic Search / other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, GIT, Hibernate, Rest API, OAuth, Snowflake
· Application Architecture: Scalable, resilient, event-driven, secure multi-tenant microservices architecture
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, GIT, Ignite
Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
What you get to do in this role:
Work on extremely high-scale Rust web services or backend systems.
Design and develop solutions for highly scalable web and backend systems.
Proactively identify and solve performance issues.
Maintain a high bar on code quality and unit testing.
What you bring to the role:
5+ years of hands-on software development experience.
At least 2+ years of Rust development experience.
Knowledge of Cargo crates for Kafka, Redis, etc.
Strong CS fundamentals, including system design, data structures and algorithms.
Expertise in backend and web services development.
Good analytical and troubleshooting skills.
What will help you stand out:
Experience working with large scale web services and applications.
Exposure to Golang, Scala or Java
Exposure to Big data systems like Kafka, Spark, Hadoop etc.
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its kind-innovations like time-state analytics and AI-automated data modeling, that surfaces actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
As Conviva is expanding, we are building products providing deep insights into end-user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real-time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets – both Spark-like backend infra and library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.
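To make the stream-processing theme concrete, here is a minimal Kafka Streams sketch that counts events per key in one-second tumbling windows. It is only an illustration of windowed time-series aggregation: the team's actual system is a Spark-like engine, and the topic, application id, and broker address below are placeholders.

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.TimeWindows;

public class EventRateJob {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Toy stand-in for sub-second time-series aggregation over a stream:
        // count events per key in tumbling one-second windows.
        builder.stream("events", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey()
               .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofSeconds(1)))
               .count()
               .toStream()
               .foreach((windowedKey, count) ->
                       System.out.printf("%s -> %d%n", windowedKey, count));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-rate-job");    // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        new KafkaStreams(builder.build(), props).start();
    }
}
```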
What You’ll Do
This is an individual contributor position. Expectations will be on the below lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva’s products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixes in Scala and Java on a Jenkins-based pipeline, deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements, etc.
- Lead a team to develop a feature or parts of a product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 5+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of Computer Science fundamentals like algorithms and data structures. Hands-on with functional programming and a working knowledge of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of the Akka/Lagom frameworks and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event streaming, CQRS, and DDD for building large microservice architectures will be a big plus (a minimal CQRS sketch follows this list)
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
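For candidates new to the CQRS/event-streaming patterns mentioned above, a toy, framework-free sketch of the idea in Java 17 (all names are hypothetical; a production system would publish events to Kafka rather than to an in-process consumer):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Consumer;

// The event: an immutable fact emitted by the write side.
record AccountCredited(String accountId, long amountCents) {}

// Write side: validates commands and emits events; it never serves queries.
class CommandHandler {
    private final Consumer<AccountCredited> eventBus; // stand-in for a Kafka topic

    CommandHandler(Consumer<AccountCredited> eventBus) { this.eventBus = eventBus; }

    void credit(String accountId, long amountCents) {
        if (amountCents <= 0) throw new IllegalArgumentException("amount must be positive");
        eventBus.accept(new AccountCredited(accountId, amountCents));
    }
}

// Read side: a projection consumes events and maintains a query-optimized view.
class BalanceProjection {
    private final Map<String, Long> balances = new ConcurrentHashMap<>();

    void on(AccountCredited e) { balances.merge(e.accountId(), e.amountCents(), Long::sum); }

    long balanceCents(String accountId) { return balances.getOrDefault(accountId, 0L); }
}
```

Wiring is one line, e.g. `new CommandHandler(projection::on)`: every event the command side emits flows into the read-side projection, which is exactly the command/query separation CQRS is after.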
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their business ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include: DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
Required Skills:
- 3+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development (see the sketch after this list)
- Messaging queue (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
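A minimal sketch touching two of the skills above: a REST endpoint backed by a cached lookup. Spring Boot is assumed; the path, the `Product` type, and the service are invented for the example, and `@EnableCaching` on a configuration class is required for the cache to be active.

```java
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// REST API development: a single read endpoint.
@RestController
class ProductController {
    private final ProductService products;

    ProductController(ProductService products) { this.products = products; }

    @GetMapping("/products/{id}")
    Product byId(@PathVariable long id) { return products.find(id); }
}

// "Any caching mechanism": repeated lookups for the same id hit the cache.
@Service
class ProductService {
    @Cacheable("products")
    Product find(long id) { return new Product(id, "sample"); } // a DB call in real life
}

record Product(long id, String name) {}
```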
Good to Have Skills:
- 4+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.

Have you streamed a program on Disney+, watched your favorite binge-worthy series on Peacock or cheered your favorite team on during the World Cup from one of the 20 top streaming platforms around the globe? If the answer is yes, you’ve already benefitted from Conviva technology, helping the world’s leading streaming publishers deliver exceptional streaming experiences and grow their businesses.
Conviva is the only global streaming analytics platform for big data that collects, standardizes, and puts trillions of cross-screen, streaming data points in context, in real time. The Conviva platform provides comprehensive, continuous, census-level measurement through real-time, server-side sessionization at unprecedented scale. If this sounds important, it is! We measure a global footprint of more than 500 million unique viewers in 180 countries watching 220 billion streams per year across 3 billion applications streaming on devices. With Conviva, customers get a unique level of actionability and scale from continuous streaming measurement insights and benchmarking across every stream, every screen, every second.
As Conviva is expanding, we are building products providing deep insights into end-user experience for our customers.
Platform and TLB Team
The vision for the TLB team is to build data processing software that works on terabytes of streaming data in real time. Engineer the next-gen Spark-like system for in-memory computation of large time-series datasets: both Spark-like backend infra and a library-based programming model. Build a horizontally and vertically scalable system that analyses trillions of events per day within sub-second latencies. Utilize the latest and greatest big data technologies to build solutions for use cases across multiple verticals. Lead technology innovation and advancement that will have a big business impact for years to come. Be part of a worldwide team building software using the latest technologies and the best of software development tools and processes.
What You’ll Do
This is an individual contributor position. Expectations will be on the below lines:
- Design, build and maintain the stream processing, and time-series analysis system which is at the heart of Conviva's products
- Responsible for the architecture of the Conviva platform
- Build features, enhancements, new services, and bug fixes in Scala and Java on a Jenkins-based pipeline, deployed as Docker containers on Kubernetes
- Own the entire lifecycle of your microservice including early specs, design, technology choice, development, unit-testing, integration-testing, documentation, deployment, troubleshooting, enhancements, etc.
- Lead a team to develop a feature or parts of the product
- Adhere to the Agile model of software development to plan, estimate, and ship per business priority
What you need to succeed
- 9+ years of work experience in software development of data processing products.
- Engineering degree in software or equivalent from a premier institute.
- Excellent knowledge of Computer Science fundamentals like algorithms and data structures. Hands-on with functional programming and a working knowledge of its concepts
- Excellent programming and debugging skills on the JVM. Proficient in writing code in Scala/Java/Rust/Haskell/Erlang that is reliable, maintainable, secure, and performant
- Experience with big data technologies like Spark, Flink, Kafka, Druid, HDFS, etc.
- Deep understanding of distributed systems concepts and scalability challenges including multi-threading, concurrency, sharding, partitioning, etc.
- Experience/knowledge of the Akka/Lagom frameworks and/or stream processing technologies like RxJava or Project Reactor will be a big plus. Knowledge of design patterns like event streaming, CQRS, and DDD for building large microservice architectures will be a big plus
- Excellent communication skills. Willingness to work under pressure. Hunger to learn and succeed. Comfortable with ambiguity. Comfortable with complexity
Underpinning the Conviva platform is a rich history of innovation. More than 60 patents represent award-winning technologies and standards, including first-of-its-kind innovations like time-state analytics and AI-automated data modeling that surface actionable insights. By understanding real-world human experiences and having the ability to act within seconds of observation, our customers can solve business-critical issues and focus on growing their businesses ahead of the competition. Examples of the brands Conviva has helped fuel streaming growth for include DAZN, Disney+, HBO, Hulu, NBCUniversal, Paramount+, Peacock, Sky, Sling TV, Univision, and Warner Bros Discovery.
Privately held, Conviva is headquartered in Silicon Valley, California with offices and people around the globe. For more information, visit us at www.conviva.com. Join us to help extend our leadership position in big data streaming analytics to new audiences and markets!
Job Title: Data Engineer
Job Summary: As a Data Engineer, you will be responsible for designing, building, and maintaining the infrastructure and tools necessary for data collection, storage, processing, and analysis. You will work closely with data scientists and analysts to ensure that data is available, accessible, and in a format that can be easily consumed for business insights.
Responsibilities:
- Design, build, and maintain data pipelines to collect, store, and process data from various sources (a minimal sketch follows this list).
- Create and manage data warehousing and data lake solutions.
- Develop and maintain data processing and data integration tools.
- Collaborate with data scientists and analysts to design and implement data models and algorithms for data analysis.
- Optimize and scale existing data infrastructure to ensure it meets the needs of the business.
- Ensure data quality and integrity across all data sources.
- Develop and implement best practices for data governance, security, and privacy.
- Monitor data pipeline performance and errors, and troubleshoot issues as needed.
- Stay up-to-date with emerging data technologies and best practices.
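For flavor, a hypothetical single pipeline step over plain JDBC: extract recently updated rows, apply a trivial transform, and batch-load them into a warehouse table. Connection URLs, table names, and columns below are placeholders, not from this posting.

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Timestamp;

public class OrdersPipelineStep {
    public static void main(String[] args) throws SQLException {
        String srcUrl = "jdbc:postgresql://source-host/app";   // placeholder
        String dstUrl = "jdbc:postgresql://warehouse-host/dw"; // placeholder
        Timestamp lastRun = Timestamp.valueOf("2024-01-01 00:00:00");

        try (Connection src = DriverManager.getConnection(srcUrl);
             Connection dst = DriverManager.getConnection(dstUrl);
             PreparedStatement read = src.prepareStatement(
                     "SELECT id, amount_cents FROM orders WHERE updated_at > ?");
             PreparedStatement write = dst.prepareStatement(
                     "INSERT INTO orders_fact (id, amount_usd) VALUES (?, ?)")) {

            read.setTimestamp(1, lastRun); // extract: only rows since the last run
            try (ResultSet rs = read.executeQuery()) {
                while (rs.next()) {
                    write.setLong(1, rs.getLong("id"));
                    // Transform: integer cents -> decimal dollars.
                    write.setBigDecimal(2, BigDecimal.valueOf(rs.getLong("amount_cents"), 2));
                    write.addBatch();
                }
            }
            write.executeBatch(); // load
        }
    }
}
```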
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience with ETL tools like Matillion, SSIS, or Informatica.
- Experience with SQL and relational databases such as SQL Server, MySQL, PostgreSQL, or Oracle.
- Experience writing complex SQL queries.
- Strong programming skills in languages such as Python, Java, or Scala.
- Experience with data modeling, data warehousing, and data integration.
- Strong problem-solving skills and ability to work independently.
- Excellent communication and collaboration skills.
- Familiarity with big data technologies such as Hadoop, Spark, or Kafka.
- Familiarity with data warehouse/data lake technologies like Snowflake or Databricks.
- Familiarity with cloud computing platforms such as AWS, Azure, or GCP.
- Familiarity with reporting tools.
Teamwork / Growth Contribution:
- Helping the team conduct interviews and identify the right candidates
- Adhering to timelines
- Timely status communication and upfront communication of any risks
- Teach, train, and share knowledge with peers.
- Good Communication skills
- Proven abilities to take initiative and be innovative
- Analytical mind with a problem-solving aptitude
Good to Have:
- Master's degree in Computer Science, Information Systems, or a related field.
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Familiarity with data visualization and business intelligence tools such as Tableau or Power BI.
- Knowledge of machine learning and statistical modeling techniques.
If you are passionate about data and want to work with a dynamic team of data scientists and analysts, we encourage you to apply for this position.
- 3+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queue (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
Good to Have Skills:
- 4+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
Required Education:
B.Tech/BE in Computer Science, IT, or Electronics only
Required Skills:
- 2+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queue (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
Good to Have Skills:
- 4+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.

CTC Budget: 35-55LPA
Location: Hyderabad (Remote after 3 months WFO)
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad, providing services that maximize product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.
- 6+ years of experience as a Python developer.
- Experience in web development using Python and the Django framework.
- Experience in data analysis and data science using Pandas, NumPy, and scikit-learn (good to have).
- Experience in developing user interfaces using HTML, JavaScript, and CSS.
- Experience with server-side templating languages, including Jinja2 and Mako.
- Knowledge of Kafka and RabbitMQ (good to have).
- Experience with Docker, Git, and AWS.
- Ability to integrate multiple data sources into a single system.
- Ability to collaborate on projects and work independently when required.
- Databases (MySQL, PostgreSQL, SQL)
Selection Process: 2-3 Interview rounds (Tech, VP, Client)
Kapture CRM is an enterprise-focused Service automation SaaS platform. We help 500+ enterprises in 14 countries to manage their customer service in a more intelligent, contextual way.
Roles & Responsibilities :
* Proven experience in Java 8, Spring Boot, Microservices/API
* Strong experience with Kafka and Kubernetes (a minimal producer sketch follows this list)
* Strong experience using RDBMS (MySQL) and NoSQL.
* Experience in working in Eclipse / Maven environments.
* Hands-on experience in Unix / Shell scripting.
* Hands-on experience in fine-tuning application response/performance testing.
* Experience in Web Services.
* Strong analysis & problem-solving skills
* Strong communication skills - both verbal and written
* Ability to work independently with limited supervision
* Proven ability to use own initiative to resolve issues
* Full ownership of projects/tasks
* Ability and willingness to work under pressure, on multiple concurrent tasks, and to deliver to agreed deadlines
* Eagerness to learn
* Strong team-working skills
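As promised above, a minimal Kafka producer sketch for this stack: it publishes one ticket event, keyed so that events for the same ticket stay ordered on a partition. The broker address, topic, and payload are illustrative only.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TicketEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // try-with-resources closes (and flushes) the producer on exit.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("ticket-events", "ticket-42",
                    "{\"ticketId\":\"ticket-42\",\"status\":\"open\"}"));
        }
    }
}
```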

CTC Budget: 35-50LPA
Location: Hyderabad/Bangalore
Experience: 8+ Years
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad, providing services that maximize product value while delivering rapid incremental innovation, with extensive SaaS company M&A experience including 20+ closed transactions on both the buy and sell sides. They have over 100 employees and are looking to grow the team.
● Work with, learn from, and contribute to a diverse, collaborative development team
● Use plenty of PHP, Go, JavaScript, MySQL, PostgreSQL, ElasticSearch, Redshift, AWS services, and other technologies
● Build efficient and reusable abstractions and systems
● Create robust cloud-based systems used by students globally at scale
● Experiment with cutting-edge technologies and contribute to the company's product roadmap
● Deliver data at scale to bring value to clients
Requirements
You will need:
● Experience working with a server-side language in a full-stack environment
● Experience with various database technologies (relational, NoSQL, document-oriented, etc.) and query concepts in high-performance environments
● Experience in one of these areas: React, Backbone
● Understanding of ETL concepts and processes
● Great knowledge of design patterns and back-end architecture best practices
● Sound knowledge of front-end basics like JavaScript, HTML, CSS
● Experience with TDD, automated testing
● 12+ years' experience as a developer
● Experience with Git or Mercurial
● Fluent written & spoken English
It would be great if you have:
● B.Sc or M.Sc degree in Software Engineering, Computer Science, or similar
● Experience and/or interest in API design
● Experience with Symfony and/or Doctrine
● Experience with Go and microservices
● Experience with message queues, e.g. SQS, Kafka, Kinesis, RabbitMQ
● Experience working with a modern big data stack
● Contributed to open source projects
● Experience working in an Agile environment
Job Summary:
We are looking for a skilled and experienced Java Developer to join our team. As a Java Developer, you will be responsible for developing and maintaining our applications using Java, Spring framework, and other related technologies. The ideal candidate should have a strong understanding of object-oriented programming principles, as well as experience with a variety of technologies such as SQL, NoSQL, and cloud computing.
Responsibilities:
- Design, develop, and maintain our applications using Java, Spring framework, and other related technologies
- Write clean, efficient, and optimized code for applications
- Collaborate with cross-functional teams to understand user requirements and deliver high-quality solutions
- Develop and maintain backend systems using Spring framework
- Work with databases, including SQL and NoSQL
- Ensure code quality and maintain documentation
- Troubleshoot and debug applications
- Stay updated with emerging trends and technologies in Java development
- Work with other teams to deploy and maintain applications
Requirements:
- 3-7 years of experience in Java development
- Strong understanding of object-oriented programming principles
- Experience with Java, Spring framework, and related technologies
- Familiarity with databases, including SQL and NoSQL
- Knowledge of cloud computing is a plus
- Excellent problem-solving and debugging skills
- Strong communication and collaboration skills
- Ability to work independently and as part of a team
- Bachelor's degree in computer science or a related field
Key Skills:
- Strong proficiency in Java programming language
- Experience with Spring framework, including Spring Boot and Spring MVC
- Familiarity with cloud platforms such as AWS, GCP, and Azure
- Experience building RESTful APIs
- Knowledge of microservices architecture
- Familiarity with SQL and relational databases such as MySQL and Postgres
- Familiarity with NoSQL databases such as MongoDB and Redis
- Experience with messaging systems such as Kafka and RabbitMQ
- Experience with containerization tools such as Docker and Kubernetes
- Understanding of software development principles and experience with SDLC methodologies
- Experience with Git version control and build tools such as Maven and Gradle
- Familiarity with front-end technologies such as Angular and React is a plus
- Strong problem-solving and analytical skills
- Good communication and interpersonal skills
- Ability to work independently and take ownership of tasks
- Experience with test-driven development and unit testing frameworks such as JUnit and Mockito (a small test sketch follows this list)
- Familiarity with CI/CD tools such as Jenkins is a plus
- Familiarity with caching technologies such as Redis is a plus
- Working knowledge of design patterns and software architecture principles is a plus.
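As referenced above, a small JUnit 5 + Mockito sketch in the test-driven style: the collaborator is mocked and the unit under test is exercised against a known rate. Every class and method name here is invented for the example.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.when;

import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

interface CurrencyClient {
    BigDecimal usdToEurRate();
}

class PriceService {
    private final CurrencyClient currencyClient;

    PriceService(CurrencyClient currencyClient) { this.currencyClient = currencyClient; }

    BigDecimal toEur(BigDecimal usd) {
        return usd.multiply(currencyClient.usdToEurRate()).setScale(2, RoundingMode.HALF_UP);
    }
}

@ExtendWith(MockitoExtension.class)
class PriceServiceTest {
    @Mock CurrencyClient currencyClient;    // the collaborator is mocked...
    @InjectMocks PriceService priceService; // ...and injected via the constructor

    @Test
    void convertsUsingTheCurrentRate() {
        when(currencyClient.usdToEurRate()).thenReturn(new BigDecimal("0.92"));
        assertEquals(new BigDecimal("9.20"), priceService.toEur(new BigDecimal("10.00")));
    }
}
```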
- 2+ years of experience in development in Java technology.
- Strong Java basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging queue (RabbitMQ or Kafka)
- Microservices
- Any caching mechanism
- Good at problem solving
Good to Have Skills:
- 4+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
Minimum 4 to 10 years of experience in testing distributed backend software architectures/systems.
• 4+ years of work experience in test planning and automation of enterprise software.
• Expertise in programming using Java or Python and other scripting languages.
• Experience with one or more public clouds is expected.
• Comfortable with build processes, CI processes, and managing QA environments, as well as working with build management tools like Git and Jenkins.
• Experience with performance and scalability testing tools.
• Good working knowledge of relational databases, logging, and monitoring frameworks is expected.
• Familiarity with system flows, i.e., how components such as Elasticsearch, Mongo, Kafka, Hive, Redis, and AWS services interact with an application.

Essential Responsibilities:
- Demonstrate an understanding of the Agile software development life cycle and distinguish the core inputs and outputs in each cycle.
- Work closely with your peers and stay engaged in a fast-paced technical design and development team
- Execute in a fast-paced delivery mode and focus on delivering tasks to meet monthly and quarterly digital product release goals
- Lead impact assessment and decisions related to technology choices, design / architectural considerations and implementation strategy
- Maintain code quality through best practices, unit testing and code quality automation
- Demonstrate the ability to make informed technology choices after due diligence and impact assessment
- Help in designing interfaces and information exchange between modules
- Articulate the need for scalability and understand the importance of improving quality through testing.
- Be an expert in writing code that meets standards and delivers the desired functionality using the technology selected for the project
- Drive design reviews, define interfaces between code modules, and apply existing technology to designs
- Be an expert in assessing application performance and optimizing/improving it through design and best coding practices
Qualifications/Requirements:
- Minimum Bachelor's Degree in Computer Science, Computer Engineering or in "STEM" Majors (Science, Technology, Engineering, and Math)
- 6+ years of experience in Full Stack Software Development within the enterprise or software services domain
Desired Skills:
- Expertise in full stack software development and awareness of 12 Factor software patterns
- Experience and knowledge of patterns and anti-patterns of microservices-based architecture design
- Experience developing and deploying applications on cloud (Azure, AWS, or GCP), on-premise, and hybrid-based architectures
- Mid-level to expert in one or more of the following UI development technologies (JavaScript): client-side HTML5, jQuery, jQuery UI, Knockout.js, Polymer, AngularJS, ReactJS, Bootstrap
- Mid-level to expert in one or more of the back-end development languages: .NET, Java, Python, or Scala
- Very solid API skills (e.g. Express.js/Node.js, GraphQL/Relay, Flask, Jersey, Java Spring REST or WebApi2)
- Skilled in the use of Java, Kafka, and Spark streaming technologies (a minimal sketch follows this list)
- Experience with containerization technologies such as Rancher, Kubernetes, Docker and Helm
- Hands-on experience in data storage environments of many types (RDBMS, NoSQL, HDFS, etc.)
- Knowledge of GitLab, Jenkins and Artifactory
- Solid foundation in data structures, algorithms, and OO Design with rock-solid programming skills
- Security: Identity Management and Access, application security and static code analysis
- Proven success working in and promoting a rapidly changing, collaborative, and iterative product development environment
- Strong interpersonal skills, analytical skills, combined with intellectual curiosity, and a desire and ability to "get things done" are essential
- Agile Scrum development experience
- Added advantage to those having experience in multi-tenant SaaS Platform and Developers' Portal development
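As flagged above, a minimal Spark Structured Streaming sketch in Java that reads a Kafka topic and counts records per one-minute window. The broker, topic, and console sink are placeholders, not from this posting.

```java
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.window;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class EventCounts {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder().appName("event-counts").getOrCreate();

        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092") // placeholder
                .option("subscribe", "events")                       // placeholder topic
                .load();

        // Group by the Kafka record timestamp into one-minute tumbling windows.
        Dataset<Row> counts = events
                .groupBy(window(col("timestamp"), "1 minute"))
                .count();

        // Emit updated window counts to the console as the stream progresses.
        StreamingQuery query = counts.writeStream()
                .outputMode("update")
                .format("console")
                .start();
        query.awaitTermination();
    }
}
```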
We are looking for a Director of Engineering to lead one of our key product engineering teams. This role will report directly to the VP of Engineering and will be responsible for successful execution of the company's business mission through development of cutting-edge software products and solutions.
- As an owner of the product you will be required to plan and execute the product road map and provide technical leadership to the engineering team.
- You will have to collaborate with Product Management and Implementation teams and build a commercially successful product.
- You will be responsible to recruit & lead a team of highly skilled software engineers and provide strong hands on engineering leadership.
- Requires deep technical knowledge in software product engineering using Amazon Web Services, Java 8, Java/J2EE, Node.js, React.js, full-stack development, NoSQL DBs (MongoDB, Cassandra, Neo4j), Elasticsearch, Kibana, ELK, Kafka, Redis, Docker, Kubernetes, architecture concepts, design patterns, data structures & algorithms, distributed computing, multi-threading, Apache Solr, ActiveMQ, RabbitMQ, Spark, Scala, Sqoop, HBase, Hive, WebSockets, web crawlers, Spring Boot, etc.
- 16+ years of experience in Software Engineering with at least 5+ years as an engineering leader in a software product company.
- Hands-on technical leadership with proven ability to recruit high performance talent
- High technical credibility - ability to audit technical decisions and push for the best solution to a problem.
- Experience building E2E applications, right from the backend database to the presentation layer.
- Experience with UI technologies (Angular, React.js, Node.js) or a full-stack environment is preferred.
- Experience with NoSQL technologies (MongoDB, Cassandra, Neo4j, DynamoDB, etc.)
- Elastic Search, Kibana, ELK, Logstash.
- Experience in developing Enterprise Software using Agile Methodology.
- Good understanding of Kafka, Redis, ActiveMQ, RabbitMQ, Solr etc.
- SaaS cloud-based platform exposure.
- Experience on Docker, Kubernetes etc.
- End-to-end ownership of design and development, with exposure to delivering quality enterprise products/applications
- A track record of setting and achieving high standards
- Strong understanding of modern technology architecture
- Key Programming Skills: Java, J2EE with cutting edge technologies
- Excellent team building, mentoring and coaching skills are a must-have
Requirements
- 2+ years of experience in development in Java technology.
- Strong Java Basics
- Linux
- SpringBoot or Spring MVC
- Hands-on experience in Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging Queue (RabbitMQ or Kafka)
- Microservices
- Java 8
- Any Caching Mechanism
- Good at problem-solving
Good to Have Skills:
- 2+ years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding AI/ML algorithms is a plus.
Requirements
- 3+ years of experience in development in Java technology.
- Strong Java Basics
- Linux
- SpringBoot or Spring MVC
- Hands-on experience in Relational Databases (SQL query or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging Queue (RabbitMQ or Kafka)
- Microservices
- Java 8
- Any Caching Mechanism
- Good at problem-solving
Good to Have Skills:
- 3 years of experience using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem-solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding AI/ML algorithms is a plus.
About Telstra
Telstra is Australia’s leading telecommunications and technology company, with operations in more than 20 countries, including in India, where we’re building a new Innovation and Capability Centre (ICC) in Bangalore.
We’re growing fast, and for you, that means many exciting opportunities to develop your career at Telstra. Join us on this exciting journey, and together, we’ll reimagine the future.
Why Telstra?
- We're an iconic Australian company with a rich heritage that's been built over 100 years. Telstra is Australia's leading Telecommunications and Technology Company. We've been operating internationally for more than 70 years.
- International presence spanning over 20 countries.
- We are one of the 20 largest telecommunications providers globally.
- At Telstra, the work is complex and stimulating, but with that comes a great sense of achievement. We are shaping tomorrow's modes of communication with our innovation-driven teams.
Telstra offers an opportunity to make a difference to the lives of millions of people by providing the choice of flexibility in work and a rewarding career that you will be proud of!
About the team
Being part of Networks & IT means you'll be part of a team that focuses on extending our network superiority to enable the continued execution of our digital strategy.
With us, you'll be working with world-leading technology and change the way we do IT to ensure business needs drive priorities, accelerating our digitisation programme.
Focus of the role
Any new engineer who joins the data chapter will mostly be developing reusable data processing and storage frameworks that can be used across the data platform.
About you
To be successful in the role, you'll bring skills and experience in:
Essential
- Hands-on experience in Spark Core, Spark SQL, SQL/Hive/Impala, Git/SVN/any other VCS, and data warehousing (a minimal Spark SQL sketch follows this list)
- Skilled in the Hadoop ecosystem (HDP/Cloudera/MapR/EMR, etc.)
- Azure Data Factory/Airflow/Control-M/Luigi
- PL/SQL
- Exposure to NoSQL (HBase/Cassandra/graph DBs such as Neo4j/MongoDB)
- File formats (Parquet/ORC/Avro/Delta/Hudi, etc.)
- Kafka/Kinesis/Event Hubs
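As noted in the first bullet, a minimal Spark SQL batch sketch in Java of the reusable read-transform-write shape such frameworks wrap: read Parquet, aggregate with SQL, write Parquet. Paths and the table/column names are placeholders.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class DailyCountsJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-counts")
                .enableHiveSupport() // also lets the job read/write Hive tables
                .getOrCreate();

        // Extract: one of the columnar formats listed above.
        Dataset<Row> raw = spark.read().parquet("/data/raw/events"); // placeholder path

        // Transform: plain Spark SQL over a temp view.
        raw.createOrReplaceTempView("events");
        Dataset<Row> daily = spark.sql(
                "SELECT event_date, COUNT(*) AS events FROM events GROUP BY event_date");

        // Load: overwrite the curated output.
        daily.write().mode(SaveMode.Overwrite).parquet("/data/curated/daily_counts"); // placeholder
    }
}
```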
Highly Desirable
Experience and knowledgeable on the following:
- Spark Streaming
- Cloud exposure (Azure/AWS/GCP)
- Azure data offerings: ADF, ADLS Gen2, Azure Databricks, Azure Synapse, Event Hubs, Cosmos DB, etc.
- Presto/Athena
- Azure DevOps
- Jenkins/ Bamboo/Any similar build tools
- Power BI
- Prior experience building, or working in a team that builds, reusable frameworks
- Data modelling
- Data architecture and design principles (Delta/Kappa/Lambda architectures)
- Exposure to CI/CD
- Code Quality - Static and Dynamic code scans
- Agile SDLC
If you've got a passion to innovate, want to succeed as part of a great team, and are looking for the next step in your career, we'd welcome you to apply!
___________________________
We’re committed to building a diverse and inclusive workforce in all its forms. We encourage applicants from diverse gender, cultural and linguistic backgrounds and applicants who may be living with a disability. We also offer flexibility in all our roles, to ensure everyone can participate.
To learn more about how we support our people, including accessibility adjustments we can provide you through the recruitment process, visit tel.st/thrive.