50+ Apache Kafka Jobs in India


Senior Data Engineer Job Description
Overview
The Senior Data Engineer will design, develop, and maintain scalable data pipelines and
infrastructure to support data-driven decision-making and advanced analytics. This role requires deep
expertise in data engineering, strong problem-solving skills, and the ability to collaborate with
cross-functional teams to deliver robust data solutions.
Key Responsibilities
Data Pipeline Development: Design, build, and optimize scalable, secure, and reliable data
pipelines to ingest, process, and transform large volumes of structured and unstructured data.
Data Architecture: Architect and maintain data storage solutions, including data lakes, data
warehouses, and databases, ensuring performance, scalability, and cost-efficiency.
Data Integration: Integrate data from diverse sources, including APIs, third-party systems,
and streaming platforms, ensuring data quality and consistency.
Performance Optimization: Monitor and optimize data systems for performance, scalability,
and cost, implementing best practices for partitioning, indexing, and caching.
Collaboration: Work closely with data scientists, analysts, and software engineers to
understand data needs and deliver solutions that enable advanced analytics, machine
learning, and reporting.
Data Governance: Implement data governance policies, ensuring compliance with data
security, privacy regulations (e.g., GDPR, CCPA), and internal standards.
Automation: Develop automated processes for data ingestion, transformation, and validation
to improve efficiency and reduce manual intervention.
Mentorship: Guide and mentor junior data engineers, fostering a culture of technical
excellence and continuous learning.
Troubleshooting: Diagnose and resolve complex data-related issues, ensuring high
availability and reliability of data systems.
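To make the pipeline-development and automation responsibilities above concrete, here is a minimal, illustrative sketch of an automated validation step in an ingestion pipeline, written in Python. The record schema, field names, and rejection rules are hypothetical, not taken from any particular system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Event:
    user_id: str
    amount: float
    ts: datetime

def validate_and_transform(raw: dict) -> Event | None:
    """Validate one raw record and normalize it, or reject it by returning None."""
    try:
        user_id = str(raw["user_id"]).strip()
        amount = float(raw["amount"])
        ts = datetime.fromisoformat(raw["ts"])
    except (KeyError, ValueError, TypeError):
        return None  # malformed record; a real pipeline would route it to a dead-letter store
    if not user_id or amount < 0:
        return None  # business-rule rejection
    if ts.tzinfo is None:
        ts = ts.replace(tzinfo=timezone.utc)  # normalize naive timestamps to UTC
    return Event(user_id=user_id, amount=amount, ts=ts)

# Only valid records flow downstream to the warehouse load step.
raw_batch = [
    {"user_id": "u1", "amount": "19.99", "ts": "2024-05-01T10:00:00"},
    {"user_id": "", "amount": "-5", "ts": "bad"},
]
clean = [e for r in raw_batch if (e := validate_and_transform(r)) is not None]
print(clean)  # -> one valid Event; the second record is rejected
```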
Required Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science,
or a related field.
Experience: 5+ years of experience in data engineering or a related role, with a proven track
record of building scalable data pipelines and infrastructure.
Technical Skills:
Proficiency in programming languages such as Python, Java, or Scala.
Expertise in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
Strong experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services
(e.g., Redshift, BigQuery, Snowflake).
Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, Talend, Informatica) and
data integration frameworks.
Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and distributed
systems.
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a
plus.
Soft Skills:
Excellent problem-solving and analytical skills.
Strong communication and collaboration abilities.
Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
Certifications (optional but preferred): Cloud certifications (e.g., AWS Certified Data Analytics,
Google Professional Data Engineer) or relevant data engineering certifications.
Preferred Qualifications
Experience with real-time data processing and streaming architectures.
Familiarity with machine learning pipelines and MLOps practices.
Knowledge of data visualization tools (e.g., Tableau, Power BI) and their integration with data
pipelines.
Experience in industries with high data complexity, such as finance, healthcare, or
e-commerce.
Work Environment
Location: Hybrid/Remote/On-site (depending on company policy).
Team: Collaborative, cross-functional team environment with data scientists, analysts, and
business stakeholders.
Hours: Full-time, with occasional on-call responsibilities for critical data systems.

Backend Engineer - Python
Location: Bangalore, India
Experience Required: 2-3 years minimum
About Us:
At PGAGI, we believe in a future where AI and human intelligence coexist in harmony, creating a world that is smarter, faster, and better. We are not just building AI; we are shaping a future where AI is a fundamental and positive force for businesses, societies, and the planet.
Job Overview
We are seeking a skilled Backend Engineer with expertise in Python to join our engineering team. The ideal candidate will have hands-on experience building and maintaining enterprise-level, scalable backend systems.
Key Requirements
Technical Skills
- Python Expertise: Advanced proficiency in Python with deep understanding of frameworks like Django, FastAPI, or Flask
- Database Management: Experience with PostgreSQL, MySQL, MongoDB, and database optimization
- API Development: Strong experience in designing and implementing RESTful APIs and GraphQL
- Cloud Platforms: Hands-on experience with AWS, GCP, or Azure services
- Containerization: Proficiency with Docker and Kubernetes
- Message Queues: Experience with Redis, RabbitMQ, or Apache Kafka
- Version Control: Advanced Git workflows and collaboration
Experience Requirements
- Minimum 2-3 years of backend development experience
- Proven track record of working on enterprise-level applications
- Experience building scalable systems handling high traffic loads
- Background in microservices architecture and distributed systems
- Experience with CI/CD pipelines and DevOps practices
Responsibilities
- Design, develop, and maintain robust backend services and APIs
- Optimize application performance and scalability
- Collaborate with frontend teams and product managers
- Implement security best practices and data protection measures
- Write comprehensive tests and maintain code quality
- Participate in code reviews and architectural discussions
- Monitor system performance and troubleshoot production issues
Preferred Qualifications
- Knowledge of caching strategies (Redis, Memcached)
- Understanding of software architecture patterns
- Experience with Agile/Scrum methodologies
- Open source contributions or personal projects
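To give a flavor of the caching work called out above, here is a minimal cache-aside sketch using FastAPI and Redis. The endpoint path, TTL, and the fetch_product_from_db stand-in are hypothetical; treat it as a pattern illustration, not project code.

```python
import json

import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_product_from_db(product_id: int) -> dict:
    # Stand-in for a real database query (e.g., via an ORM).
    return {"id": product_id, "name": f"product-{product_id}"}

@app.get("/products/{product_id}")
def get_product(product_id: int) -> dict:
    key = f"product:{product_id}"
    cached = cache.get(key)  # 1. try the cache first
    if cached is not None:
        return json.loads(cached)
    product = fetch_product_from_db(product_id)  # 2. cache miss: hit the DB
    cache.setex(key, 300, json.dumps(product))   # 3. populate the cache with a 5-minute TTL
    return product
```

Assuming the file is named app.py and a local Redis is running, this can be served with `uvicorn app:app`.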
Job description
Job Title: Cloud Migration Consultant – (AWS to Azure)
Experience: 4+ years in application assessment and migration
About the Role
We’re looking for a Cloud Migration Consultant with hands-on experience assessing and migrating complex applications to Azure. You'll work closely with Microsoft business units, participating in Intake & Assessment and Planning & Design phases, creating migration artifacts, and leading client interactions. You’ll also support application modernization efforts in Azure, with exposure to AWS as needed.
Key Responsibilities
- Assess application readiness and document architecture, dependencies, and migration strategy.
- Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, CloudockIt, PowerShell.
- Create architecture diagrams, migration playbooks, and maintain Azure DevOps boards.
- Set up applications both on-premises and in cloud environments (primarily Azure).
- Support proof-of-concepts (PoCs) and advise on migration options.
- Collaborate with application, database, and infrastructure teams to enable smooth transition to migration factory teams.
- Track progress, blockers, and risks, reporting timely status to project leadership.
Required Skills
- 4+ years of experience in cloud migration and assessment
- Strong expertise in Azure IaaS/PaaS (VMs, App Services, ADF, etc.)
- Familiarity with AWS IaaS/PaaS (EC2, RDS, Glue, S3)
- Experience with Java (Spring Boot)/C#, .NET/Python, Angular/React.js, REST APIs
- Working knowledge of Kafka, Docker/Kubernetes, Azure DevOps
- Network infrastructure understanding (VNets, NSGs, Firewalls, WAFs)
- IAM knowledge: OAuth, SAML, Okta/SiteMinder
- Experience with Big Data tools like Databricks, Hadoop, Oracle, DocumentDB
Preferred Qualifications
- Azure or AWS certifications
- Prior experience with enterprise cloud migrations (especially in Microsoft ecosystem)
- Excellent communication and stakeholder management skills
Educational qualification:
B.E/B.Tech/MCA
Experience: 4+ Years
Job Title: Team Leader
Experience: 7 to 10 years
Location: Bengaluru
About the Role:
We are looking for a talented Java Developer to join our team, focusing on
building and maintaining microservices architecture using Java, Spring Boot,
Redis, and Kafka.
You will play a crucial role in designing, developing, and deploying scalable and robust applications, ensuring high performance and reliability, and will be responsible for mission-critical telecom applications.
Responsibilities:
Design and develop high-availability, low-latency telecom applications that are
critical to core network operations and service delivery.
Drive project planning and task delegation, ensuring the timely delivery of
high-quality deliverables.
Lead and mentor a cross-functional team of engineers/developers, providing
technical guidance, performance feedback, and fostering a collaborative team
culture.
Facilitate daily stand-ups, sprint planning, and retrospectives in an
Agile/Scrum environment.
Implement and manage Redis for caching and data storage.
Develop and maintain Kafka-based event streaming pipelines for real-time
data processing.
Troubleshoot and resolve performance issues and bugs.
Optimize microservices for speed, efficiency, and resource utilization.
Contribute to the development of a robust and scalable architecture.
Ensure adherence to coding standards and best practices.
Participate in code reviews and knowledge sharing.
Optional: Experience in the telecommunications industry.
Skills Required:
Telecom Knowledge: Understanding of 3G/4G and 5G network & 3GPP
specifications required for Policy & Charging System.
Proficiency in Java: Strong understanding of Java programming principles
and best practices.
Spring Boot: Extensive experience with Spring Boot framework, including
Spring MVC, Spring Data, and Spring Cloud.
Microservices Architecture: Solid understanding of microservices
architecture, including design patterns, communication protocols, and
deployment strategies.
Redis: Experience with Redis as a caching layer and data store.
Kafka: Experience with Apache Kafka for building event-driven applications
and streaming pipelines.
Database: Experience with database technologies (MySQL, PostgreSQL).
Architecture Experience: Proven ability to design and implement scalable
and robust architectures.
Strong Problem-Solving Skills: Ability to identify, analyze, and resolve
complex technical issues.
Excellent Communication Skills: Ability to communicate technical concepts
clearly and effectively.
Teamwork: Ability to collaborate effectively with other developers and
stakeholders.
Good to have:
Experience with containerization technologies (Docker, Kubernetes).
Experience with CI/CD pipelines.
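The Kafka responsibilities above typically start with keyed publishing, so that all events for one subscriber land on the same partition and stay ordered. Below is a minimal sketch of that pattern using the Python kafka-python client for brevity (the role itself is Java/Spring based); the broker address, topic name, and event shape are hypothetical.

```python
import json

from kafka import KafkaProducer

# Keying by subscriber ID keeps all events for one subscriber on one
# partition, preserving per-subscriber ordering for downstream consumers.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for all in-sync replicas: durability for telecom-grade data
)

event = {"subscriber_id": "sub-1001", "event": "session_start"}
producer.send("charging-events", key=event["subscriber_id"], value=event)
producer.flush()  # block until the broker acknowledges the write
```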
About the Role
We are looking for a skilled Backend Engineer with strong experience in building scalable microservices, integrating with distributed data systems, and deploying web APIs that serve UI applications in the cloud. You’ll work on high-performance systems involving Kafka, DynamoDB, Redis, and other modern backend technologies.
Responsibilities
- Design, develop, and deploy backend microservices and APIs that power UI applications.
- Implement event-driven architectures using Apache Kafka or similar messaging platforms.
- Build scalable and highly available systems using NoSQL databases (e.g., DynamoDB, MongoDB).
- Optimize backend systems using caching layers like Redis to enhance performance.
- Ensure seamless deployment and operation of services in cloud environments (AWS, GCP, or Azure).
- Write clean, maintainable, and well-tested code; contribute to code reviews and architecture discussions.
- Collaborate closely with frontend, DevOps, and product teams to deliver integrated solutions.
- Monitor and troubleshoot production issues and participate in on-call rotations as needed.
Required Qualifications
- 3–7 years of professional experience in backend development.
- Strong programming skills in one or more languages: Java, Python, Go, Node.js.
- Hands-on experience with microservices architecture and API design (REST/gRPC).
- Practical experience with Kafka, RabbitMQ, or other event streaming/message queue systems.
- Solid knowledge of NoSQL databases, especially DynamoDB or equivalents.
- Experience using Redis or Memcached for caching or pub/sub mechanisms.
- Proficiency with cloud platforms (preferably AWS – e.g., Lambda, ECS, EKS, API Gateway).
- Familiarity with Docker, Kubernetes, and CI/CD pipelines.
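One pattern behind the event-driven plus DynamoDB combination listed above is making event writes idempotent, so a redelivered Kafka message is not applied twice. Here is a minimal boto3 sketch under that assumption; the table name, key schema, and event fields are hypothetical.

```python
import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("processed_events")  # hypothetical table keyed on event_id

def apply_event_once(event: dict) -> bool:
    """Write the event only if its event_id has not been seen before."""
    try:
        table.put_item(
            Item={"event_id": event["event_id"], "payload": event["payload"]},
            # Conditional write: fails if an item with this key already exists.
            ConditionExpression="attribute_not_exists(event_id)",
        )
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
            return False  # duplicate delivery: safely ignore
        raise

# A Kafka consumer loop would call apply_event_once() for each message, making
# at-least-once delivery behave like exactly-once at the table level.
```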
Hiring for Java Developer
Experience : 5 to 10 yrs
Notice Period : 0 to 15 days
Location : Pune
Work Mode : WFO (5 days)
As a Java developer, you will be expected to perform many duties throughout the development lifecycle of applications, from concept and design through to testing. Here are some of the responsibilities you may have:
Develop high-level design and define software architecture
Implement and maintain quality systems within the group
Proficiently estimate and design approaches, nimbly moving to alternate approaches if needed; develop and execute unit test strategies
Monitor and track tasks, and report status
Assist project heads to conceptualize, design, develop, test and implement technology solutions
Effectively collaborate with stakeholders and users to ensure customer satisfaction
Skill Set :
Java 7/Java 8 with microservices, multithreading, Spring Boot, JUnit, Kafka, Splunk (good to have), OpenShift (good to have), Authentication/Spring Security (good to have)
Job Title : Red Hat PAM Developer
Experience Required :
- Relevant Experience : 4+ Years
- Total Experience : Up to 8 Years
No. of Positions : 4
Work Locations : Hyderabad / Bangalore / Mumbai / Pune / Gurgaon / Chennai
Work Mode : Hybrid
Work Timings : 1:00 PM to 10:00 PM IST
Interview Mode : Virtual
Interview Rounds : 2
Mandatory Skills :
- Excellent communication skills – must be comfortable in client-facing roles
- Red Hat Process Automation Manager (PAM)
- JBPM (Java Business Process Management)
- BPMN 2.0 (Business Process Model and Notation) – low-code platform
- DMN (Decision Model and Notation) – business processes and rules
- Spring Boot
- JavaScript
Good-to-Have Skills :
- Red Hat Fuse
- Apache Kafka
- Apache Camel
Java Developer – Job Description
Wissen Technology is now hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to become part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.
Required Skills:
- Experience: 5 to 12 years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is part of Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4,000 highly skilled professionals.
Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'. Our team consists of 1,200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and have rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted a Top 20 AI/ML vendor by CIO Insider.

About the Role:
We are seeking a talented Lead Data Engineer to join our team and play a pivotal role in transforming raw data into valuable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and infrastructure to support our organization's analytics and decision-making processes.
Responsibilities:
- Data Pipeline Development: Build and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
- Data Infrastructure: Design, implement, and manage data infrastructure components, including data warehouses, data lakes, and data marts.
- Data Quality: Ensure data quality by implementing data validation, cleansing, and standardization processes.
- Team Management: Ability to lead and manage a team.
- Performance Optimization: Optimize data pipelines and infrastructure for performance and efficiency.
- Collaboration: Collaborate with data analysts, scientists, and business stakeholders to understand their data needs and translate them into technical requirements.
- Tool and Technology Selection: Evaluate and select appropriate data engineering tools and technologies (e.g., SQL, Python, Spark, Hadoop, cloud platforms).
- Documentation: Create and maintain clear and comprehensive documentation for data pipelines, infrastructure, and processes.
Skills:
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, Databricks).
- Knowledge of cloud platforms (e.g., AWS, GCP, Azure) and cloud-based data services.
- Understanding of data modeling and data architecture concepts.
- Experience with ETL/ELT tools and frameworks.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
Preferred Qualifications:
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Flink).
- Knowledge of machine learning and artificial intelligence concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Certification in cloud platforms or data engineering.
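To ground the ETL responsibilities above, here is a compact PySpark sketch of the extract-transform-load shape described; the file paths, column names, and deduplication rule are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files landed by an upstream process.
raw = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

# Transform: standardize types, drop invalid rows, deduplicate on order_id.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))
       .dropDuplicates(["order_id"])
)

# Load: write partitioned Parquet into the curated warehouse/lakehouse zone.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/orders/"
)
```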

Job Title : Python Django Developer
Experience : 3+ Years
Location : Gurgaon
Working Days : 6 Days (Monday to Saturday)
Job Summary :
We are looking for a skilled Python Django Developer with strong foundational knowledge in backend development, data structures, and operating system concepts.
The ideal candidate should have experience in Django and PostgreSQL, along with excellent logical thinking and multithreading knowledge.
Technical Skills : Python, Django (or Flask), PostgreSQL/MySQL, SQL & NoSQL ORM, REST API development, JSON/XML, strong knowledge of data structures, multithreading, and OS concepts.
Key Responsibilities :
- Write efficient, reusable, testable, and scalable code using the Django framework.
- Develop backend components, server-side logic, and statistical models.
- Design and implement high-availability, low-latency applications with robust data protection and security.
- Contribute to the development of highly responsive web applications.
- Collaborate with cross-functional teams on system design and integration.
Mandatory Skills :
- Strong programming skills in Python and Django (or similar frameworks like Flask).
- Proficiency with PostgreSQL / MySQL and experience in writing complex queries.
- Strong understanding of SQL and NoSQL ORM.
- Solid grasp of data structures, multithreading, and operating system concepts.
- Experience with RESTful API development and implementation of API security.
- Knowledge of JSON/XML and their use in data exchange.
Good-to-Have Skills :
- Experience with Redis, MQTT, and message queues like RabbitMQ or Kafka
- Understanding of microservice architecture and third-party API integrations (e.g., payment gateways, SMS/email APIs)
- Familiarity with MongoDB and other NoSQL databases
- Exposure to data science libraries such as Pandas, NumPy, Scikit-learn
- Knowledge in building and integrating statistical learning models.
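As a small illustration of the REST API requirement above, here is a hedged Django sketch of a read endpoint. It assumes an existing Django project; the Product model, app layout, and URL are hypothetical, and the three files are shown together for brevity.

```python
# models.py (hypothetical app: catalog)
from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=200)
    price = models.DecimalField(max_digits=10, decimal_places=2)

# views.py
from django.http import JsonResponse

def product_detail(request, pk: int):
    try:
        p = Product.objects.get(pk=pk)  # single indexed primary-key lookup
    except Product.DoesNotExist:
        return JsonResponse({"error": "not found"}, status=404)
    return JsonResponse({"id": p.id, "name": p.name, "price": str(p.price)})

# urls.py
from django.urls import path

urlpatterns = [path("products/<int:pk>/", product_detail)]
```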
Job Role: We are seeking a skilled Java Developer to contribute to the development and enhancement of a renowned banking application that supports automatic reconciliation and unified data reporting for its clients. This role involves working on high-impact enhancements, data pipeline integration, and platform modernization. The ideal candidate will be a quick learner, self-motivated, and able to ramp up quickly in a fast-paced environment.
Key Responsibilities:
Design, develop, and maintain Java-based applications using Java 17 and Spring Boot.
Implement and manage message routing using Apache Camel.
Develop and monitor data pipelines using Kafka.
Support and enhance existing cloud-native applications.
Work with OpenShift Container Platform (OCP 4) for container orchestration and deployments.
Utilize Jenkins for CI/CD pipeline automation and management.
Collaborate with cross-functional teams to integrate multiple data sources into a unified reporting platform.
Participate in code reviews, unit testing, and performance tuning.
Troubleshoot and resolve production issues in collaboration with operations teams.
Document development processes and system configurations.
Required Skills:
Strong proficiency in Java 17 and Spring Boot frameworks.
Hands-on experience with Apache Camel for message routing and transformation.
Solid experience in Kafka development and monitoring tools.
Good understanding of cloud pipeline architectures and deployment strategies.
Experience working with OpenShift (OCP 4).
Familiarity with Jenkins for CI/CD and automated deployments.
Understanding of cloud deployment platforms (AWS, Azure, or GCP preferred).
Strong analytical and debugging skills.
Ability to learn quickly and adapt to evolving project requirements.
Nice to Have:
Experience in financial services or transaction reporting platforms.
Familiarity with microservices architecture and containerization best practices.
Knowledge of monitoring tools (e.g., Prometheus, Grafana).

Company Overview
We are a dynamic startup dedicated to empowering small businesses through innovative technology solutions. Our mission is to level the playing field for small businesses by providing them with powerful tools to compete effectively in the digital marketplace. Join us as we revolutionize the way small businesses operate online, bringing innovation and growth to local communities.
Job Description
We are seeking a skilled and experienced Data Engineer to join our team. In this role, you will develop systems on cloud platforms capable of processing millions of interactions daily, leveraging the latest cloud computing and machine learning technologies while creating custom in-house data solutions. The ideal candidate should have hands-on experience with SQL, PL/SQL, and any standard ETL tools. You must be able to thrive in a fast-paced environment and possess a strong passion for coding and problem-solving.
Required Skills and Experience
- Minimum 5 years of experience in software development.
- 3+ years of experience in data management and SQL expertise – PL/SQL, Teradata, and Snowflake experience strongly preferred.
- Expertise in big data technologies such as Hadoop, HiveQL, and Spark (Scala/Python).
- Expertise in cloud technologies – AWS (S3, Glue, Terraform, Lambda, Aurora, Redshift, EMR).
- Experience with queuing systems (e.g., SQS, Kafka) and caching systems (e.g., Ehcache, Memcached).
- Experience with container management tools (e.g., Docker Swarm, Kubernetes).
- Familiarity with data stores, including at least one of the following: Postgres, MongoDB, Cassandra, or Redis.
- Ability to create advanced visualizations and dashboards to communicate complex findings (e.g., Looker Studio, Power BI, Tableau).
- Strong skills in manipulating and transforming complex datasets for in-depth analysis.
- Technical proficiency in writing code in Python and advanced SQL queries.
- Knowledge of AI/ML infrastructure, best practices, and tools is a plus.
- Experience in analyzing and resolving code issues.
- Hands-on experience with software architecture concepts such as Separation of Concerns (SoC) and micro frontends with theme packages.
- Proficiency with the Git version control system.
- Experience with Agile development methodologies.
- Strong problem-solving skills and the ability to learn quickly.
- Exposure to Docker and Kubernetes.
- Familiarity with AWS or other cloud platforms.
Responsibilities
- Develop and maintain our in-house search and reporting platform
- Create data solutions to complement core products to improve performance and data quality
- Collaborate with the development team to design, develop, and maintain our suite of products.
- Write clean, efficient, and maintainable code, adhering to coding standards and best practices.
- Participate in code reviews and testing to ensure high-quality code.
- Troubleshoot and debug application issues as needed.
- Stay up-to-date with emerging trends and technologies in the development community.
How to apply?
- If you are passionate about designing user-centric products and want to be part of a forward-thinking company, we would love to hear from you. Please send your resume, a brief cover letter outlining your experience and your current CTC (Cost to Company) as a part of the application.
Join us in shaping the future of e-commerce!
Job Description: We are looking for a talented and motivated Software Engineer with
expertise in both Windows and Linux operating systems and solid experience in Java
technologies. The ideal candidate should be proficient in data structures and algorithms, as
well as frameworks like Spring MVC, Spring Boot, and Hibernate. Hands-on experience
working with MySQL databases is also essential for this role.
Responsibilities:
● Design, develop, test, and maintain software applications using Java technologies.
● Implement robust solutions using Spring MVC, Spring Boot, and Hibernate frameworks.
● Develop and optimize database operations with MySQL.
● Analyze and solve complex problems by applying knowledge of data structures and
algorithms.
● Work with both Windows and Linux environments to develop and deploy solutions.
● Collaborate with cross-functional teams to deliver high-quality products on time.
● Ensure application security, performance, and scalability.
● Maintain thorough documentation of technical solutions and processes.
● Debug, troubleshoot, and upgrade legacy systems when required.
Requirements:
● Operating Systems: Expertise in Windows and Linux environments.
● Programming Languages & Technologies: Strong knowledge of Java (Core Java, Java 8+).
● Frameworks: Proficiency in Spring MVC, Spring Boot, and Hibernate.
● Algorithms and Data Structures: Good understanding and practical application of DSA
concepts.
● Databases: Experience with MySQL – writing queries, stored procedures, and performance
tuning.
● Version Control Systems: Experience with tools like Git.
● Deployment: Knowledge of CI/CD pipelines and tools such as Jenkins, Docker (optional)
Required Skillset
• Experience in Core Java 1.8 and above, Data Structures, OOPS, Multithreading, Algorithms, Collections, System Design, Unix/Linux.
• Possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-volume server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills in Java.
• Strong interpersonal, communication and analytical skills.
• Should be able to express their design ideas and thoughts.
Job Brief:
• Understand product requirements and come up with solution approaches.
• Build and enhance large-scale, domain-centric applications.
• Deploy high-quality deliverables into production, adhering to security, compliance and SDLC guidelines.
Java Developer – Job Description
Wissen Technology is now hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to become part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.
Required Skills:
- Experience: 4 to 7 years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
See the company overview in the earlier Wissen Technology listing above; it applies to this role as well.
Key Responsibilities:
- Design, develop, and maintain robust and scalable backend applications using Core Java and Spring Boot.
- Build and manage microservices-based architectures and ensure smooth inter-service communication.
- Integrate and manage real-time data streaming using Apache Kafka.
- Write clean, maintainable, and efficient code following best practices.
- Collaborate with cross-functional teams including QA, DevOps, and product management.
- Participate in code reviews and provide constructive feedback.
- Troubleshoot, debug, and optimize applications for performance and scalability.
Required Skills:
- Strong knowledge of Core Java (Java 8 or above).
- Hands-on experience with Spring Boot and the Spring ecosystem (Spring MVC, Spring Data, Spring Security).
- Experience in designing and developing RESTful APIs.
- Solid understanding of Microservices architecture and related patterns.
- Practical experience with Apache Kafka for real-time data processing.
- Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, or MongoDB.
- Good understanding of CI/CD tools and practices.
- Knowledge of containerization tools like Docker is a plus.
- Strong problem-solving skills and attention to detail.
We are looking for passionate people who love solving interesting and complex technology challenges, and who are enthusiastic about building an industry-first, innovative product to solve new-age, real-world problems. This role requires strategic leadership, the ability to manage complex technical challenges, and the capacity to drive innovation while ensuring operational excellence. As a Backend SDE-2, you will collaborate with key stakeholders across business, product management, and operations to ensure alignment with the organization's goals, and play a critical role in shaping the technology roadmap and engineering culture.
Key Responsibilities
- Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and prepare technology roadmaps.
- Technical Excellence: Guide the team in designing and implementing scalable, extensible and secure software systems. Drive the adoption of best practices in technical architecture, coding standards, and software testing to ensure product delivery with highest speed AND quality.
- Project and Program Management: Set aggressive yet realistic timelines with all stakeholders, and ensure the successful delivery of engineering projects within those timelines and to the highest quality standards while meeting budget constraints. Use agile methodologies to manage the development process and resolve bottlenecks.
- Cross-functional collaboration: Collaborate with Product Management, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
- Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 1-3 years of experience in software engineering
- Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
- Proven track record of successfully delivering large-scale, high-impact software projects.
- Strong understanding of software design principles and patterns.
- Expertise in multiple programming languages and modern development frameworks.
- Experience with cloud infrastructure (AWS), microservices, and distributed systems.
- Experience with relational and non-relational databases.
- Experience with Redis and Elasticsearch.
- Experience in DevOps, CI/CD pipelines, and infrastructure automation.
- Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.
Skills: MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, Elasticsearch
Job Title: Tech Lead and SSE – Kafka, Python, and Azure Databricks (Healthcare Data Project)
Experience: 4 to 12 years
Role Overview:
We are looking for a highly skilled Tech Lead with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams.
Key Responsibilities:
- Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks.
- Architect scalable data streaming and processing solutions to support healthcare data workflows.
- Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data.
- Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.).
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions.
- Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows.
- Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering.
- Stay updated with the latest cloud technologies, big data frameworks, and industry trends.
Required Skills & Qualifications:
- 4+ years of experience in data engineering, with strong proficiency in Kafka and Python.
- Expertise in Kafka Streams, Kafka Connect, and Schema Registry for real-time data processing.
- Experience with Azure Databricks (or willingness to learn and adopt it quickly).
- Hands-on experience with cloud platforms (Azure preferred, AWS or GCP is a plus).
- Proficiency in SQL, NoSQL databases, and data modeling for big data processing.
- Knowledge of containerization (Docker, Kubernetes) and CI/CD pipelines for data applications.
- Experience working with healthcare data (EHR, claims, HL7, FHIR, etc.) is a plus.
- Strong analytical skills, problem-solving mindset, and ability to lead complex data projects.
- Excellent communication and stakeholder management skills.
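The core of the real-time pipeline work described above is usually a consume-transform-produce loop. A minimal sketch with the confluent-kafka Python client follows; the broker address, topic names, and the de-identification step are hypothetical, and a production healthcare pipeline would add Schema Registry validation, error routing, and stronger delivery guarantees.

```python
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "claims-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["claims-raw"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None or msg.error():
            continue  # a real pipeline would log and route errors, not skip them
        claim = json.loads(msg.value())
        claim.pop("patient_name", None)  # drop direct identifiers (illustrative only)
        producer.produce("claims-clean", json.dumps(claim).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks without blocking
finally:
    consumer.close()
    producer.flush()
```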
We're seeking passionate, next-gen-minded engineers who are excited about solving complex technical challenges and building innovative, first-of-its-kind products that make a tangible difference for our customers. As a Backend SDE-1, you will play a key role in driving strategic initiatives, collaborating with cross-functional teams across business, product, and operations to solve exciting problems. This role demands strong technical acumen, leadership capabilities, and a mindset focused on innovation and operational excellence.
We value individuals who think independently, challenge the status quo, and bring creativity and curiosity to the table—not those who simply follow instructions. If you're passionate about solving problems and making an impact, we'd love to hear from you.
Key Responsibilities
- Strategic Planning: Work closely with senior leadership to develop and implement engineering strategies that support business objectives. Understand broader organization goals and constantly prioritise your own work.
- Technical Excellence: Understand the on-the-ground problems, explore and design possible solutions, and implement scalable, extensible and secure software systems. Implement and learn best practices in technical architecture, coding standards, and software testing to ensure product delivery with highest speed AND quality.
- Project and Program Management: Set aggressive yet realistic timelines with all stakeholders, and ensure the successful delivery of engineering projects within those timelines and to the highest quality standards while meeting budget constraints. Use agile methodologies to manage the development process and resolve bottlenecks.
- Cross-functional collaboration: Collaborate with Product Managers, Design, Business, and Operations teams to define project requirements and deliverables. Ensure the smooth integration of engineering efforts across the organization.
- Risk Management: Anticipate and mitigate technical risks and roadblocks. Proactively identify areas of technical debt and drive initiatives to reduce it.
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 1+ years of experience in software engineering
- Excellent problem-solving skills, with the ability to diagnose and resolve complex technical challenges.
- Strong understanding of software design principles and patterns.
- Hands-on experience with multiple programming languages and modern development frameworks.
- Understanding of relational and non-relational databases.
- Experience with Redis and Elasticsearch.
- Strong communication and interpersonal skills, with the ability to influence and inspire teams and stakeholders at all levels.
Skills: MySQL, Python, Django, AWS, NoSQL, Kafka, Redis, Elasticsearch


Title: Senior Software Engineer – Python (Remote: Africa, India, Portugal)
Experience: 9 to 12 Years
INR : 40 LPA - 50 LPA
Location Requirement: Candidates must be based in Africa, India, or Portugal. Applicants outside these regions will not be considered.
Must-Have Qualifications:
- 8+ years in software development with expertise in Python
- Kubernetes expertise is important
- Strong understanding of async frameworks (e.g., asyncio)
- Experience with FastAPI, Flask, or Django for microservices
- Proficiency with Docker and Kubernetes/AWS ECS
- Familiarity with AWS, Azure, or GCP and IaC tools (CDK, Terraform)
- Knowledge of SQL and NoSQL databases (PostgreSQL, Cassandra, DynamoDB)
- Exposure to GenAI tools and LLM APIs (e.g., LangChain)
- CI/CD and DevOps best practices
- Strong communication and mentorship skills
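Since the list above calls out async frameworks, here is a small self-contained asyncio sketch of the fan-out pattern such services rely on: issuing several slow I/O calls concurrently instead of sequentially. The service names and delays are made up.

```python
import asyncio

async def fetch(service: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stands in for an awaited HTTP or database call
    return f"{service}: ok"

async def main() -> None:
    # gather() runs the three calls concurrently: total wall time is the
    # slowest single call (~0.3s), not the sum (~0.6s).
    results = await asyncio.gather(
        fetch("users", 0.1), fetch("orders", 0.2), fetch("billing", 0.3)
    )
    print(results)

asyncio.run(main())
```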
Role & Responsibilities
Responsible for ensuring that the architecture and design of the platform remains top-notch with respect to scalability, availability, reliability and maintainability
Act as a key technical contributor as well as a hands-on contributing member of the team.
Own end-to-end availability and performance of features, driving rapid product innovation while ensuring a reliable service.
Work closely with various stakeholders, such as Program Managers, Product Managers, the Reliability and Continuity Engineering (RCE) team, and the QE team, to estimate and execute features and tasks independently.
Maintain and drive execution of the tech backlog for the platform's non-functional requirements, keeping the platform resilient.
Assist in release planning and prioritization based on technical feasibility and engineering constraints.
Continually find new ways to improve architecture and design, and ensure timely delivery and high quality.
WHO WE ARE
We are a team of digital practitioners with roots stretching back to the earliest days of online commerce, who dedicate themselves to serving our client companies.
We’ve seen the advancements first-hand over the last 25 years and believe our experiences allow us to innovate. Utilizing cutting-edge technology and providing bespoke, innovative services, we believe we can help you stay ahead of the curve.
We take a holistic view of digital strategy. Our approach to transformation is based on conscious Intent to delight customers through continuous Insight and creative Innovation with an enduring culture of problem-solving.
We bring every element together to create innovative, high-performing commerce experiences for enterprise B2C, B2B, D2C and Marketplace brands across industries. From mapping out business and functional requirements, to developing the infrastructure to optimize traditionally fragmented processes, we help you create integrated, future-proofed commerce solutions.
WHAT YOU’LL BE DOING
As part of our team, you'll play a key role in building and evolving our Integration Platform as a Service (iPaaS) solution. This platform empowers our clients to seamlessly connect systems, automate workflows, and scale integrations with modern cloud-native tools.
Here’s what your day-to-day will look like:
- Designing and Building Integrations: Collaborate with clients to understand integration needs and build scalable, event-driven solutions using Apache Kafka, AWS Lambda, API Gateway, and EventBridge.
- Cloud-Native Development: Develop and deploy microservices and serverless functions using TypeScript (Node.js), hosted on Kubernetes (EKS) and fully integrated with core AWS services like S3, SQS, and SNS.
- Managing Data Pipelines: Build robust data flows and streaming pipelines using Kafka and NoSQL databases like MongoDB, ensuring high availability and fault tolerance.
- Client Collaboration: Work directly with customers to gather requirements, design integration patterns, and provide guidance on best practices for cloud-native architectures.
- Driving Platform Evolution: Contribute to the ongoing improvement of our iPaaS platform by enhancing observability, scaling capabilities, and CI/CD processes using modern DevOps practices.
WHAT WE NEED IN YOU
- Solid Experience in Apache Kafka for data streaming and event-driven systems
- Production experience with Kubernetes (EKS) and containerized deployments
- Deep knowledge of AWS, including S3, EC2, SQS, SNS, EventBridge, Lambda
- Proficient in TypeScript (Node.js environment)
- Experience with MongoDB or other NoSQL databases
- Familiarity with microservices architecture, async messaging, and DevOps practices
- AWS Certification (e.g., Solutions Architect or Developer Associate) is a plus
Qualification
- Graduate: B.E./B.Tech or equivalent.
- 5 to 8 years of experience.
- Self-motivated quick learner with excellent problem-solving skills.
- A good team player with strong communication skills.
- Energy and real passion to work in a startup environment.
Visit our website - https://www.trikatechnologies.com
We are in search of a proficient Java Principal Engineer with a minimum of 10 years' experience in designing and developing Java applications. The ideal candidate will demonstrate a deep understanding of Java technologies, including Java EE, Spring Framework, and Hibernate. Proficiency in database technologies such as MySQL, Oracle, or PostgreSQL is essential, along with a proven track record of delivering high-quality, scalable, and efficient Java solutions.
We are looking for you!
You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices. You must have experience in creating visually compelling designs that effectively communicate our message and engage our target audience. Ideal candidates are creative, proactive go-getters, motivated to look for ways to add value to job accomplishments.
As an ideal candidate for the Java Lead position, you bring a wealth of experience and expertise in Java development, combined with strong leadership qualities. Your proven track record showcases your ability to lead and mentor teams to deliver high-quality, enterprise-grade applications.
Your technical proficiency and commitment to excellence make you a valuable asset in driving innovation and success within our development projects. You possess a team-oriented mindset and a "get-it-done" attitude, inspiring your team members to excel and collaborate effectively.
You have a proven ability to lead mid to large size teams, emphasizing a quality-first approach and ensuring that projects are delivered on time and within scope. As a Java Lead, you are responsible for overseeing project planning, implementing best practices, and driving technical solutions that align with business objectives.
You collaborate closely with development managers, architects, and cross-functional teams to design scalable and robust Java applications.
Your proactive nature and methodical approach enable you to identify areas for improvement, mentor team members, and foster a culture of continuous learning and growth.
Your leadership style, technical acumen, and dedication to delivering excellence make you an ideal candidate to lead our Java development initiatives and contribute significantly to the success and innovation of our organization.
What You Will Do:
- Design and development of RESTful Web Services.
- Hands on database experience (Oracle / PostgreSQL / MySQL /SQL Server).
- Hands on experience with developing web applications leveraging Spring Framework.
- Hands on experience with developing microservices leveraging Spring Boot.
- Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.
- Continuous Integration tools (Jenkins & Git Lab), CICD Tools.
- Strong believer and follower of agile methodologies with an emphasis on Quality & Standards based development.
- Architect, design, and implement complex software systems using [specify relevant technologies, e.g., Java, Python, Node.js].
What we need?
- BTech computer science or equivalent
- Minimum 10+ years of relevant experience in Java/J2EE technologies
- Experience in building back-end APIs using the Spring Boot framework, Spring DI, and Spring AOP
- Real-time messaging integration using Kafka or a similar framework
- Experience in at least one database: Oracle, SQL server or PostgreSQL
- Previous experience managing and leading high-performing software engineering teams.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
As a Lead Java Developer, you will take charge of driving the development and delivery of high-quality Java-based applications. Your role will involve leading a team of developers, providing technical guidance, and overseeing the entire software development life cycle. With your deep understanding of Java programming and related frameworks, you will design and implement scalable and efficient solutions that meet the project requirements. Your strong problem-solving skills and attention to detail will ensure the code quality and performance of the applications. Additionally, you will stay updated with the latest industry trends and best practices to improve the development processes continuously and contribute to the success of the team.
What You Will Do:
- Design and development of RESTful Web Services.
- Hands on database experience (Oracle / PostgreSQL / MySQL /SQL Server).
- Hands on experience with developing web applications leveraging Spring Framework.
- Hands on experience with developing microservices leveraging Spring Boot.
- Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.
- Continuous Integration tools (Jenkins & Git Lab), CICD Tools.
- Strong believer and follower of agile methodologies with an emphasis on Quality & Standards based development.
- Architect, design, and implement complex software systems using [specify relevant technologies, e.g., Java, Python, Node.js].
What we need?
- BTech computer science or equivalent
- Minimum 8+ years of relevant experience in Java/J2EE technologies
- Experience in building back-end APIs using the Spring Boot framework, Spring DI, and Spring AOP
- Real-time messaging integration using Kafka or a similar framework
- Experience in at least one database: Oracle, SQL server or PostgreSQL
- Previous experience managing and leading high-performing software engineering teams.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Must be:
- Based in Mumbai
- Comfortable with Work from Office
- Available to join immediately
Responsibilities:
- Manage, monitor, and scale production systems across cloud (AWS/GCP) and on-prem.
- Work with Kubernetes, Docker, Lambdas to build reliable, scalable infrastructure.
- Build tools and automation using Python, Go, or relevant scripting languages.
- Ensure system observability using tools like NewRelic, Prometheus, Grafana, CloudWatch, PagerDuty.
- Optimize for performance and low latency in real-time systems using Kafka, gRPC, and RTP.
- Use Terraform, CloudFormation, Ansible, Chef, Puppet for infra automation and orchestration.
- Load testing using Gatling, JMeter, and ensuring fault tolerance and high availability.
- Collaborate with dev teams and participate in on-call rotations.
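As a taste of the observability work listed above, here is a minimal sketch that exposes health-check status and latency as Prometheus metrics via the prometheus_client library; the probed URL, port, and interval are hypothetical.

```python
import time

import requests
from prometheus_client import Gauge, start_http_server

UP = Gauge("service_up", "1 if the last probe succeeded, else 0")
LATENCY = Gauge("probe_latency_seconds", "Round-trip time of the last probe")

def probe(url: str) -> None:
    start = time.monotonic()
    try:
        ok = requests.get(url, timeout=2).status_code == 200
    except requests.RequestException:
        ok = False
    LATENCY.set(time.monotonic() - start)
    UP.set(1 if ok else 0)

if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes http://host:9100/metrics
    while True:
        probe("http://localhost:8080/healthz")  # hypothetical target service
        time.sleep(15)
```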
Requirements:
- B.E./B.Tech in CS, Engineering or equivalent experience.
- 3+ years in production infra and cloud-based systems.
- Strong background in Linux (RHEL/CentOS) and shell scripting.
- Experience managing hybrid infrastructure (cloud + on-prem).
- Strong testing practices and code quality focus.
- Experience leading teams is a plus.
📍 Position : Java Architect
📅 Experience : 10 to 15 Years
🧑💼 Open Positions : 3+
📍 Work Location : Bangalore, Pune, Chennai
💼 Work Mode : Hybrid
📅 Notice Period : Immediate joiners preferred; up to 1 month maximum
🔧 Core Responsibilities :
- Lead architecture design and development for scalable enterprise-level applications.
- Own and manage all aspects of technical development and delivery.
- Define and enforce best coding practices, architectural guidelines, and development standards.
- Plan and estimate the end-to-end technical scope of projects.
- Conduct code reviews, ensure CI/CD, and implement TDD/BDD methodologies.
- Mentor and lead individual contributors and small development teams.
- Collaborate with cross-functional teams, including DevOps, Product, and QA.
- Engage in high-level and low-level design (HLD/LLD), solutioning, and cloud-native transformations.
🛠️ Required Technical Skills :
- Strong hands-on expertise in Java, Spring Boot, Microservices architecture
- Experience with Kafka or similar messaging/event streaming platforms
- Proficiency in cloud platforms – AWS and Azure (must-have)
- Exposure to frontend technologies (nice-to-have)
- Solid understanding of HLD, system architecture, and design patterns
- Good grasp of DevOps concepts, Docker, Kubernetes, and Infrastructure as Code (IaC)
- Agile/Lean development, Pair Programming, and Continuous Integration practices
- Polyglot mindset is a plus (Scala, Golang, Python, etc.)
🚀 Ideal Candidate Profile :
- Currently working in a product-based environment
- Already functioning as an Architect or Principal Engineer
- Proven track record as an Individual Contributor (IC)
- Strong engineering fundamentals with a passion for scalable software systems
- No compromise on code quality, craftsmanship, and best practices
🧪 Interview Process :
- Round 1: Technical pairing round
- Rounds 2 & 3: Technical rounds with panel (code pairing + architecture)
- Final Round: HR and offer discussion
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real time.
Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability (a minimal sketch follows this list).
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.
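As a flavour of the streaming work described above, here is a minimal PySpark Structured Streaming sketch that reads CDC-style events from Kafka and lands them as Parquet. It assumes the spark-sql-kafka connector is on the classpath; the broker, topic, schema and S3 paths are all illustrative.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = StructType([StructField("order_id", StringType()),
                     StructField("status", StringType()),
                     StructField("updated_at", TimestampType())])

# Read raw events from Kafka and unpack the JSON payload.
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
          .option("subscribe", "orders.cdc")                  # placeholder
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Land micro-batches in the lake; checkpointing makes restarts safe.
query = (events.writeStream.format("parquet")
         .option("path", "s3://lake/orders/")                 # placeholder
         .option("checkpointLocation", "s3://lake/_chk/orders/")
         .start())
query.awaitTermination()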


Role & Responsibilities
Lead the design, development, and deployment of complex, scalable, reliable, and highly available features for world-class SaaS products and services.
Guide the engineering team in adopting best practices for software development, code quality, and architecture.
Make strategic architectural and technical decisions, ensuring the scalability, security, and performance of software applications.
Proactively identify, prioritize, and address technical debt to improve system performance, maintainability, and long-term scalability, ensuring a solid foundation for future development.
Collaborate with cross-functional teams (product managers, designers, and stakeholders) to define project scope, requirements, and timelines.
Mentor and coach team members, providing technical guidance and fostering professional development.
Oversee code reviews, ensuring adherence to best practices and maintaining high code quality standards.
Drive continuous improvement in development processes, tools, and technologies to increase team productivity and product quality.
Stay updated with the latest industry trends and emerging technologies to drive innovation and keep the team at the cutting edge.
Ensure project timelines and goals are met, managing risks and resolving any technical challenges that arise during development.
Foster a collaborative and inclusive team culture, promoting open communication and problem-solving.
Maintain a strong customer-delight attitude while designing and building products.
What we Require
We are recruiting technical experts with the following core skills and hands-on experience:
Mandatory skills: Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot
Hands-on experience with: Kafka, Redis, SQL, Docker, Kubernetes
Expert proficiency in designing both producer and consumer REST services.
Expert proficiency in Unit testing and Code Quality tools.
Expert proficiency in ensuring code coverage.
Expert proficiency in understanding high-level designs and translating them into low-level designs.
Hands-on experience working with NoSQL databases.
Experience working in an Agile development process - Scrum.
Experience working closely with engineers and strong software engineering cultures.
Ability to think at a high level about product strategy and customer journeys.
Ability to produce low-level designs that anticipate future extensions to user journeys, and to translate them into components that can be easily extended and reused.
Excellent communication skills to clearly articulate design decisions.
JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.
Mon-Fri, in-office role with excellent perks and benefits!
Key Responsibilities:
1. Design, develop, and maintain backend services and APIs using Node.js, Python, or Java.
2. Build and implement scalable and robust microservices and integrate API gateways.
3. Develop and optimize NoSQL database structures and queries (e.g., MongoDB, DynamoDB).
4. Implement real-time data pipelines using Kafka (see the sketch after this list).
5. Collaborate with front-end developers to ensure seamless integration of backend services.
6. Write clean, reusable, and efficient code following best practices, including design patterns.
7. Troubleshoot, debug, and enhance existing systems for improved performance.
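For the Kafka pipeline item above, a minimal sketch using the Python option named in this posting (FastAPI plus the confluent-kafka client); the endpoint and topic names are hypothetical.

import json

from confluent_kafka import Producer
from fastapi import FastAPI

app = FastAPI()
producer = Producer({"bootstrap.servers": "broker:9092"})   # placeholder

@app.post("/orders")
def create_order(order: dict):
    # Publish the event; downstream consumers pick it up in real time.
    producer.produce("orders", value=json.dumps(order).encode("utf-8"))
    producer.flush()   # simple but synchronous; fine for a sketch
    return {"status": "queued"}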
Mandatory Skills:
1. Proficiency in at least one backend technology: Node.js, Python, or Java.
2. Strong experience in:
i. Microservices architecture,
ii. API gateways,
iii. NoSQL databases (e.g., MongoDB, DynamoDB),
iv. Kafka
v. Data structures (e.g., arrays, linked lists, trees).
3. Frameworks:
i. If Java: Spring framework for backend development.
ii. If Python: FastAPI/Django frameworks for AI applications.
iii. If Node: Express.js for Node.js development.
Good to Have Skills:
1. Experience with Kubernetes for container orchestration.
2. Familiarity with in-memory databases like Redis or Memcached.
3. Frontend skills: Basic knowledge of HTML, CSS, JavaScript, or frameworks like React.js.
Architect
Experience - 12+ yrs
About Wekan Enterprise Solutions
Wekan Enterprise Solutions is a leading technology consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT and Cloud environments, we have an extensive track record of helping Fortune 500 companies modernize their most critical legacy and on-premises applications, migrating them to the cloud and leveraging the most cutting-edge technologies.
Job Description
We are looking for passionate architects eager to be a part of our growth journey. The right candidate needs to be interested in working in fast-paced, challenging environments, leading technical teams, designing system architecture and reviewing peer code, and should be keen on constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes?
You will be working on complex data migrations, modernizing legacy applications and building new applications on the cloud for large enterprises and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams and MongoDB Solutions Architects.
Location - Chennai or Bangalore
● Relevant experience of 12+ years building high-performance applications with at least 3+ years as an architect.
● Good problem solving skills
● Strong mentoring capabilities
● Good understanding of software development life cycle
● Strong experience in system design and architecture
● Strong focus on quality of work delivered
● Excellent verbal and written communication skills
Required Technical Skills
● Extensive hands-on experience building high-performance applications using Node.js (JavaScript/TypeScript) and .NET / Golang / Java / Python.
● Strong experience with appropriate framework(s).
● Well-versed in monolithic and microservices architecture.
● Hands-on experience with data modeling on MongoDB and other relational or NoSQL databases (a small sketch follows this section).
● Experience working with third-party integrations, ranging from authentication to cloud services.
● Hands-on experience with Kafka or RabbitMQ.
● Hands-on experience with CI/CD pipelines and at least one cloud provider - AWS / GCP / Azure.
● Strong experience writing and maintaining clear documentation
Good to have skills:
● Experience working with frontend technologies - React.Js or Vue.Js or Angular.
● Extensive experience consulting with customers directly for defining architecture or system design.
● Technical certifications in AWS / Azure / GCP / MongoDB or other relevant technologies
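As a small, hypothetical illustration of the MongoDB data-modelling skill above (collection and field names are invented), the recurring design decision is what to embed and what to reference:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # placeholder URI
db = client["storefront"]                           # hypothetical database

# Embed line items that are always read with the order: one document, one read.
db.orders.insert_one({
    "_id": "ord-1001",
    "customer_id": "cus-42",
    "items": [{"sku": "A1", "qty": 2, "price": 9.99}],
})

# Reference customers, which are shared across many orders.
customer = db.customers.find_one({"_id": "cus-42"})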

Role & Responsibilities
Lead and mentor a team of data engineers, ensuring high performance and career growth.
Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
Drive the development and implementation of data governance frameworks and best practices.
Work closely with cross-functional teams to define and execute a data roadmap.
Optimize data processing workflows for performance and cost efficiency.
Ensure data security, compliance, and quality across all data platforms.
Foster a culture of innovation and technical excellence within the data team.


Job Title: Big Data Engineer (Java Spark Developer – Java Spark experience is a must)
Location: Chennai, Hyderabad, Pune, Bangalore (Bengaluru) / NCR Delhi
Client: Premium Tier 1 Company
Payroll: Direct Client
Employment Type: Full time / Perm
Experience: 7+ years
Job Description:
We are looking for skilled Big Data Engineers with 7+ years of experience in Big Data / legacy platforms using Java Spark, who can join immediately. The desired candidate should have experience designing, developing and optimizing real-time and batch data pipelines for enterprise-scale Big Data applications. You will work on building scalable, high-performance data processing solutions, integrating real-time data streams, and building reliable data platforms. Strong troubleshooting, performance tuning, and collaboration skills are key for this role.
Key Responsibilities:
· Develop data pipelines using Java Spark and Kafka.
· Optimize and maintain real-time data pipelines and messaging systems.
· Collaborate with cross-functional teams to deliver scalable data solutions.
· Troubleshoot and resolve issues in Java Spark and Kafka applications.
Qualifications:
· Experience in Java Spark is a must
· Knowledge and hands-on experience using distributed computing, real-time data streaming, and big data technologies
· Strong problem-solving and performance optimization skills
· Looking for immediate joiners
If interested, please share your resume along with the following details
1) Notice Period
2) Current CTC
3) Expected CTC
4) Have experience in Java Spark - Y / N (this is a must)
5) Any offers in hand
Thanks & Regards,
LION & ELEPHANTS CONSULTANCY PVT LTD TEAM
SINGAPORE | INDIA
We are seeking a skilled Java Developer with 5+ years of experience in Java, Camunda, Apache Camel, Kafka, and Apache Karaf. The ideal candidate should have expertise in workflow automation, message-driven architectures, and enterprise integration patterns. Strong problem-solving skills and hands-on experience in microservices and event-driven systems are required.
Mandatory Skills:
- AZ-104 (Azure Administrator) experience
- CI/CD migration expertise
- Proficiency in Windows deployment and support
- Infrastructure as Code (IaC) in Terraform
- Automation using PowerShell
- Understanding of SDLC for C# applications (build/ship/run strategy)
- Apache Kafka experience
- Azure web app
Good to Have Skills:
- AZ-400 (Azure DevOps Engineer Expert)
- AZ-700 Designing and Implementing Microsoft Azure Networking Solutions
- Apache Pulsar
- Windows containers
- Active Directory and DNS
- SAST and DAST tool understanding
- MSSQL database
- Postgres database
- Azure security
Position Title : Java Full Stack Developer
Location : Noida Sector 125
Experience : 5+ Years
Availability : Immediate
Job Summary :
We are looking for a Java Full Stack Developer with expertise in Microservices architecture to join our team.
The ideal candidate should have hands-on experience in Java, Spring Boot, Hibernate, and front-end technologies like Angular, JavaScript, and Bootstrap. You will work on enterprise-grade applications that enhance patient safety worldwide.
Key Responsibilities :
- Design, develop, and maintain applications based on Microservices architecture.
- Work with Java, Spring Boot, Hibernate, Angular, Kafka, Redis, and Hazelcast to build scalable solutions (a caching sketch follows this list).
- Utilize AWS, Git, Nginx, Tomcat, Oracle, Jira, Confluence, and Jenkins for development and deployment.
- Collaborate with cross-functional teams to develop enterprise applications.
- Develop intuitive UI/UX components using Bootstrap, jQuery, and JavaScript.
- Ensure applications meet performance, scalability, and security requirements.
- Participate in Agile development while efficiently handling changing priorities.
- Conduct code reviews, debugging, and performance optimization.
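The Redis/Hazelcast item above typically means cache-aside. The role is Java-centric, but the pattern is language-agnostic; a minimal Python sketch, with the key format, TTL and database stub as assumptions:

import json

import redis

r = redis.Redis(host="localhost", port=6379, db=0)   # placeholder instance

def load_user_from_db(user_id):
    # Stand-in for the Oracle/MySQL lookup mentioned in this posting.
    return {"id": user_id, "name": "Ada"}

def get_user(user_id):
    # Cache-aside: try Redis first, fall back to the system of record.
    cached = r.get(f"user:{user_id}")
    if cached is not None:
        return json.loads(cached)
    user = load_user_from_db(user_id)
    r.setex(f"user:{user_id}", 300, json.dumps(user))   # 5-minute TTL
    return user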
Required Skills & Qualifications :
✔ 5+ Years of hands-on experience in Java 7/8, Spring Boot, and Hibernate.
✔ Strong understanding of OOP concepts and Design Patterns.
✔ Experience working with relational databases like Oracle/MySQL.
✔ Proficiency in Bootstrap, JavaScript, jQuery, HTML, and Angular.
✔ Hands-on experience in Microservices-based application development.
✔ Strong problem-solving, debugging, and analytical skills.
✔ Excellent communication and collaboration skills.
✔ Ability to adapt to new technologies and handle multiple priorities.
✔ Experience in developing high-quality web applications.
Good to Have :
➕ Exposure to Kafka, Redis, and Hazelcast.
➕ Experience working with cloud-based solutions (AWS preferred).
➕ Familiarity with DevOps tools like Jenkins, Docker, and Kubernetes.
Why Join Us?
✅ Work on cutting-edge technologies and enterprise-level applications.
✅ Collaborative and innovative work environment.
✅ Competitive salary and career growth opportunities.


Job Role: Senior Full Stack Developer
Location: Trichy
Job Type: Full Time
Experience Required: 5+ Years
Reporting to : Product Head
About Us:
At Zybisys Consulting Services LLP, we are a leading company in Cloud Managed Services and Cloud Computing. We believe in creating a vibrant and inclusive workplace where talented people can grow and succeed. We are looking for a dedicated leader who is passionate about supporting our team, developing talent, and enhancing our company culture.
Role Overview:
Are you a seasoned Full Stack Developer with a passion for crafting innovative solutions? We are looking for an experienced Senior Full Stack Developer to enhance our team and lead the development of innovative solutions.
Key Responsibilities:
- Develop and Maintain Applications: Design, develop, and maintain scalable and efficient full-stack applications using modern technologies.
- Database Design: Expertise in both relational and NoSQL databases, including schema design, query optimization, and data modeling.
- Collaborate with Teams: Work closely with front-end and back-end developers along with the Engineering team to integrate and optimize APIs and services.
- Implement Best Practices: Ensure high-quality code, adherence to best practices, and efficient use of technologies.
- Troubleshoot and Debug: Identify and resolve complex issues, providing solutions and improvements.
- Code Review and Quality Assurance: Skill in reviewing code, ensuring adherence to coding standards, and implementing best practices for software quality.
- Agile Methodologies: Experience with Agile frameworks (e.g., Scrum, Kanban) to facilitate iterative development and continuous improvement.
- Test-Driven Development (TDD): Knowledge of TDD practices, writing unit tests, and integrating automated testing (CI/CD) into the development workflow.
- Technical Documentation: Ability to write clear and concise technical documentation for codebases, APIs, and system architecture.
Technical Skills:
- Backend: Node.js, Express.js, Python, Golang, gRPC
- Frontend: React.js, Next.js, HTML5, CSS3, jQuery
- Database: MongoDB, MySQL, Redis, OpenSearch
- API: RESTful APIs, SOAP services, or GraphQL
- Tools & Technologies: Docker, Git, Kafka
- Design & Development: Figma, Linux
- Containers & container orchestration: Docker, Kubernetes
- Networking & OS Knowledge
What We Offer:
- Growth Opportunities: Expand your skills and career within a forward-thinking company.
- Collaborative Environment: Join a team that values innovation and teamwork.
If you're ready to take on exciting challenges and work in a collaborative environment, we'd love to hear from you!
Apply now to join our team as a Senior Full Stack Developer and make waves with your skills!
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency by utilizing Snowflake's Query Profile, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
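To ground the Snowpipe item above, a hedged sketch using the snowflake-connector-python package; the account, credentials, stage and table names are placeholders, and the external stage is assumed to already exist.

import snowflake.connector

conn = snowflake.connector.connect(account="xy12345",      # placeholder
                                   user="etl_user",        # placeholder
                                   password="...",
                                   warehouse="LOAD_WH")
cur = conn.cursor()

# Snowpipe: auto-ingest JSON files landing on the stage into the raw table.
cur.execute("""
    CREATE OR REPLACE PIPE raw.events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.events
    FROM @raw.events_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

cur.close()
conn.close()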
What you need:
Basic Skills:
- 3+ years of hands-on experience with the Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
Job description
Location: Chennai, India
Experience: 5+ Years
Certification: Kafka Certified (Mandatory); Additional Certifications are a Plus
Job Overview:
We are seeking an experienced DevOps Engineer specializing in GCP Cloud Infrastructure Management and Kafka Administration. The ideal candidate should have 5+ years of experience in cloud technologies, Kubernetes, and Kafka, with a mandatory Kafka certification.
Key Responsibilities:
Cloud Infrastructure Management:
· Manage and update Kubernetes (K8s) on GKE.
· Monitor and optimize K8s resources, including pods, storage, memory, and costs.
· Oversee the general monitoring and maintenance of environments using:
o OpenSearch / Kibana
o KafkaUI
o BGP
o Grafana / Prometheus
Kafka Administration:
· Manage Kafka brokers and ACLs.
· Hands-on experience in Kafka administration (preferably Confluent Kafka).
· Independently debug, optimize, and implement Kafka solutions based on developer and business needs.
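A sketch of the ACL side of the administration above, assuming the confluent-kafka Python client (its AdminClient grew ACL support in v1.7); the broker address, principal and topic are placeholders.

from confluent_kafka.admin import (AclBinding, AclOperation, AclPermissionType,
                                   AdminClient, ResourcePatternType, ResourceType)

admin = AdminClient({"bootstrap.servers": "broker:9092"})   # placeholder

# Allow the 'analytics' principal to read one topic, and nothing more.
acl = AclBinding(ResourceType.TOPIC, "orders", ResourcePatternType.LITERAL,
                 "User:analytics", "*", AclOperation.READ,
                 AclPermissionType.ALLOW)

for binding, future in admin.create_acls([acl]).items():
    future.result()   # raises if the broker rejected the ACL
    print("created", binding)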
Other Responsibilities:
· Perform ad-hoc investigations to troubleshoot and enhance infrastructure.
· Manage PostgreSQL databases efficiently.
· Administer Jenkins pipelines, supporting CI/CD implementation and maintenance.
Required Skills & Qualifications:
· Kafka Certified Engineer (Mandatory).
· 5+ years of experience in GCP DevOps, Cloud Infrastructure, and Kafka Administration.
· Strong expertise in Kubernetes (K8s), Google Kubernetes Engine (GKE), and cloud environments.
· Hands-on experience with monitoring tools like Grafana, Prometheus, OpenSearch, and Kibana.
· Experience managing PostgreSQL databases.
· Proficiency in Jenkins pipeline administration.
· Ability to work independently and collaborate with developers and business stakeholders.
If you are passionate about DevOps, Cloud Infrastructure, and Kafka, and meet the above qualifications, we encourage you to apply!
Job Title : Senior AWS Data Engineer
Experience : 5+ Years
Location : Gurugram
Employment Type : Full-Time
Job Summary :
Seeking a Senior AWS Data Engineer with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.
Job Title : Tech Lead - Data Engineering (AWS, 7+ Years)
Location : Gurugram
Employment Type : Full-Time
Job Summary :
Seeking a Tech Lead - Data Engineering with expertise in AWS to design, build, and optimize scalable data pipelines and data architectures. The ideal candidate will have experience in ETL/ELT, data warehousing, and big data technologies.
Key Responsibilities :
- Build and optimize data pipelines using AWS (Glue, EMR, Redshift, S3, etc.).
- Maintain data lakes & warehouses for analytics.
- Ensure data integrity through quality checks.
- Collaborate with data scientists & engineers to deliver solutions.
Qualifications :
- 7+ Years in Data Engineering.
- Expertise in AWS services, SQL, Python, Spark, Kafka.
- Experience with CI/CD, DevOps practices.
- Strong problem-solving skills.
Preferred Skills :
- Experience with Snowflake, Databricks.
- Knowledge of BI tools (Tableau, Power BI).
- Healthcare/Insurance domain experience is a plus.


Job Title : Full Stack Developer (Python + React.js)
Location : Gurgaon (Work From Office, 6 days a week)
Experience : 3+ Years
Job Overview :
We are looking for a skilled Full Stack Developer proficient in Python (Django) and React.js to develop scalable web applications. The ideal candidate must have experience in backend and frontend development, database management, and cloud technologies.
Mandatory Skills :
✅ Python, Django (Backend Development)
✅ PostgreSQL (Database Management)
✅ AWS (Cloud Services)
✅ RabbitMQ, Redis, Kafka, Celery (Messaging & Asynchronous Processing)
✅ React.js (Frontend Development)
Key Requirements :
- 3+ Years of experience in Full Stack Development.
- Strong expertise in RESTful APIs & Microservices.
- Experience with CI/CD, Git, and Agile methodologies.
- Strong problem-solving and communication skills.


Job Title : Python Django Developer
Location : Gurgaon (On-site)
Work Mode : 6 Days a Week (Work from Office)
Experience Level : 3+ Years
About the Role :
We are seeking a highly skilled and motivated Python Django Developer to join our team in Gurgaon. This role requires a hands-on developer with expertise in building scalable web applications and APIs using Python and Django. The ideal candidate will have a strong background in relational databases, message brokers, and distributed systems.
Key Responsibilities :
- Design, develop, and maintain robust, scalable, and secure web applications using Python and Django.
- Build and optimize back-end services, RESTful APIs, and integrations with third-party tools.
- Implement and maintain asynchronous task processing using Celery and RabbitMQ.
- Work with PostgreSQL to design and optimize database schemas and queries.
- Utilize Redis and Kafka for caching, data streaming, and other distributed system needs.
- Debug and troubleshoot issues across the application stack.
- Collaborate with cross-functional teams to gather requirements and deliver solutions.
- Ensure code quality through comprehensive testing, code reviews, and adherence to best practices.
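For the Celery/RabbitMQ responsibility above, a minimal sketch of an asynchronous task with retries; the broker/backend URLs and the task body are assumptions.

from celery import Celery

app = Celery("worker",
             broker="amqp://guest:guest@localhost:5672//",   # RabbitMQ, placeholder
             backend="redis://localhost:6379/0")             # Redis, placeholder

@app.task(bind=True, max_retries=3)
def send_invoice(self, order_id):
    try:
        # ... render and email the invoice (hypothetical work) ...
        return f"invoice sent for {order_id}"
    except Exception as exc:
        raise self.retry(exc=exc, countdown=30)   # back off, then retry

A Django view would then enqueue work with send_invoice.delay(order_id) instead of doing it inside the request/response cycle.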
Required Skills and Qualifications:
Technical Expertise:
- Proficiency in Python and strong experience with Django framework.
- Hands-on experience with PostgreSQL for database design and management.
- Familiarity with RabbitMQ, Celery, and Redis for asynchronous processing and caching.
- Experience with Kafka for building real-time data pipelines and event-driven architectures.
Other Skills:
- Strong understanding of software development best practices and design patterns.
- Proficiency in writing efficient, reusable, and testable code.
- Good knowledge of Linux/Unix environments.
- Familiarity with Docker and containerized deployments is a plus.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Good communication and teamwork abilities.
- Ability to work independently and in a collaborative team environment.
Preferred Qualifications:
- Experience in microservices architecture.
- Exposure to DevOps tools and practices.
- Knowledge of front-end technologies like React or Angular is a bonus.
Dear Candidate,
We are urgently hiring an AWS Cloud Engineer for the Bangalore location.
Position: AWS Cloud Engineer
Location: Bangalore
Experience: 8-11 yrs
Skills: AWS Cloud
Salary: Best in industry (20-25% hike on current CTC)
Note:
Only immediate joiners (up to 15 days) will be preferred.
Only candidates from Tier 1 companies will be shortlisted and selected.
Candidates with a notice period of more than 30 days will be rejected during screening.
Offer shoppers will be rejected.
Job description:
Description:
Title: AWS Cloud Engineer
Prefer BLR / HYD – else any location is fine
Work Mode: Hybrid – based on HR rule (currently 1 day per month)
Shift timings: 24x7 (work in shifts on a rotational basis)
Total experience: 8+ years, with at least 5 years of relevant experience.
Must have: AWS platform, Terraform, Redshift / Snowflake, Python / Shell scripting
Experience and Skills Requirements:
Experience:
8 years of experience in a technical role working with AWS
Mandatory
Technical troubleshooting and problem solving
AWS management of large-scale IaaS/PaaS solutions
Cloud networking and security fundamentals
Experience using containerization in AWS
Working data warehouse knowledge; Redshift and Snowflake preferred
Working with IaC – Terraform and CloudFormation
Working understanding of scripting languages including Python and Shell
Collaboration and communication skills
Highly adaptable to changes in a technical environment
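As a flavour of the Python scripting expected here, a small hedged sketch with boto3 that reports Redshift cluster health; the region is a placeholder, credentials are assumed to come from the environment, and the alerting hook is left open.

import boto3

redshift = boto3.client("redshift", region_name="ap-south-1")   # placeholder

for cluster in redshift.describe_clusters()["Clusters"]:
    status = cluster["ClusterStatus"]
    print(f"{cluster['ClusterIdentifier']}: {status}")
    if status != "available":
        # Hook for paging/alerting, e.g. publish to an SNS topic.
        pass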
Optional
Experience using monitoring and observability toolsets, including Splunk and Datadog
Experience using GitHub Actions
Experience using AWS RDS / SQL-based solutions
Experience working with streaming technologies, including Kafka and Apache Flink
Experience working with ETL environments
Experience working with the Confluent Cloud platform
Certifications:
Minimum
AWS Certified SysOps Administrator – Associate
AWS Certified DevOps Engineer - Professional
Preferred
AWS Certified Solutions Architect – Associate
Responsibilities:
Responsible for the technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a shared managed service.
The following is a list of expected responsibilities:
To manage and support a customer’s AWS platform
To be technically hands-on
Provide Incident and Problem management on the AWS IaaS and PaaS Platform
Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner
Actively monitor an AWS platform for technical issues
To be involved in the resolution of technical incident tickets
Assist in the root cause analysis of incidents
Assist with improving efficiency and processes within the team
Examining traces and logs
Working with third party suppliers and AWS to jointly resolve incidents
Good to have:
Confluent Cloud
Snowflake
Best Regards,
Minakshi Soni
Executive - Talent Acquisition (L2)
Rigel Networks
Worldwide Locations: USA | HK | IN

Java Technical Lead
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team.
A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting.
Experience: 10+ years
Location: Mumbai
Job Description:
• Experience in Core Java, Spring Boot.
• Experience in microservices.
• Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS such as SQL Server, Postgres, Oracle or DB2
• Good knowledge of multi-threading
• Basic working knowledge of Unix/Linux
• Excellent problem solving and coding skills in Java
• Strong interpersonal, communication and analytical skills.
• Should be able to express their design ideas and thoughts.
About Wissen Technology: Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

Role Objective:
The Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure the optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting and performance tuning.
- SFDC (data modelling) experience would be given preference.
- Good understanding of object-oriented concepts and hands-on experience with Scala, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL; should be able to write complex queries.
- Managing a team of Associates and Senior Associates and ensuring utilization is maintained across the project.
- Able to mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
- Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Leading client calls to flag any delays, blockers and escalations, and to collate requirements.
- Managing project timing, client expectations and meeting deadlines.
- Should have played project and team management roles.
- Facilitate meetings within the team on regular basis.
- Understand business requirement and analyze different approaches and plan deliverables and milestones for the project.
- Optimization, maintenance, and support of pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
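The role calls for Scala, but as a quick illustration of the "complex query" expectation above, here is the same idea sketched in PySpark (the DataFrame API is analogous in Scala); paths and column names are invented.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("latest-order-per-user").getOrCreate()

orders = spark.read.json("s3://bucket/orders/")   # placeholder path

# Keep each user's most recent order via a window function.
w = Window.partitionBy("user_id").orderBy(F.col("order_ts").desc())
latest = (orders
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))

latest.write.mode("overwrite").parquet("s3://bucket/latest_orders/")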

External Skills And Expertise
Must have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- AWS preferred / any cloud
- Kafka / Kinesis / any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge in CI/CD, Microservices